TFGL2021 - S3 - Ep 8 - Pain Cave

Welcome to this episode of the Tech For Good Live podcast.

Joining host Bex we have TFGL team members Greg Ashton and Tom Passmore.

Our special guest is Louise Corden. Louise has been Lead Digital Producer at NSPCC for the last four years, and is just about to move to the Design, Data and Tech team at Citizens Advice.


Transcript

Bex: Hello and a warm podcast welcome to everyone except Tony Mizzoti. This is yet another episode of the Tech For Good Live podcast. If you're new here, I'm very sorry. I hope you're okay. If you're a regular listener, hi, how's it going, Steve, did you get that problem with your fence sorted? I hope so. Anyway, we've got a hot episode of the podcast lined up for you today. The planet is dying and our chances are dwindling. So yeah, we'll be chatting about that. Surely a podcast can fix it. We've got some positive news about a partnership between the wonderful Chayn and Bumble, and there are some legitimate concerns about Apple's new privacy plans. So we'll be talking about that and much more coming right up. Heading into the end of times with me, we have the newlywed, Greg Ashton. Hello, congratulations. 

Greg: Hello.

Bex: And if we are heading towards the apocalypse, we're all going to need badass names to survive. So your apocalypse name is the brand of your favorite car followed by the last item you purchased. What is it?

Greg: Um, so I'm not a big fan of cars, but I did enjoy driving a Nissan Leaf once and I bought some milk today. So my, uh, end of the world name sounds like a vegan slur, which is Leaf Milk.

Tom: Leaf Milk [laughs] so bad-ass.

Bex: So bad-ass. Tom, you’re here as well. Hello. Same question to you. What's your apocalypse name?

Tom: Again, not a car person. I'm looking out the window for a car to help me because I drive an Audi. So let's say that's my favorite brand. Um, and then what was the last thing I purchased? I bought some tiny little model climbers to go on a wine rack. I think that's the last thing I bought. So that makes me Audi mini climber. 

Bex: That sounds more like a pop star, for some reason, than an apocalypse name. I dunno. I like that. Audi Mini Climber. That's quite good. 

Tom: Yeah. I'm sticking with it. 

Bex: Also I've seen the mini climbers, uh, on your tweet and they're also very good. 

Tom: Thank you. 

Bex: And they fit a little bit with it all. So to clarify as well, your wine rack has been made out of a rock, so it makes sense that there are climbers on it.

Tom: Yes. 

Bex: And I'm Bex Rae-Evans. My apocalypse name... I'm a recovering car person. So when I was 16, I used to make military aircraft, which is also a bad thing to do. I realized that and left, but we were all into cars, obviously, cause we were into engines and stuff. And I do really like Mustangs. They've got an electric version now. It's crap. It doesn't look like a Mustang. I'm going to disappoint you with it. But anyway, let's just say that. Electric Mustang, like, because we're all going to be electric, environmentally friendly theme. Electric Mustang Hot Sauce, which actually I think is amazing. And I might just start going by that now. We're in the apocalypse now anyway, probably, so that's my new name. And we have a guest with us today. Louise Corden is on the podcast. Louise has been Lead Digital Producer at NSPCC for four years and is about to move to the Design, Data and Tech team at Citizens Advice. Louise, hello. Please feel free to answer the question or not. It's completely up to you [laughs] 

Louise: Hello. Um, so yeah, my favorite car, I mean, I'm not massively a car person either, but yesterday we did have to scrap our 19-year-old Toyota Yaris. So I'll go with that, in her honor. And I'm about to do an endurance cycle, so I've been purchasing chamois cream. So I think that would make me Yaris Chamois Cream. 

Bex: Again, that's like an excellent pop singer name. I'm into it [laughs]

Greg: [laughs]

Bex: Well, thank you for joining us. How's everyone? Great job. I suppose we've got two jobs going on at the minute, in a way. Both sound great. What have you been up to? What's the plan?

Louise: Yeah, so I'm just wrapping up at the NSPCC, where I've been for about four years, and there I've been managing the donation platforms and looking after ways people can give us money. But also looking after kind of emerging tech stuff. So, um, chatbots and voice assistants. We've launched a few of those in the last couple of years. Um, and yeah, I'll shortly be joining a similar sort of tech-type team at Citizens Advice in September. So yeah. All about the sorts of charity, nonprofit tech stuff.

Bex: All sounds fun. Can I ask about chatbots? How do they go? Were you gonna ask the same Tom? I feel like you're going to ask the same. 

Tom: I love the idea of chatbots. 

Greg: We all love the idea of chatbots. 

Louise: So there are some bad chatbots out there, I will grant you. And I think we've all used some of those, definitely. Yeah, we've launched a couple at NSPCC in my time there. We did some playing around with some more basic FAQ ones, including for a campaign, a sort of behavior change campaign aimed at parents. Um, but recently we launched one that I actually think is a really good chatbot. So it falls into that rare territory. We launched it on Childline. So, um, it went live in April and it took us about 18 months. We co-designed it with young people, did loads of testing. Um, and yeah, it's a bot for when young people want to speak to a Childline counselor, for while they're in the queue waiting to speak to someone. Because we, like many other charities, um, sometimes have a shortage of volunteers who work in our counseling service, so young people can sometimes be waiting a little bit longer than we'd like. So the bot's there to kind of keep them company, help answer any kind of information questions they've got in the meantime and, uh, yeah, kind of keep them entertained so that they can still hang on and wait for the support that we can give them. So yeah, it went live a few months ago and we've seen really good interaction with it and really positive responses from the young people. So, yeah. Hopefully, I would say it's looking like it's fallen into the good chatbots camp [laughs]. 

Bex: I think there are some really good uses of chatbots. I am, like, generally a fan. I get a lot of charities asking about them and whether they should make them or not. Sometimes they have really good reasons, sometimes not so much. But do you have top chatbot tips for any other charities that might want to implement a chatbot?

Louise: I'd say, definitely be working with your users. That's the classic good user-centered design tip there, but, um, yeah, definitely work out whether your users actually want to use one and if it will be useful in the kind of service that you're offering, and then make sure that you're designing it with them, if that's at all possible. And also really factor in the idea that, unlike a kind of website or something else, you're not going to have it all figured out on day one. Particularly for a chatbot, you'll need to be constantly looking at the questions that people are asking it and improving your content and your responses based on that. So, yeah, do assume that it's going to be a kind of ongoing service that you're going to continue to improve. Don't think you're just going to launch it and then leave it on the shelf and no one needs to think about it ever again, because that's very much not the way they will work well. Yeah, a couple of tips there, I think, would be good. 

Bex: Thank you. Awesome. Well, we'll move straight on to stat of the week. After that nice, positive chat about chatbots, we're going to go to terrible, terrible, terrible news. It's not a stat. It's just put in the notes. It's just a fact. And the fact is we're fucked. Is that right, Greg? Please tell us more. 

Greg: Yup. yup. We’re all fucked. There you go.

Tom: [laughs] Why this week?

Bex: It said in the briefing we weren’t gonna go too hard on the swears and we’re on the swears straight away. 

Greg: Come on. Literally, I have been spiraling these past few days, really, since the UN's Intergovernmental Panel on Climate Change released their report at the start of this week. Which basically says we're all fucked, uh, but it's okay, we could claw back some of this and only a few million people will die. Most likely. We'll see.

Bex: [laughs] 

Greg: But yeah, basically this is a huge report, which is probably one of the first ones to not pussyfoot around the climate issue. I think, you know, they've come to a point where they've actually gone, you know what guys, we need to be really honest and frank with you here. So all of that kind of vague, you know, is it down to human activity, are there natural kind of, um, variances? They've kind of said, nope. We've had the hottest five years on record since 1850. The recent rate of sea level rise has nearly tripled compared with the period from 1901 to 1971. Human influence is very likely (90%) the main driver of the global retreat of glaciers since the 1990s. And it's virtually certain that hot extremes, including heat waves, have become more frequent and more intense since the 1950s, while cold events have become less frequent and less severe. Basically, it's a real damning report on how we are driving a lot of what's happening to us. And we see it in Greece, and in the Pacific Northwest now, they've got another heatwave coming in, and, you know, we're really kind of seeing the fruits of our labors, basically. Um, and it's going to get worse. We're going to see a lot more wet weather in the UK running up to 2050 and beyond, most likely. Because basically what they've said is no matter what we do to reduce the increase now, it's going to take a lot longer to drive that down. We've reached a point where we've done that damage, so we're going to have to deal with what we're facing now for a lot longer. But we're kind of at that tipping point where we need to do something quickly or, uh, it could get a lot worse. 

Tom: So can I jump in with two comments? One, one of our members of staff is Greek. And he was saying that the wildfires in Greece at the moment aren't naturally occurring. They are actually being set by people. So it's not even that, like, global warming has caused it. There are Greek people setting fires to cause wildfires, which is just bonkers in itself. 

Bex: Is it just because they're having misadventure or is it deliberate farming or something? 

Tom: It's a political act, it seems to be. And I can't figure out the logic around how setting a fire that rages into a wildfire, causing both environmental and kind of ecological damage, undermines the government. But anyway. The other thing was, we're just in, we hope, the tail end of the pandemic, when the whole world shut down and very little carbon was released. So surely we will be able to know exactly, from this little case study of the last eight months, how much carbon we can reduce if the whole world stops. So are we safe? Do we know if we're safe or not? Did we have a huge impact? 

Greg: There was a definite indicator that there was a massive reduction in kind of greenhouse gases from the transport industry. But there were other indicators that that was kind of offset in other areas. So, you know, we were in a pretty strict lockdown in the UK, but a lot of the world just kind of carried on as normal. So we saw improvements here. But yeah, I mean, we had that stat the other week where the rainforest in Brazil was producing more CO2 than it was storing. And, you know, the big challenge we've got is whether the political will is there to make some of these tough decisions. So you've got examples coming over from America, where people from the far right are literally opposed to anything related to climate. Cause they see that as a left issue and they want to be seen to be opposed to that regardless of how it will impact the economy. They can't be seen to agree with the left. So they've just passed this huge infrastructure bill through the Senate. It's got to go through the House, and the indication is that because it includes stuff about climate, people will protest it. So you've got a right-wing representative, Marjorie Taylor Greene, who had a criticism that said the focus on electric vehicles would enslave America to China because of the need for vehicle batteries that are manufactured in that country. That's kind of how those parties are viewing it. And when you've got that kind of thing happening, and there's literally a tight deadline of, we need to reduce massively, you know, I'm kind of like, what do you do? You can't vote these people out before they've done the damage. So what happens next? 

Louise: Yeah, thinking about what you said about the sort of right-leaning parties in the US, I was sort of thinking, well, we wouldn't be quite so extreme here in our kind of political divide. But I did see a YouGov poll in response to the climate change report. There was a question, I jotted it down, and the question that was asked was, do you think concerns about climate change have or have not been exaggerated? And 14% of UK adults said they thought concerns were exaggerated. But when you broke it down by how those people said they voted by political party, 25% of Conservative voters said that they thought concerns were exaggerated. So right-leaning voters were much more likely to say that. And I just don't know what to do about that. It seems so far, far kind of away from my own thinking on the issue. I kind of don't know how you can broach that.

Bex: No idea. I mean, I was literally gonna say, if anyone knows, answers on a postcard. We don’t have a postal address but a Twitter postcard. I don’t know. Is that a thing? [laughs] And how this all goes back to tech as well, like, Facebook has been spreading climate misinformation. Thanks again, Facebook for all that you do in this world to make it a better place. 

Greg: Yeah, so there was a think-tank, I think they're UK-based actually. So they looked at, um, adverts on Facebook from the petroleum giants, like ExxonMobil, um, and found that they've changed tack. So they weren't kind of saying climate change is fake or green energy is bad. They're basically saying, we're part of the solution, and they've spent millions on pushing this idea. And pushing a lot of misinformation through. And Facebook said, well, you know, we're doing whatever we can, and that kind of stuff. But whilst they've removed some of the ads, same old story, they're failing to include disclaimers and really capture all of the ads that are being pushed out there, because they're spending so much money in pushing out so many different ads that, you know, they're just not able to tackle that. So if you're thinking about those people that don't think it's a big issue and maybe have the wrong information, this is probably a likely source of that misinformation. People like the petroleum giants, who are desperately trying to cling onto power, you know, they're definitely a root cause of that. But it's not all bad. 

Tom: [laughs]

Greg: You know, the UK actually has a really, really strong green energy market, you know, where they're adding more and more jobs. So it's a real, uh, economic builder for the UK. And I think, you know, the Conservative party and right-leaning people, they're not going to shy away from that, because they can see the benefits in it. So we're in a positive place where we're able to kind of build on that success. The problem we've got is we're not one of the big producers. We're reliant on the likes of China, who have basically said, yeah, we're going to peak our coal in 2030, which is way too late, and then we're going to slowly phase it out by 2060, which just means, you know, a lot more people will die. But yeah, I did come across one tweet today, which basically was like, look guys, you know, don't have this attitude of hopelessness, because then you won't want to change anything. Um, it was from Jessica The Law, so thank you very much, you brightened my day up. Basically saying that, you know, we're doing a lot to deal with this and it is working, um, this action that's being taken. And she referenced the ozone layer, which is fixed now. And I was like, shit, yeah, that's a really good point. Like, that was going to be the end of the world at one point. And we actually fixed it, you know. 

Bex: Yeah, it's so good that we can say that. 

Greg: It is so good that we can say that we did that. So yeah, maybe there is a spark of hope amongst all the death and destruction.

Louise: I like that [laughs] 

Tom: Yayyyy. Maybe this is Spotify [laughs] Yeah. I'm in a bit of a pain cave after that. 

Greg: [laughs]

Bex: Well, you should read Jessica The Law's thread if you haven't already, because it does help.

Tom: I haven’t. I'll do that this afternoon to make myself feel like it's all positive [laughs]

Bex: And charity news of the week. I'm not sure whether it's positive. When I get to it, I mean, it's kind of, yeah, it's a bit both. Chayn is a charity we know and love. Hera is a regular on the podcast, though she hasn't been on so much this year. She's actually been really busy. But they support survivors of sexual assault. It's a sad thing that that has to exist. And they're going to team up with dating app Bumble to provide online trauma support. Which is brill.

Greg: Yeah, really good. Really good. And I think this is an interesting one because of that interaction, and I think there's some stuff that we can talk about throughout all of this. Louise, I'd be really interested to hear your opinion on this from your background, and kind of interactions, and finding people where they naturally are. But yeah, they're teaming up with Bumble to provide that service Bloom, which is a remote trauma support service for survivors. So people who report sexual abuse or assault will be given access to, uh, the Bloom courses, which are customised for the users. They have one-to-one chat with the team and there are six therapy sessions with a trauma-informed therapist. And obviously it's sad that those people are having to access that service. But I just think that making that service more accessible, in a space where they're potentially spending a lot of time, is really, really cool. It's going to be rolled out in English and Spanish, and they intend to offer it to Badoo users next year, which is cool. But yeah, Louise, what are your thoughts on this? Cause, you know, I think there are some similarities with the kind of work that you've been doing with NSPCC, and I know from my past work, you know, that idea that it's often difficult to reach people who've experienced trauma through the channels that they're using, and if you can make it really simple for people, then it can really help.

Louise: Yeah, I completely agree. I think because it's such a widespread problem, kind of sexual harassment and violence. A couple of months ago there was a report in The Guardian that said something like 80% of women have experienced some kind of sexual harassment or violence, and that was even higher in the younger age groups, who you would tend to think of as the larger users of something like Bumble and other dating apps. So I think definitely getting your service into the place where your potential users already are is just a great thing to do, in my opinion. And I think it's really interesting as well that it's Bumble that's the dating app they've partnered with, because it's the only one of the major dating apps that was founded by a woman and has tried to have some of these kinds of safety features baked in. And I think that just reflects the point more generally about the fact you need to have diverse teams building things. When you've got people who represent that point of view from the outset, they can help create products and platforms that work for lots more of their users. So I think it's a really good idea that the charity has used this as the channel to get that service out there, and particularly that they've chosen to go with Bumble as one of the first partnerships for that. So it'd be really interesting to hear whether they get much uptake from there, though if it's only just launched, I guess they haven't had many results to share yet. But yeah, I think great. 

Bex: Yeah. I've been working with Chayn for a while, so I had like an inside tip that this was going to happen. And I can't actually remember which way round it was, whether it was Bumble that went to Chayn or whether it was kind of jointly decided. Well, I know that Bumble already had a lot of support in place, which was lovely, and you don't often hear of that. I think, like, a big challenge, and again, inside information, was that not a lot of people were taking up that support after something happened to them, um, which is interesting in itself. And we don't really know why; we're doing user research at the minute to find out what the barrier is. And I guess I have like some suspicions that, you know, as soon as you're like, well, you have to go and have therapy now, it's like, well, only this little thing happened to me, why do I need therapy? And therapy seems like such a big thing. And also, you know, sexual assault and harassment seem like such a little thing because they happen so often. So, like, it's just, yeah, it's really important that this support is offered, but important that people recognize that they might need some support as well. So both of those things are being tackled as a part of this, which I think is great. 

Louise: Yeah, that's pretty good to hear. Yeah. 

Greg: Yeah. And I think it is, it's a great partnership. When you said that, Louise, I was like, oh yeah, how would I feel if this was Tinder? Would I feel differently? Would it feel a little bit like they were using Chayn as a kind of heat shield to take some of the pressure off?

Tom: Yeah. That's a really interesting kind of point, yeah, Greg. Like, knowing it's this kind of cooperation and partnership, it feels like it's there to benefit the users rather than just being like, oh, we do this, give us more money.

Louise: Yeah. 

Tom: I hadn't thought about it that way, but yeah, I think it would leave a bit of a different taste in your mouth if it was one of those larger, more terrifying ones. I've never used a dating app, so I don't know how all that side of it works. I find them massively intimidating at the best of times. So, yeah. 

Greg: The big difference between Bumble and the other ones is they put the power in women's hands. So once you've matched, the woman gets to speak first and starts the conversation. That little maneuver kind of puts them in the driving seat a little bit more. But they've done other things as well. So they've moved away from just the traditional dating app approach and they've got kind of like the ability to meet friends on there and things like that.

Tom: Okay, cool. 

Bex: Yeah. Which leads to my interesting story about Tinder. I joined up to Tinder because I genuinely thought it was about making friends the first time around, because that's how they pushed it at first. And I was like, oh, I'll find some friends. And I very quickly found that it wasn't about finding friends. 

Tom: It was ‘friends’ [laughs]

Bex: I exited the app pretty quickly because I was in a relationship [laughs] 

Tom: [laughs]

Greg: [laughs]

Bex: But yeah, this is great. And Bloom's really interesting in itself. Bloom has been running anyway as a course that Chayn does, and it's run by survivors for survivors and it uses survivor stories heavily. So although you don't actually interact with any of the survivors on the course, because that would be like a bit much, they collect the kind of homework for each week and share the homework with people. Because, you know, through the research that I've been doing with Chayn over the years, like, it helps them to hear other survivors' stories, just to know they're not alone and it's not weird that they've gone through this thing, that other people go through the same thing. It can be really, really beneficial. So I think Bloom is a really cool way of doing a course, and that plus the formal therapy alongside it can be really powerful. So yeah, interested to see how that goes for them. Onto tech news of the week. The ethical challenge for Apple of balancing privacy and protecting children. This is the classic privacy thing as well. I used to absolutely think privacy was the most important thing in the world, but the more I've worked in the charity sector... You know, I know somebody who works in digital policing, and most of their work is about protecting children, and a lot of privacy stuff stops them from protecting children. So I've come down hard kind of on the other side at times. So what's going on with Apple? I haven't actually kept on top of this story, but please tell me more, Greg.

Greg: That's part of why I wanted to have this conversation, cause it's a difficult ethical challenge, really. So Apple has announced details of a system to find child sexual abuse material on customers' devices, just US customers at this stage. So before an image is stored in iCloud Photos, they're going to use this automated system to identify potential abuse material, and then that's passed to a human reviewer who then reports it to, uh, law enforcement if in fact they assess it as being abuse material. They've claimed it has really high levels of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account. Obviously the pushback has been that they're running an automated system which is assessing, uh, people's images on their devices and flagging them to an external assessor, which many view as being a breach of privacy. So the Electronic Frontier Foundation said the system could be used for broader abuse: it's impossible to build a client-side scanning system that can only be used for sexually explicit images. That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change. So they, along with 6,000 others, have signed a petition, um, basically saying that Apple should pull this back. Um, yeah. Like you say, Bex, it's not an easy one, because when you get out there and you look at the kind of stuff that's happening online, you can kind of see why this would be beneficial. But then you've got to look at the flip side of the coin, which is, yeah, having a system on your phone which scans all of your pictures. I mean, we saw the Pegasus Project. It's basically that, but being done by Apple. 

Bex: So part of me kind of wants to be hardline on this and say, if your privacy is, like, uh, slightly compromised and it saves one kid from all the terrible things that happen, then do it. But it's more than that, isn't it? It's that slippery slope. It's been built for this one thing, but it can be misused in so many wider-ranging ways. And that's what scares me: the slippery slope, and who has access to this, and the security around that. Yeah, it's big and complicated. I don't know, Louise, do you have some thoughts?

Louise: I mean, I'm sure there'd be an official NSPCC kind of line on some of this, and I'm not quite sure what that would be, if I'm honest. But yeah, my own personal view is that it is really difficult, because, I mean, the scale of sharing of child sexual abuse imagery worldwide is enormous, in a way that I had no idea about before I joined the NSPCC and started learning a bit about it. I think before I knew a bit about it, I probably had just assumed it was a handful of people across the world who had access to some of this stuff. Actually, it's kind of millions of images worldwide, held by tens of thousands, if not hundreds of thousands, of people. So, yeah, the potential to help children seems great, but it does make me uncomfortable, the idea of building in all of these backdoors, like actually just putting in lots of deliberate ways of skirting people's privacy. Even though I probably agree with this particular use case, I realise that once you've built that technology and let it out, it can be used for any number of other things. And I don't know how comfortable I'd be about putting all of that decision-making and policing in the hands of a private company. 

Tom: Shareholders [laughs] 

Louise: Yeah. Exactly. 

Tom: Yeah. It is a proper Pandora's box. It's hard. Cause the question I have is, did someone sit down and go, we need to protect children, this is a tool that can do that? Or did someone go, this is a tool that we can create, how can we get it greenlit around the world? Well, in the US. Because they are two very separate things. It's just like, oh, we need to protect this group of individuals. Okay, let's figure out some technology to do that. Brilliant. We've created it. Here we go. Like, at least the morals are in the right place. Whereas if the morals are, how do we get this agreed and under the radar, and make it sit right with good people, so we can make extra billions we can't spend, then it's like, ah. I just wish everything was transparent. I want to know the decision-making process. Which I never will.

Greg: I totally agree with you on that, Tom. There was an article this week about a hidden surveillance slush fund for the New York Police Department. So they'd had millions and millions of dollars basically fed to them through this secret supply of money, which allowed them to spend loads on surveillance technology without any oversight. So it just feels a little bit like this would go through, and then potentially, you know, you have additional things added on where it's kind of like, well, you can't tell anybody about this because it's related to the government, but we're going to get you to use that system to look at X, Y, and Z as well. So yeah, you can see how it's a slippery slope, but I don't know what the answer is, because I just keep going back and forth. 

Tom: Yeah. 

Greg: You know, traditional policing is fine when you're looking at small-scale, kind of localised stuff. What the internet has done is massively change the environment; it's a much bigger scale, so you have to change to adapt to that. But you're basically saying, let's search everybody's house to make sure they're not breaking the law, and we don't have that. That's not how the law works. So where's the balance? You need to find that kind of balance between the two. How do we identify somebody that's doing something wrong without basically having to strip-search every single person? 

Louise: Yeah, I think I'd probably be interested to know a bit more about the specifics of how they're proposing to work it. So if they're saying that everything gets scanned by an algorithm and then sent to human moderators, there's probably some ethical stuff you want to think about in terms of what kind of images you put in front of human moderators, which is, you know, something I know has been in the news in terms of, um, companies like Facebook. And then, is there any kind of arbitrator if someone's images get flagged up but actually it's not an illegal image? Cause that's one thing that gets talked about with child sexual abuse images: that's when they involve a young person, a person under 18, in a sexual sort of situation. So an algorithm is only ever going to be a bit of a blunt tool for working out, is this an 18-year-old who is consensually taking part in this? That image could get flagged in the same way as a 17-year-old's. And, you know, if someone then has repercussions because Apple has flagged their whole account over an image that might not actually have been an illegal image... I dunno. I think there are some kind of details that I'd want reassurance on from that kind of process angle too. Uh, it's a tricky one.

Bex: And then I always think, yeah, Greg, good call about bringing it back to what the real-life example would be, and them searching your home. Maybe it's a robot searching your home instead. Let's move on to something a little bit nicer though. Because, you know, part of me thinks it's more about the custodianship, like who owns this tool and access to it. And, you know, the problem is that, as Tom said, it's owned by someone who has shareholders and can easily be corrupted by somebody giving them some money. Could it be owned some other way, be a third-party thing that's owned by charities who don't have that monetary stake in it? And that links us back to, you know, the NSPCC and the IWF have recently launched a tool to help young people get nude images of themselves removed from the internet, which is a great start. You know, putting the power back in the user's hands is great. I mean, it shouldn't have to be, but at least, you know, we're not worried there about who's going to be using or looking at these images.

Louise: Hmm. Yeah. And that's the tool we launched a couple of months ago. We've had a pilot of it live for over a year. I was involved in the pilot stage way back, ages, ages ago. But, uh, yeah, it's a tool where young people can report, and this is specifically talking about kind of self-generated images. So the kind of use case we talked about internally within the organisation was imagining a 15 year old has taken a sexualised image to send to her boyfriend, and then it's been shared around on Facebook, or it's been uploaded to other websites without her consent. And it's a way to get those images removed. Um, so we worked with the Internet Watch Foundation, who are the kind of independent body who look at all of these images that are potentially illegal, anywhere on the internet, and can get them removed from whatever platform they're on. And they work with law enforcement to pass over that information, which then gets used in prosecutions and everything beyond that point. And so, yeah, we built this tool on Childline. So a young person can sort of log the fact they think there is an image. They can do this sort of anonymously, they don't need to give a name, and they have to upload the actual image, which doesn't get handled by the NSPCC. It's just handled by the IWF, who then have some amazing technology in the background that creates a sort of digital fingerprint, a hash of the image. And they then work with sites like Facebook and other big platforms and can have it removed if it exists on that site. Or they can have it sort of blocked so that it can never be uploaded onto that site in future. Yeah, it's helping put some of that control back in people's hands in what is otherwise, you know, potentially a really incredibly upsetting situation for them. So, yeah, we've launched that. Seems to be going alright [laughs]

Greg: Always a tough one. Isn't it, on something like this where you're like, it's going well. And then you're like, oh God, is that the right term? 

Tom: [laughs] Lots of people have been using this service...awww shit. You want to get to a point where you're like, this hasn't been used for the last two years.

Louise:  Exactly. 

Tom: Just, it's just it's now it's redundant and needs to go sit on a shelf somewhere because actually as a species we've matured [laughs]

Louise: Yeah. That would be lovely. 

Greg: You know what I like about this one? And it just occurred to me: what's different to the Apple case that we were talking about is, it comes from the perspective of the victim. So, how do we help the victim? Rather than, you know, how do we enforce the law? It's really thinking about how do we improve things for the victim in this case. And I think that's, for me, why this feels better than the Apple idea. If they could frame it differently, you know, so that they thought more about the victim rather than how do we help law enforcement, maybe it would be better. I don't know.

Bex: Yeah, that's actually really interesting. Like, I really think that people should be brought to justice for what they've done, but time and time again we find out that it doesn't happen in this sort of arena. They don't get brought to justice and all it does is cause more trauma for the victim. So yeah, really interesting outlook on that. It's a shame that that's the way round we have to look at it. Anyway, back to nice news. Nice of the week. Louise, you're going to be cycling from Land's End to John o' Groats. Well done. Good luck. You start tomorrow morning as well.

Louise: Yeah. Thank you. Yeah. I get the sleeper train down to Cornwall tonight. I'm, uh, currently surrounded by lots of, uh, pannier bags and all of that stuff. Um, But yeah, uh, very excited slash nervous slash terrified. But hopefully, I’ll be alright. 

Tom: Is it just for fun, is it for charity? Is it just you doing it unsupported, or are you supported as part of a bigger event?

Louise: It's just me and my partner. So we have no other support. We're carrying all of our stuff with us. 

Tom: Excellent. 

Louise: I think I've slightly talked her into doing this. It was definitely on my things-to-do list for quite a few years, but she's only relatively recently joined in.

Greg: [laughs]

Louise: I'm hoping she's not going to hate me when we get up those first big hills. But yeah, we're otherwise unsupported, but we've routed it so we're staying with lots of family and friends across the country. Particularly after the last 18 months or so, when we've hardly been able to leave London, it seemed a nice chance to kind of get out and also to kind of catch up with friends and see a bit more of the UK as well. Because we both love travelling elsewhere, but actually there's loads of parts of the UK that we want to see some more of. And we're doing it in aid of Childline and another kind of youth charity called Just Like Us. They do lots of good stuff working with sort of LGBTQ+ young people as well.

Tom: That is fantastic. I want to get into some technical geeky questions about the gear and stuff, but this is not the show for that.

Bex: Tom is a bit of a long-distance cycling person.

Louise: Ohhhh.

Tom: Yeah. So I won't get into it. Um, one quick question about it. 

Bex: But [laughs]

Tom: What bikes, what are your steeds? What are the bikes that you're going on?

Louise: I'm on a sort of gravel road bike. Um, and my partner is on a, uh, just a Specialized road bike that actually she's had for about nine years. They're kind of getting-around-town bikes; we didn't get new bikes specifically for this purpose. We're just getting along with what we've used for a while, but yeah. Keep your fingers crossed for me, please.

Tom: Definitely. I'm super excited. I'm jealous. I'm jealous of you. That's brilliant. Brilliant. You'll have loads of fun. 

Louise: Yeah. Any tips from you, Tom, from your long-distance cycles?

Tom: Single best tip that we instigated very early on in our trip was, um, force yourself to eat every 25 miles or two and a half hours. Like, you might not be hungry, you'd be like, oh, I'm not hungry, but then probably you'll just bonk and then be like, ah, I hate this. You'll just have a carrot with some peanut butter and be like, oh, I'm fine again. So that's the only tip I've got. Every 25 miles or two hours, have something to eat.

Greg: I think I remember you telling me that. And you literally had like a little pouch on the front of the bike and you were just munching away as you were riding.

Tom: Yeah. It's like carrots and peanut butter are a great little snack.

Louise: Ohhh.

Tom: It's like, as in, not in the real world. It's a weird one in the real world. But when you're on a bike, it's like salty, it's crunchy, but it's fresh as well.  Yeah, carrot and peanut butter.

Louise: Great. Alright. I’ll give that a go. 

Tom: [laughs] No worries. Enjoy. 

Bex: Yeah, enjoy and good luck.  And we’re on to our and finally of the podcast. Something good about big cities. Greg, tell us more. 

Greg: Yeah. So, the University of Chicago has produced a new report that found that big cities can be helpful for protecting people from depression. So, even the most superficial interactions, social interactions throughout cities can help defend people from depression. 

Bex: I love this. I think it goes back to the first story in some ways. Like, I really believe this is part of the solution. I watched a documentary about 20 years ago and it really stuck with me. I think it was called something like The Human Scale, and it was about how living within walking distance of everything you need is a big thing that will help the environment. Like, you should have a community of people that you want to visit and you can walk to them. You should have shops that you can walk to. You should work locally, if possible. And also roads should just bog off and we should be able to have community squares where we can all hang out, and that will, like, improve wellbeing massively, but also help the environment because we aren't travelling constantly to get to places. It really stuck with me, just this idea that everything should be at human scale, not at car scale, which everything currently is. And I just thought that was amazing. And I guess that's where this is going with it. You won't need to travel so much, I guess.

Greg: Yeah. I think that is the case. You've got all your community around you. Not in every city, I guess, but certainly in the ones that they used for the study. Some are built better than others.

Louise: I think, for me, one of the things I've missed over the last year of being kind of locked indoors is all of those little unexpected interactions that you have with people. So you can kind of keep in touch with your loved ones by deliberately calling them up or having a Zoom call or whatever, but, uh, yeah, it was those little extras, saying hello to the coffee guy and waving at the, I don't know, well, I mean, I don't wave at many people in London [laughs]

Greg: [laughs] 

Louise: The stuff that you wouldn't sort of deliberately try to recreate. So yeah, I wonder if that's something.

Bex: Yeah, serendipity. 

Louise: Yeah, absolutely.

Greg: I would highly recommend getting married in London because we got married and everybody was so nice. It was so nice. Like, we had such a nice day. Loads of people shouting compliments at you. Just being really lovely. And somebody told me that if you get married, you can get a black cab for free. They're not allowed to charge you, apparently. I don't know if that's true.

Bex: Just walk around London everywhere in a wedding dress.

Greg: Yeah, I’d wear it all the time. 

Tom: I think it'd be cheaper to not buy a wedding dress and just pay for the taxis [laughs] 

Greg: [laughs] There is that, yeah. 

Bex: On that note, that is all we have time for today, but thank you for listening. Louise, how did that go? Was that alright?

Louise: Great. Had a lovely chat with you all. Yeah. Thank you for having me.

Bex: Aww thank you for coming. Where can people find you on the internet? 

Louise: I'm on Twitter at Lou Corden. I don't tweet very often, I will warn you, and it is a variety of charity and tech and cat-based things.

Bex: Sounds perfect. And is there anything you’d like to promote or plug?

Louise: Well, I mean, if people are interested in the cycle or either of those charities that we're doing it for, we've got a Just Giving page. If you look for Louise Corden, you'll find me on there.

Bex: Amazing. We'll put a link in the thing that the podcast is promoted on [laughs]. Excellent. Listened to this? What did you think? We'd love to hear your thoughts. Get in touch on Twitter at Tech For Good Live, or email hello at tech for good dot live, and give us a review on iTunes. Apparently that's a good thing to do, we have been told. Thank you to our producers. And don't forget that this podcast is, surprisingly, run by volunteers, and we survive on sponsorships and donations. So if you've ever tuned into one of our podcasts, attended one of our events, hung out in the Slack channel or whatever, please consider chucking in the price of a cup of coffee at tech for good dot live forward slash donate. Thank you to Podcast.co for hosting us as well. And that's everything. That's all my thank yous. Goodbye. This is the end.

Greg: Bye. 

Louise: Bye. 

Tom: Bye


Podcast: Harry Bailey