TFGL2021 - S3 - Ep 6 - TikTok Danger and Spyware

Welcome to this episode of the Tech For Good Live podcast.

Joining host Bex in this fight against evil we have TFGL team members Greg Ashton and Fay Schofield. 

And we have a return guest with us! The excellent Lauren Coulman from social impact consultancy Noisy Cricket.


Transcript

Bex: Hello and welcome to the hellmouth. This is an episode of the Tech For Good Live podcast, a show all about using technology to do social good. In this episode, we'll be peeking behind the curtain of Tik Tok’s algorithm. For those listeners who are my age, Tik Tok is a social media product and not a tiny mint that provides up to two hours of freshness. We'll be talking about the RNLI team getting some grief from losers and haters. And we'll also be talking about Pegasus. Not the winged horse that we all know and love, but instead those awful men who are probably listening to everything I say. Not Instagram. The other awful men. And last but not least, we'll be talking about a lovely idea to combat wasteful tech. All this and maybe more, coming right up. Let's get podcasting. Joining me in this fight against evil, we have Greg Ashton. Greg, hello. If you were a member of the Scooby gang from Buffy or you know, the actual Scooby gang, who would you be?

Greg: Well, I think probably Xander, because he starts out as a kind of light comedy guy and then slowly gets grumpier and grumpier towards the end of the season.

Bex: He does, but it's funny on rewatch. I rewatch the entire thing every year or so, and the older I get, the more my view of the character changes. Actually, Xander is a proper tosser. He's like, really bad. He gets better towards the end, but he's a proper tosser at the start. So at first I just thought he was a funny idiot, but he's a really bad character, I realise now.

Greg: Yeah. Yeah. He proper, like, embodies lots of douchey lad culture. 

Bex: He absolutely does. But you feel free to own that Greg [laughs]

Greg: Well, yeah, you know, he grows and he just gets more grumpy and less douchey as he goes on. He gets a job, moves out of his mom's basement. Literally, my life.

Bex: Fay is back on the podcast. Hello, Fay. Same question to you, although I assume it would be glory. 

Fay: So, really embarrassingly, I guess, I've never seen an episode of Buffy in my entire life. Like, not even by accident. The only thing I even remember from Buffy, because I think Buffy used to be on before the Simpsons on BBC Two back in the day, is the little thing at the end that goes across the credits and goes "grr argh" or whatever, because I used to go "grr argh". Because I used to watch the Simpsons. 

Bex: You revealed this on the pre-record chat and I was shocked and horrified. Because you seem like the type that would have seen Buffy and I don't know what that type is, now I'm reflecting on it but I assumed that you were a fan.

Fay: But who's Glory? You're saying that you think I would be Glory, but who the hell is Glory?

Bex: So if you google Glory and look at the images, or DuckDuckGo Glory, or Ecosia Glory, or Bing Glory, but why would you do that? She's beautiful, so you will feel great about it. 

Fay: Aw.

Bex: However, she's also the big bad evil person in that particular series and tries to destroy the universe but you know, never mind that.

Greg: Yeah, she does it with misinformation. 

Fay: Oh, God, is that why, all right. Okay, fake news Schofield. Right. Ha ha. It’s not funny anymore. It's past its time. But to answer your other question of the Scooby gang, which I do like Scooby Doo, I'd probably be Scrappy, because I'm kind of little and annoying. 

Greg: [laughs]

Fay: Thanks Greg! [laughs] Cheers for that pal.

Greg: [laughs] 

Bex: And me. I'm on the podcast. I am Bex of the Rae-Evans variety, and my answer would be whichever character gets to kick Joss Whedon into oblivion, because honestly, anyone called Joss tends to be the worst, but particularly Joss Whedon. Not a good guy. And that's my answer. And we have a return guest with us today, the excellent Lauren Coulman from social impact consultancy Noisy Cricket. Hello, Lauren. Same question, if you want to answer it. 

Lauren: So also confession time, I've never watched Buffy. I think you are right. 

Bex: Again, I thought you would've seen Buffy. 

Fay:  Yes, I am not alone.

Greg: [laughs]

Lauren: It is my jam, but I also feel like I'm now way too vintage to go back and watch it. The most I know about Buffy is that there's a character called Willow who's bisexual. Right. Great. Another reason potentially to go back and watch. But yeah, similar to Fay, if I had to go for a Scooby and the gang kind of character, it would be a mash up between Scrappy, because I'm quite tenacious, and Velma, because I overthink everything. So yeah, I would say a nice blend of the two.

Fay: I'm just picturing like a little dog in like a red wig now. 

Lauren: Yeah. 

Greg: [laughs]

Bex: So, Lauren, thanks for coming back. How are you doing? Generally, what's going on?

Lauren: I'm good. Yes, really good. We are about to level up on what we've been doing around responsible tech out of Noisy Cricket, which is exciting. So we're doing a lot of work across Greater Manchester with ethnic equality in tech and people-powered smart cities, which you know a lot about, Bex, given that you've been instrumental in that piece of work. And yeah, we're looking at how we make sure the tech industry in Greater Manchester puts people first when it comes to what it creates. So it's going to be an exciting couple of years.

Bex: I can't wait. Looking forward to it all. Well, thanks for joining us. And we've got stat of the week first, and as always, then charity News of the Week, tech news of the week, and a little bit of a rant at the end. But we'll start with the stat of the week. And apparently, it takes less than two hours for Tik Tok to work out what videos it wants to show you. What does this mean, Greg, tell us more about this. Why is this good or bad? I presume it's bad, but go on.

Greg: So on the surface, working out what it wants to show you isn't necessarily a bad thing. But the Wall Street Journal conducted a bit of research. They created dozens of automated accounts that watched or didn't watch thousands of videos, to basically try and work out how well Tik Tok knows you and how it assigns videos to certain accounts. And Tik Tok has said that it looks at various markers, like video views, interactions, likes, shares, that kind of thing. But they found that the main thing that drove your Tik Tok profile was either a view of a video or even just a hover over a video. So if you linger over a specific video, it would add that into your profile. So yeah, fine, offering you more specific content. However, the dozens of profiles that they created, they gave them personalities. They didn't import any of that information into Tik Tok, but they would give them certain markers, like depression. And what they found was, as these accounts started to look at this content more, because of the way the algorithm works, you could very, very easily go down a rabbit hole. And that has two effects. One is you're constantly confronted with content around potentially sensitive or difficult topics, which could cause people, particularly someone with depression, to spiral. Also, because there are so many videos on Tik Tok, when people first start looking at Tik Tok, they get given a wide variety of videos with high viewer numbers, and a lot of those videos tend to be vetted, because they have high numbers of viewers. But once you get down into the depths, the bowels of Tik Tok, those other videos you start to view, which are more specific to your content types, wouldn't necessarily be vetted, because they have much lower viewing numbers. 
So you've got this double whammy of somebody going down a rabbit hole of potentially harmful content, and the content becoming increasingly more risky and harmful, because it hasn't been vetted like the stuff that you see on the surface. So yeah, a really, really interesting exploration of how an algorithm that's working perfectly fine can have really negative consequences.

Fay: Just on a, that is a very serious topic I know. But just to reveal the age of everybody who's on this podcast, that there’s a note on the agenda, it's called top talk instead of Tik Tok [laughs]

Greg: [laughs] It's just a typo. 

Fay: I know. I know. 

Lauren: No, yeah, no, I call it top top because I’m ancient. 

Greg: [laughs]

Lauren: And the thing is, you're absolutely right, Fay. It's a really serious topic. The vetting piece especially. I would like to see vetting for banality on Tik Tok, I think that would be really helpful. And I understand, you know, the automated accounts they used were there to dig deeper on particular topics, but genuinely, who has two hours to spend training an algorithm on Tik Tok, especially with the kind of content that comes out of there?

Greg: Yeah, I mean, you're not even spending two hours though. I think that's the issue. Because of the way it's built, you're just, you know, not doom scrolling, but the scrolling is made to get people to just constantly consume, and there seems to be never ending content in there. And like you say, banality. I have not got Tik Tok. I've not. I know how it works and all that kind of stuff. I've seen content from Tik Tok, but that's usually the stuff that's kind of bubbled to the surface and left. From watching this report, I was like, oh my God, there is some terrible content on there, like absolutely shocking. Like, one of the things that they were talking about, which was around this topic of depression, was videos with music over the top, where somebody just narrates some kind of, not even a story, just some random clips and comments, over a car driving. Like, just a video of a road, and they're talking about relationships. And I was like, who the fuck watches this shit?

Fay: People like me, unfortunately. I will, I will fully put my hand up. I am obsessed with Tik Tok, I get to the point where like, as if you've been scrolling, and if you're on the app for an hour, it gives you a warning. 

Bex: Oh God. 

Fay: I know. So it’ll come up and be like…

Bex: Fay.

Fay: I know. Shut up. Lockdown. You spend like a solid hour on Tik Tok and it's like, do you think you need to do something else? But it is so true. The rabbit holes that you can go down are just mad. Like, at the weekend, I found, again, it was just like, oh no, this is insane, where they'd basically taken 911 calls in America and were playing the back and forth between, obviously, you know, the person calling and then the responder on the other end. I'm on the app the next day and I've just got police content, and I was like, oh my God, and yeah, go back and watch some cat videos. But it's, I don't know, it's such an interesting app. Because of the way that it's been taken up by the younger generation to mobilise, especially during the US election. There was a great story about, you know, where Tik Tok users bought tickets to a Trump rally, and then he just ended up with nobody there. And I think NHS staff are using Tik Tok at the moment as well to try and combat vaccine myths and reach younger people that way. So it's a really interesting platform in how that content can reach people. But there's also a lot of danger that comes with that as well.

Bex: It's really good that people can spread those messages, but I just have this creeping suspicion that more negative and bad and wrong and incorrect messages are being spread than good. And I don't know what we do about that. And obviously Tik Tok as a platform has been criticised so many times for a variety of different mistakes it's made and how it moderates and sorts out content. What flags for me in this article is their claim that they use disruptive content to prevent people going down a rabbit hole, but actually, this particular study found that the only "disruptive content", bunny quotes, that they found was advertisements, and I don't think that counts. I don't think you can just disrupt somebody's Tik Tok viewing with an advert and say that fixes it. I don't think that's how that works. So they basically just lied about the disruptive content thing, and that really upsets me, because we need disruptive content and people are going down these rabbit holes.

Greg: Yeah. And then I think they tried to push back and say, you know, it was automated, it wasn't a genuine person, but the Wall Street Journal provided evidence to show that they used complex automation. They weren't just focusing solely on single hashtags and things like that. So yeah, Tik Tok may have had some points, but I think ignoring those concerns is much, much worse.

Bex: Personally as well, and Fay, you said it too, that's definitely the way I use it. I get stuck on Tik Tok for longer than any other platform, and I feel stuck in it. And the reason why is because I'm like, this is rubbish, this is rubbish, this is rubbish, eventually I'll get to a good one. And that good one is like, oh, good one. Then this is rubbish again. And not only have I spent 20 minutes on Tik Tok when I should have been doing something else that would have enriched my life much more, I've also just spent twenty minutes on content I don't really like, just hoping for something good. And I just feel that's really dangerous. I'm not usually one to do that, but Tik Tok out of all the platforms really makes me do it. I don't scroll on any other platform the way that I do on Tik Tok. Danger zone.

Greg: What's the definition of madness? Hearing the same thing again and again and expecting a different result [laughs]

Fay: It's just so addictive though, because it is, as you say Bex, that scrolling to find the funny one that you then like, and then, you know, you do get massively stuck in a loop. But I think it's just such an interesting one. And there's been loads in the news recently about teenagers dying because they're taking part in different challenges on Tik Tok. So there's one called, like, the blackout challenge or something like that, and a 12 year old in the US passed away from taking part in it. And people are obviously doing stunts on Tik Tok when they're on holiday, and it's like health warnings having to be issued and all this kind of stuff. Like, I was scrolling on one and it was a guy doing a mock commentary of gymnastics for the Olympics, which was hilarious. But it also then carried a warning, like a health and safety warning, which was like, do not try these stunts at home. And that's the first time that I've seen that, and I was like, oh my God, is this also becoming something where, aside from hey, stop scrolling and go outside and breathe fresh air rather than sit on this goddamn app, are they also now having to issue health and safety warnings for stunts? Which apparently they are.

Lauren: It's such a good share, the content creator angle of this as well, it's really important, and obviously there's the more dangerous, like, participation. I also think as well that when you're dealing with sort of short bursts of content, where's the creativity in that, where's the ability to explore ideas and convey more complex topics? And because so much of Tik Tok is participating in themes and trends, like you flagged Fay, I think, gosh, where's the expression beyond that, and the ability to think more laterally or holistically about some of the stuff that people are passionate about? So yeah, all around, lots of questions.

Bex: However, the sea shanties are amazing. 

Greg: [laughs]

Lauren: What?

Bex: Go check it out. Don't go on Tik Tok.

Greg: [laughs]

Fay: I’ll send you a link to a Tik Tok video that was uploaded to Twitter so that you then don’t have to go on Tik Tok.

Bex: Ok. That’s sea shanties. Charity news of the week. An RNLI volunteer was verbally abused for doing their job. Another bad negative story, Greg, what's going on here?

Greg: Good segue, I like that [laughs] I wonder if RNLI do sea shanties and if they don’t….. 

Bex: They should. 

Greg: Guys, come on. Come on. That is like, you were just asking for that. Yeah, so not a great story. There's been a huge uptick in the number of refugees making the crossing over the Channel, and the RNLI, you know, the lifeboat service, has literally been doing its job in rescuing people from the sea. Because often these dinghies and boats are overloaded and, you know, ready to collapse or broken, all those kinds of things. So they're doing their job in helping people out at sea. And I mean, this wasn't even on the coast. It was at a lifeboat station by Waterloo Bridge in London. One of the crew members reported being verbally abused by two people. There's not really much detail about this. The RNLI have kept it fairly under wraps. But yeah, I just wanted to include it because of that whole idea of, you know, regardless of what you think about immigration or refugees, which isn't immigration, that's people escaping persecution in another country, not making an economic decision about, oh, I'm gonna go live here. Regardless of what you think about that, you shouldn't be having a go at people for literally doing their job. And these are volunteers. So yeah, it just struck me as ridiculous.

Bex: Yeah, I have to say, absolutely. Look, I saw the tweet that the RNLI did in response, I think it was in response to this, it feels like it was. They said our charity exists to save lives at sea. Our mission is to save everyone, like our life savers are compelled to help those in need without judgement of how they came to be in the water. I absolutely love that. I just think that's the perfect answer to that. Like shut up, this is what we do. It doesn't matter who they are. This is what we do. And I just think that's a really good response to it and good on them for standing up for it because it is a political issue, isn't it and it gets messy. But who knows who fundraises and gives them money and it could put their fundraising in jeopardy and stuff, but I just love that they just stuck with it and defended it.

Greg: Yeah, I just think, you know, you start doing this stuff and you're running the risk of saying, well, why are doctors treating certain groups of people? One minute it's, well, they shouldn't be treating murderers, and the next minute it's they shouldn't be treating black people or Jews. Like, you know, we're one step away from that. And the more I hear about these kinds of things where people are getting annoyed, the more I keep thinking we're flashing back 100 years, and, you know, we're gonna be looking at the Reich rising up. Because that didn't come about from one person's vision, it came about from people like this supporting these kinds of ideas and everybody else just stepping aside and letting them do it.

Lauren: Yeah, it does. It does. I was really shocked when I read this in the brief notes for today. And it just shows, culturally, where the slippery slope is starting to tilt down, if we're arguing with a charity for fulfilling its mission and delivering against what it is funded and commissioned to do. And it shows just how much that broader political context is slipping into everything, from Brexit to cutting down our aid budget internationally, and the conversations and kinds of actions happening around deportation. And, you know, it's pretty bad when you get to a place where charity workers are being abused. So I thought it was a really sad indictment of where we're heading, the kind of shift that's happening, and one we really need to dig into.

Fay: Yeah, it's a tricky one as well, because, you know, going back to our previous Tik Tok conversation about the good side and the bad side of social media, platforms like this unfortunately allow people to, I guess, what am I trying to say? Communicate an opinion that they probably would never say in front of somebody, you know, to somebody's face. There was the other story that came out over the weekend as well, I think it was a speech that took place in Trafalgar Square, and it was an ex-nurse on the podium talking about, like, hanging doctors and nurses, and obviously Twitter kind of blew up over that. But you've always got, you know, there's people behind these brand accounts as well. So like the RNLI, or like the NHS. The NHS is a great example, for what they've been through in the past year, I have no doubt. And it's just such a mad one, because having worked in and been on the receiving end of some of this stuff in a previous life, it is mad when I'm like, would you say that? You wouldn't say that to somebody's face. What gives you the right to say it online? I know that's wading into freedom of speech and all that kind of stuff. But on the flip side of the bad side of social media, the people that rallied for the RNLI, as well as, you know, the hate, the people that were there to rally for them, and the NHS one off the back of this story over the weekend as well, that does continue to give me a tiny, tiny sliver of hope on occasion about the good that can come out of social media, even though at some points it is only a sliver. 
And then I go back to Tik Tok to distract myself from, you know, how bad that is. But yeah. This is why I spend so much time on Tik Tok [laughs] just to distract myself from the pain of reality. That sounds really dramatic. 

Greg: [laughs] 

Fay: I'm totally fine. 

Greg: On that point, though, Fay, you're right. The thing to remember is why the negative stuff often is so loud, I think, the more and more I look at it recently. So last week, we talked about the disinformation dozen. Fay was in there. Along with Robert F. Kennedy's nephew. I'm saying you were one of the disinformation dozen, Fay.

Fay: Oh right.

Greg: [laughs]

Fay: I was like, I wasn’t on the podcast last week. I get it. You're making this fake news joke again.

Greg: [laughs] Yeah. Yeah. But the point was, a lot of that negative information was just from a small number of people. And I think that's the thing that we seem to be forgetting, that a lot of this is magnified. And actually, when you look at the numbers, the people that have these views are in the minority. But the problem is, the way that so much of this is structured, like social media, like our governments, is to give voice to those minorities. Look at what's happening in America at the minute with the way that voting rights are being suppressed, and actually that seems to be, you know, a powerful minority who shouldn't be able to do that. 

Bex: Yeah, no. But social media also suppresses those voices, as we've seen a lot as well. Like, we know that content from people who are talking about this sort of thing is being suppressed and not amplified. And I think a really good example of this, and we talked about it a few weeks ago, was the Tik Tok advertising thing, which was weird as well. It was blocking a load of content that was positive black content, but then white supremacy content was fine, it wasn't blocked. It was a really strange set of criteria, which we talked about two weeks ago, if you want to go and listen to us talk more about that. Because it was horrendous. So yeah, I don't know. We used to talk about social media being the great leveller, but I worry that it actually isn't at all anymore. 

Lauren: I agree. Democratically speaking, social media has helped historically to level voices, give marginalised groups a voice, and allow people to organise, and that's still happening. But you're right, because of social media platforms and their algorithms, plus this kind of savvy influencer culture, you see certain voices getting gamed above others, which is where the twelve dissenting voices around vaccinations get their peak. And then when you start to look at it, and I know we're going to talk about this, but when you look at what's happening with the Pegasus spyware, you've seen an increasing number of ways this gets abused. With Facebook groups, for example, you had what happened with the Myanmar military infiltrating those groups and influencing perceptions of the government in that country. Then you had data being utilised through Cambridge Analytica. And this spyware piece feels like the next layer of taking what was good, or what could be used for good, and taking that away. And because they're targeting activists and journalists, you know, dissenting voices, in countries like India, it's quite scary. There's so much suppression. And it ties into this bigger trend that's happening, this kind of erosion of human rights and democratic processes. I read a really brilliant article last week around trade, truth and power, and how, you know, these are being used to take back power where people, citizens, communities had started to claw it back. And you think about the stuff that's happened in the last few years in the UK. So the proroguing of parliament in 2019, pushing back against legislation that would help the safe running of democracy. You see what's happening with the attacks on the BBC in the last couple of years. 
Really scary stuff, where they're trying to discredit the institution. This last year, we've started to see moves towards suppressing protest, and we're starting to look at journalists being silenced or charged for speaking out of turn. And this spyware piece just feels like the technological mirror to what's happening, another way to suppress people's voice, and it's absolutely petrifying. And it speaks to, you know, what voices are we allowing to have space? How do we get to question and challenge and, even beyond that, think about alternative ways of doing things? Which technology is no longer helping.

Bex: Thanks, Lauren. Great segue into the next bit. So the spyware thing, the Pegasus thing is our tech news of the week. And Greg, for those who weren't here. We did talk about it last week. And for those who weren't here last week or those who haven't read about this or seen it anywhere [clears throat] Me. 

Greg: [laughs] 

Bex: Can you please give us a very brief update before we start talking about the latest developments?

Greg: Yeah, so we touched on it very briefly last week, in a kind of wider conversation with David Higgins about cyber security. And this was just breaking at the time. And it's huge. It is probably one of the biggest stories, it has global ramifications and is a really big piece. So the Guardian ran a lot of the news, but it's really been worked on by a global group, comprising charities like Amnesty and also other news agencies. And essentially what it focuses on is this product called Pegasus. And Pegasus is a product created by a tech company based in Israel called NSO. And it's essentially a spying tool. So governments would, according to NSO, go and be vetted by their teams. And the aim of it, following the likes of 9/11, was that whole classic view of, we could stop terrorists if we could access their phones and find out how they're communicating, what they're saying, what their plans are. But there's increasing use of encrypted tools like WhatsApp and others that mean that law enforcement can't access that information. And what Pegasus does is exploit zero-day weaknesses. And you don't even have to click on a link. So they can send you a video through WhatsApp, or an email with a link in it, and you don't even have to click it. And then it installs on your phone. And it's literally cliche spy movie stuff: they can turn your microphone on, they can track your location, they can turn your camera on, they can read absolutely everything on your phone, including any messages sent through encrypted channels. So pretty much absolutely all information is available. Now they were like, it's all above board. And it is. So you know, it's a legal organisation selling this to countries and governments all over the world. 
The positive thing about this is that that argument will now hold absolutely zero weight, because literally everything that any kind of activist or journalist warned could happen if these tools were built and used has happened. So there's been a number of articles written about this, but we've got things like Emmanuel Macron, you know, President of Paris. Of Paris? Of France. There are indications that it was used on his phone, along with former Prime Ministers and twenty of his Cabinet. You've got, in many cases, journalists and activists being targeted by governments of far right and fascist leanings. You've got cases in Mexico, which was among the first and most prolific customers of Pegasus, where basically it looks like everyone and their dog had access to the tool and were using it, right down to local governor level, to basically orchestrate whatever it was they were doing. You've got the ruling party in India using it against the opposition party. There are potential links to the Jamal Khashoggi case, and his assassination. Pretty much any kind of human rights issue globally that you could think of in the last 10 years has some potential link with this. So the case of the Saudi Princess that's been written about more recently, that happened a while ago, there are definite indications that this tool was used to track her. To the point where, knowing what they do about the events that happened, they could literally trace the actions of the security services in relation to what she was doing and their use of this tool to access her friends' and colleagues' phones in order to trace where she was. It is probably one of the biggest cyber security scandals that we've seen ever, and it has really far reaching implications, not just for this organisation, but also for what governments do about this in the future.

Bex: Who wants to start? It's a big one. I think my first thought is that it's so big, and I don't think anybody knows enough about it. And this is the kind of thing where, you know, I go on my rants on these podcasts with no evidence or research to back it up, and it always feels a little tinfoil hat. But this is it. Like it happens and it's here and it's actually happening. And this is really bad. And it has such wide reaching implications. And like, I feel speechless, and I don't really know what happens next or what we do about it. Or if we can do anything about it. Somebody say something good.

Lauren: I agree. I am struggling to get my head around the kind of ramifications of this. I mean, you can't go far beyond, like, people's deaths. The kind of using it to kind of play around with truth. But I tried to take a step back when it first came through and go, okay, what is this symptomatic of, or what is it saying at a big picture level about humanity and where we're moving? And it feels like a really strong backlash. That voice I was talking about earlier. So as marginalised groups, suppressed people, communities are starting to challenge capitalism. You know, you can see the fallout that's happened in the last week around the Wacky Races. The wacky race to space between Richard Branson and Bezos. And, you know, this kind of push against the money, the capitalism. You know, then there's equality, the wider movements around equality, Black Lives Matter, trans rights, people who traditionally wouldn't have had a voice or a say starting to wield at least some social power, social capital. And then, you know, people who are finding ways to have influence. And it just feels like the most abusive power play, to kind of remove voice, to weaponize truth, to use legal and political and economic sanctions to, like, suppress people. And it's the slippery slope you were talking about, Greg, earlier. And it's the most, in a kind of non-qualifying way, it's the most perfect, and I'm doing bunny ears, "perfect" example of how that power is being utilised and weaponised. And beyond that, I've not got much further.

Bex: Well, I think that's a positive take on it for me, anyway, because you reminded me of something I read, like, years ago, and I never remember where I read it, or who said it, or how true it was. But it really stuck with me to help me frame things positively. That generally things are going on an upward trend, you know, in history, towards doing good in some way or another. It always goes up. But there are little dips. There's always a backlash. When something's going well, there's a backlash. But on the whole, we're going up. And maybe we're just in that backlash piece now, and that definitely feels like that. And I always like to try and think of things in that way, that broadly we're going in a direction. There's just some people who need to get mad for a bit while we're doing it. And that makes me feel better, reminding me that hopefully we're generally [laughs]

Lauren: That might help me tonight as I'm trying to sleep [laughs] 

Bex: Just in a backlash, it'll be fine.

Fay: We'll get over it. Onwards and upwards. It is just completely mad, because it's one of those where it's like, where do you even begin to try and, like, deconstruct this? I mean, the fact that it's come out as, like, you know, kind of an exposé or whatever is really positive, you know, the fact that this has broken. Trying to find the positives in this, again, I'm just thinking back to the reaction to Cambridge Analytica. Like, back when that was coming out, there was almost this slight shift of people moving away from social. Again, thinking very specifically about Facebook. But, like, this kind of abuse of using tech in this way, hopefully further down the line, you know, when people have had time to digest what it means, and, you know, governments and other agencies or whatever are trying to tackle it, will it also then have, like, another kind of positive impact, that people will almost question tech more in the way that people have been doing in the past few years? You know, again, just thinking about my own personal experience, and the amount of friends that I've got who have left Facebook. You know, the rise of Signal and stuff like that. So I think, maybe, you know, both of you are right. Even though it's so mind blowing, and so like, oh my god, where do we even begin? Maybe that's kind of what humanity needs, to wake up and just realise, this is how the people in power are actually wielding the tools and the tech and, you know, the phones or whatever that you use every day to control you.
Like, we all know there's always some form of government control. You know, even if they say that there isn't, there is. But this is so entrenched across so many different countries that part of me does hope that it does cause a bit of a tech uprising, and people don't want to stand for this stuff anymore. In the same way that they began to do when, you know, Cambridge Analytica was coming out, and the way that algorithms have played into, like, disastrous events, like what happened at the Capitol in the US and all this kind of stuff. So on a positive note, maybe it might shift people's perceptions on how tech should be used.

Lauren: That's interesting. I have to admit, and I work in tech ethics and responsible tech, this is the first time that a massive cyber risk like this has really gut punched me. I think with things like Cambridge Analytica, it felt too remote, too distant. Like, it didn't have the same humanity-related components that this has. But the idea that something could land on your WhatsApp or in your phone without you even having to open it or click on it, that level of invasiveness feels frightening. And the other thing I was thinking about when you were talking, Fay, was, like, currently most governments, you know, that are looking to sway opinion and influence policy or, you know, challenge opponents, they use the media as a blunt weapon. And I was thinking, gosh, now they could be much more specific. And given how, you know, most governments across the world have handled the pandemic and the data they utilise, I was like, how effective are these suckers going to be? I mean, obviously, that's not the case with India, because there's some horrific, horrific things happening. But I think you're right, it really hits home in terms of the impact on humanity with this one. So there is potential for it to shift at least awareness in the first instance, if not, like, deeper understanding. So you're right, there are some, like, shoots of light around this.

Greg: It's all fun and games until somebody gets hurt. But actually, it's not just somebody, it's somebody they give a shit about. And the thing that's going to shift the focus on this is the fact that it is governments. That you've got, you know, the President of France, who's been potentially a target of this tool. And, you know, other politicians. And I think Edward Snowden made a really good point, which was, you know, these are weapons. This is on the level of, like, a nuclear or a chemical weapon, and there's zero policing around it. And the problem is, it's not like a chemical or nuclear weapon. You know, the access to the thing that you need to make these is much easier, it's much more accessible. You don't need to, you know, enrich uranium or have a bio lab. You can do this out of a garage. So we've got these weapons of mass destruction that anybody can build. And I think governments have kind of thought, well, it's fine, we're protected. And I think now they're going to realise, well, shit, we're not, actually. This is real evidence of that, you know.

Bex: It's the thing, though. They're doing bad as well. There's this thing in your notes, Greg, where apparently NSO were hatched to deal with Dubai. And the reason they gave was, oh, it's because of drug dealers. It's UK mobile numbers, but it's like, oh no, it's not in the UK, it's because we're looking at people using foreign SIM cards. Just drug dealers, you know, just our drug dealers. It's fine. And then the list came out, and it wasn't drug dealers at all. It was human rights activists. And they're like, well, dude, it's human rights activists, you got me. And then what happened? Fuck all, right? I'm fuming. Yeah. Talking about fuming, shall we move on to the rant of the week? Just to cheer things up a little bit. Lauren, you've got a really good one. Well, what I'd say is good. It's meaty.

Lauren: Yeah, there's definitely stuff to chew on. So yes, yesterday there was news that the government in the UK, in partnership with the NHS and apparently charities, you know, organisations working in the health space and particularly around food, have started to explore a points-based system where people who are obese can lose weight and win prizes. Vouchers. It's a very capitalist approach to a personal health issue. And I was quite aghast. One, because I am a fat person, and the idea that a couple of prizes could encourage me to address the root causes of my weight (I have a binge eating disorder, full disclosure), the very complex mental and emotional health issues underlying it, was flabbergasting. It shows a massive lack of understanding around why people are overweight or obese, as well as the kind of mental and emotional challenges around it. There are other considerations, like people's genetics. I come from a chunky family, we are prone to run to fat. But also it's environmental, and it felt so reductive, so infantilizing, to think of groups of people that are overweight and obese that way. And yes, you know, there is an upward trend around people gaining weight, and that is impacting their health. But by and large, it's a class issue. You know, people who experience poverty, people who struggle to afford to live, tend to have access to poor quality food. They have access to junk food. They live in environments where healthy options, you know, access to green spaces and places to exercise, aren't as readily available. And it just felt like there's been absolutely zero understanding of the root causes of obesity or being overweight. And just such a painful response. And, you know, the target figure for this is 6 million, and there are so many different ways you could spend that money in helping people, if they want to be helped.
Again, you know, this is a health issue and it comes down to people's personal stuff. So yeah, I was freaking livid.

Bex: Thank you, Lauren. I think it's such an important issue, and I'm really glad that you've felt able to tell your own personal story as well, because I think it really adds context. My personal story: I have continuously paid for personal trainers my entire life, and I really do feel like the entire health industry is stuck on this. We'll just do these things and it'll be fine. Like, just do this exercise and do this eating thing and it'll be fine. It's like, well, I definitely know the theory there. But there's other things, there's a lot of mental things for me that block me from eating well or exercising. And it's much, much more complicated than just saying, do this exercise and eat this thing. Like, I know that. Everyone knows that, right? We all know, generally, the gist of healthy eating and healthy exercise. And I think for the people that really struggle with it, it goes way beyond that. And I think that feels really obvious as well. So it's really shitty of the government to just reduce it to that, again. I feel like you're just constantly reduced to that. And I also feel what you're saying about the poverty stuff. Like, I was on benefits for a short period of my life, and it was a really, really short period of time, but I really felt that penny counting sort of thing. And there's so much other stuff on your mind. Like, thinking about what to eat is another mental thing to think about. And it just became very, very challenging to add that on top of looking for a job, on top of what bills we're going to pay this month. It's mental fatigue. As well as, you know, easy access: McDonald's at £1.99 for a meal is amazing. But on top of all that, what should I eat that's healthy, what meals should I put together? It's too much mental capacity on top of everything else.
So I really take that point as well, and feel like it's important not to reduce it the way the government has. I'm fuming about all this.

Fay: Oh, sorry.

Lauren: Sorry, go on Fay. 

Fay: No, no, go on. I was just gonna be like, yeah, I'm fuming too. But go on, I was listening and joined in to what you were saying.

Lauren: You know, with food, I don't think we have enough of an understanding about how it can play a role in numbing pain and helping people feel safe. Eating disorders, when it comes to mental health, are so poorly understood. And, you know, mental health services are desperately stretched within the NHS anyway. And then breaking it down, there's a whole range of eating disorders. So, you know, an experience of anorexia and bulimia, which is about control around food, often ties into society's expectations around how our bodies should look, and women are disproportionately impacted by eating disorders. Binge eating is very different. And, you know, for me, my experience of it has been more akin to addiction. So the way people use alcohol, and shopping, sex, anything that helps kind of numb pain and distract you from what's happening in your life. So you're right to think about people's capacity to make decisions and choices around food. And when you then think about the mechanisms that are in place around food and losing weight, by and large, that's dominated by the diet industry. And, you know, it's pretty well established at this point that in 95% of cases diets don't work. Actually, it's another form of disordered eating, if not bordering on eating disorders, and, you know, it can take people into that territory around controlling food and the desperate need to lose weight. So, like, it feels like another stick to beat fat people with. And when you think about the language being used, the war on obesity, like, good Lord, me and my fat ass is public enemy number one. Great, I already feel shitty, and now you've told me, you know, my fat body is not okay. You know, thank you for waging war on me, that's definitely gonna incentivise me to address the problem. And, you know, if I can win an iPad, all the better for it. Like, no, this is not the answer.
I'm just disappointed and frustrated that we can't do better. Especially when it comes to, you know, people's health.

Fay: First of all, you know, thank you for sharing your own personal story, Lauren. Because there's gonna be so many people out there who have personal stories about how they deal with health, or restrictions on their health, or what they eat, or whatever else. I think, you know, it's really important to share those, obviously when we feel comfortable, to make people feel like they're not on their own. The thing that, like, blows my mind with this, right, is that the entire world has pretty much just lived through probably the most damaging two years, in terms of, like, mental health and mental wellbeing, that the pandemic has put us in. Like, it's almost as if not just the UK population, but, you know, the world population is coming out of a trauma. It is. We've been living in fear, and feeling vulnerable, for, like, 18 months, two years. And, you know, whatever your personal stance is on, like, the country opening back up or not, or whatever, to then go from that to, okay, so you've all been locked inside for, like, two years, and now we just want you to get out there, lose weight, and we're just gonna dangle, like, a little iPad on a stick in front of you to make you run fucking faster. It's just so disjointed from how the population's mental health has been so damaged coming out of this pandemic. It just feels like it's going to take such a long time for people to heal, not only physically but also mentally, from this pandemic. So then to introduce something like this is just so tone deaf that it's laughable.

Bex: Yeah.

Fay: And that's kind of what makes me really angry. You know, the UK government's been very much like, we are prioritising mental health, we know you've all been in a shit situation, blah, blah, blah. And it's just a fucking mouthpiece. They just have no idea, like, the true impact of how this is gonna affect people. And then, you know, thinking about the structure of society anyway. As you sort of said, Lauren, for people who are living on the poverty line, it's like, okay, well, if you want people on the poverty line to eat healthier, or have access to fruit and veg or whatever, you need to make that accessible. Rather than, you know, as you said, Bex, oh, I can just go to Maccies and buy, like, a pound cheeseburger and that's my lunch. Like, it takes a whole system to make this work. Rather than just, oh, we're just gonna download an app. And the amount of people who've disabled the bloody NHS thing anyway, because they don't want to get pinged, it kind of makes me think, well, who really is gonna take this up? I mean, I'm sure there will be people out there who would want to take part, but from a very personal point of view, I just kind of think, oh my God, like, don't give me something else that I need to worry about.

Bex: Yeah. And what you just said about systems really stuck with me there, because I'm absolutely sick of people going, yo, people, fix this thing. And people are like, dude, I'm really trying, but it's really hard because the system fucking sets me up to fail. And no one says, hey, system, sort your shit out. Everyone just says, hey, individuals.

Fay: Yeah.

Bex: Sort your shit out, when it's actually just easier for everybody if the system stops being a dick, but no one looks at that level. And really, I'm fed up a lot lately, like system change, people actually just turning around and saying to other people, you need to change what you do. No, the system needs to.

Lauren: Yeah, it's personal responsibility. You're right. And it feels like social gaslighting. That's how I received this news yesterday. I'm very grateful for this opportunity to rant about it. And oh my God, it's not just me that’s livid.

Greg: I'd just like to say, as somebody who does actually use a rowing machine while dangling a tablet in front of me (I put it in front of me and have, like, TV on while I'm rowing, it's very relaxing), gamification does work. Like, it has been proven to work. But listening to this, the thing that bothers me about it, you know, is it feels uncomfortable. And I think the underlying reason why it feels uncomfortable is the implication that you just needed encouragement to sort that thing out. But it goes back to that issue with why diets don't work. The issue isn't that you just needed to get over that hump and everything would be fine. You know, the increasing guidance about diet and exercise is that it's a lifestyle change. Not a, here's a tablet and everything's fine now. So the people who do sign up to it, the vast majority of them will have tablets, and then have the same issues that they've always had, and never get any better. So I think that's where the problem is. Like, the kind of points, you know, behaviour change thing works to a limit. But it's the longer term stuff that this doesn't take into account.

Bex: That is a really good point, actually, to convince the government. What you've just said, to me, reads as: it's also a waste of money. It won't work, and it will waste your money, because you will give out iPads and then people still won't change their behaviour, because it wasn't a good enough incentive, and that's not how it works. So, yeah, it's going to be a waste of money as well. That might help change people's perspective. That's what matters to people, right? Oh, we are over time. So I'm going to move on very quickly to the "and finally", because we do need to be cheered up after this episode. So, a nice idea for wasteful tech, Greg. The Pod Swap. Tell us about this.

Greg: Yeah, so this is a US-only thing. I can't remember who shared this. But yeah, if you own a pair of AirPods, you are a complete dick. Because they are really bad for the environment, much the same as most of Apple's stuff. The reason being that they don't make this stuff to last, and, you know, you've got two batteries there that are going to be adding to the waste. But somebody thought, oh, hang on a minute, let's try and resolve that. So they've created this group in the US called The Pod Swap, where you can send back your AirPods and get a new pair, or a relatively new pair, and they refurbish them for you. So yeah, it's a really nice way of stopping that waste of pods, which is great. Hopefully we'll see something similar in the UK soon.

Bex: Oh, yes, I love this. Thank you, Pod Swap. And finally, I've got one last shout out to do on a positive note. Shout out to Pauline Roche, who has been on this podcast before. She has been nominated for a Tech for Good award at the Women in Tech awards. Pauline is so prolific in the world of tech for good, particularly in the West Midlands. A champion of small community groups. I would be here all day if I listed all of the cool stuff that she's involved with. But her Twitter is @paulineroche. You should go and follow her, and vote for her so she wins it and isn't just nominated. Well done, Pauline. I thought it was just a nice thing to shout out. That is all we have time for today. But thank you for listening. Lauren, how did that go? Was that all right?

Lauren: Yeah, I look like a parboiled tomato. I'm all, like, red and het up.

Bex: It’s how we like our guests [laughs]

Greg: [laughs]

Lauren: But I feel much better for having released all of that pent up anger and energy. Thank you.

Bex: And where can people find you on the internet? And do you want to promote anything that you're working on?

Lauren: You can find me at Lauren Coulman on Twitter. Note, though, my last name is spelled C O U L M A N. Everyone gets it wrong. That's okay. Now, I talked to you about the responsible tech stuff earlier. I would just say keep an eye out for that. We've got some really exciting stuff planned. And if you are interested in getting involved in doing tech more ethically in your organisation, call us.

Bex: Thank you so much for joining us. We've had fun. Not speaking on behalf of Fay and Greg, but I've had fun. Have you had fun, Fay and Greg?

Greg: Yeah. 

Fay: It was all right. I'm joking. Yeah. It's been great. 

Bex: It's fine. It's fine. [laughs] Listeners, what did you think? We'd love to hear your thoughts. Get in touch on Twitter at tech for good live, or email hello at tech for good dot live. And we'd love it if you gave us a nice iTunes review or told your mates about this podcast. Not that you have any mates, because you listen to this podcast. Thank you to our producers for producing this podcast. Also, don't forget that this podcast is run by volunteers. Gasp, shock, horror, who would have thought we were all volunteers, based on the quality of this production? [laughs] We survive on sponsorships, so do consider chipping in the price of a cup of coffee. That's at tech for good dot live forward slash donate, and thank you to those who've already donated, it's much appreciated. Finally, come and join our Tech For Good Slack, if that's your kind of thing. There are jobs and events posted there. I only said that because I was going to talk about all the jobs and events, but it would have taken ages, so just go find it yourself, because I'm really lazy. Yeah, we talk about people doing nice things with technology, or at least people trying to be a bit less shit. That is on, surprise surprise, tech for good dot live forward slash slack. And thank you to Podcast.co for hosting this podcast. That is it. Why are you all laughing? Did I say something weird?

Fay: It's the way you said Slack. Slack. Like Sean Connery.

All: [laughs]

Bex: Don’t know why I did that. Bye. 

Fay: Bye.

Greg: Bye.

Podcast: Harry Bailey