TFGL2020 - S1 - Ep7 - Abuse (An International Women's Day Special)
This is a special episode because it’s International Women’s Day! We’ve recorded a special episode on this day over the past few years. You can check them out on our website if that sounds like fun for you.
Although, they’re probably not relevant any more because everything is totally fine and ok in the world now. It’s 2020. We’ve fixed all the issues plaguing the human race. There is no more inequality. Everything is fine. Everyone is happy. Honest.
Warning: Because this topic is around gender and tech, we may be discussing subjects that might be distressing for some people, so take care when listening.
Our transcribed episodes are made possible by Happy Porch. Happy Porch provides strategy, technology and development for purpose driven organisations. Find out more about them at happyporch.com
Show notes
Joining host Bex are some excellent TFGL Women, Hera Hussain and Sacha Wynne.
Bex was thinking about gender and tech before it was cool. Now it’s cool, there are UN hackathons and AI roundtables a-plenty. Don’t get her started on that, she’s got lots to say and she will save it for another day!
Hera lives and breathes gender and tech so it’s going to be hard to rein her in, but I suppose it is International Women’s Day. She can go wild with her commentary. But only today - like corporate feminism - everything goes back to sweatshops and unequal pay tomorrow.
Sacha, like the many men who come out of the woodwork to put #HeForShe in their Twitter bio once a year, her presence is a rarity!
Discussed on the show:
Stat of the week
Charity news of the week
Parents beg Pornhub to remove explicit sex photos of underage teenagers more quickly
Tech news of the week
A Rant or Nice of the week? (or both…)
And finally…
eBay bans sellers from using coronavirus to 'profit from tragedies and disasters'.
Full show transcript
Introduction:
Hello and welcome to Tech For Good Live. Today is a special episode because it's International Women's Day. We've recorded a special episode on this day over the past few years. You can check them out on our website, if that sounds like fun for you. Although they're probably not relevant anymore, because everything is totally fine and okay in the world now. It's 2020. We've fixed all the issues plaguing the human race. There is no more inequality. Everything is fine. Everyone is happy. Honest.
Warning: Because this topic is around gender and tech, we may be discussing subjects that might be distressing for some people, so please take care when listening.
Bex: Joining me today we have Hera Hussain. Hera lives and breathes gender and tech, so it's gonna be hard to rein her in, but I suppose it is International Women's Day, she can go wild with her commentary, but only today... like corporate feminism, everything goes back to sweatshops and unequal pay tomorrow. Sorry Hera, I don't make the rules. How you doing?
Hera: I'm good. I love that. I love the fact Jonny didn’t write that and I did.
Bex: Sacha is joining us, and like the many men who come out of the woodwork to put #HeForShe in their Twitter bio once a year, her presence is a rarity. Welcome back, Sacha.
Sacha: Hello! And I love that introduction. It is so on topic.
Bex: It’s SO on topic!
Bex: And me, I’m Bex, I was thinking about gender and tech before it was cool. Now it's cool for UN Hackathons and AI... [stumbles] aye ey, ey aye, eyups! I don't know what I'm saying! [laughs] ...AI roundtables aplenty.
Don't get me started on that. I've got lots to say but I'll save it for another day. Hello, everybody!
Hera: Hey! A team podcast!
Bex: I know, no guest. Just us.
Hera: No one wants to talk about gender and tech, Bex!
Bex: WHAT? Why not!
[all laugh]
Bex: We're not in a normal studio as well. I feel like I should say that in case the sound quality isn’t as good.
Hera: Yeah, and it's not because we don't have a man to manage the production. [laughs] It's simply because there was a (studio) refurbishment happening and we have a temporary home for today.
Sacha: I feel like we are just digging a hole here though
Bex: But, actually, it made it more complicated to set up than usual and I did it all by myself. I did not need Paul. It was fine.
Hera: It's a win for our gender. Thank you!
[all laugh]
Stat of the Week
Bex: Stat of the week then, let's go. Let's move on. Let's talk about gender, and the first thing we're talking about is stalkerware. Apparently infections grew by 40%? I don't know what any of this means, Sacha!?
Sacha: Well...okay...so...it's a really interesting headline. So it’s Stalkerware infections grew by 40% in 2019 and from looking…
Bex: Did I say 2009..instead of 2019?
Sacha: It says 2019, but it says that in the article thing…
Bex: [laughs] it’s me, I sometimes can’t say 2019 and say 2009
Hera: It’s too hard for you [laughs] too many feelings attached to 2019.
Bex: Yeah I just can’t believe it’s 2019 already…
Hera: 2020…
Bex: OH MY GOD!
[all laugh]
Hera: Bex can set up a podcast room by herself but cannot remember calendar dates. Got it. [laughs]
So this article is really interesting, as Sacha was saying. It's interesting for me because I've been tracking this. As some of the listeners will know, I founded a tech and gender project called CHAYN, which is run by volunteers and works on gender-based violence, and stalkerware is something that we are actually starting to research this year with UCL.
Sacha: Exciting!
Hera: And this is the reason why, because, you know, we're seeing more and more cases of survivors who are either threatened by their partners, saying “I'll know exactly what you do”, and either it's an empty threat or it is actually happening, like we see in this article.
Bex: So... I'm gonna play stupid... I know there's things that can track where you go, but I presumed that was, like, parents putting it on kids' phones...
Hera: That's very similar. That's the scary thing. It's so similar, the software is often the same, made by the same manufacturers. It’s not regulated, right. And there are so many forums that... if you just Google ‘how can I hack my girlfriend?’ you'll see the amount of forums that come up…
Sacha: oh my God…
Hera: It is just people who are going in there being... Sometimes you just pay someone, like, you know, 50 bucks and they will do it for you! And what's happening is... some of these apps, they appear very, you know, really simple, they might look like a weather app, they might look like a game. You might not even see it. It's just a file that's on your phone, and they spy on not just your location, but who you're talking to, they mirror that so the person who wants to know can see it. They catch your passwords, they leak that, and we don't know what happens to the data. So it's not just that the person who was spying on you is getting the data. One of my concerns is that if it's done through an outfit that really does not care about ethics, if they're doing this... are they selling that information on to other companies as well?
So does that mean that you're not only... you know, your privacy is being jeopardised by your partner, who's supposed to respect that, and your life could be under threat, if someone is willing to go to those lengths, but also suddenly all these other people who buy data to find vulnerable people that they can then swindle and get money off... now (they) have access to your information!
Sacha: Absolutely. There's so many avenues for abuse with that, isn't there? I mean, and it's quite interesting to see that, obviously, like, the app store had still made efforts to completely eradicate it last year... no... now I'm thinking we're in 2019 as well, I don't know what's wrong with us today... [laughs]
Bex: 2018!
Sacha: [laughs] in 2018! So, yeah, like two years ago almost now. Yeah, I mean, they promised that it would be completely gone from the app ecosystem, but it seems like people are coming up with more... unique ways of getting this software onto people's phones, which again is even more disconcerting! If people are not just downloading it as an app... there are other ways? Like it being manually uploaded onto there? Like, what are the other ways? I mean, I just think that's even more terrifying, the fact that it's increased as well since the app store has done so much work to take it down, like, how is this happening?
Also, I imagine that these figures... so they say around 67,000, according to the company that did this study, and it's an antivirus maker that concluded this, around 67,000 unique users had used or had a stalkerware app installed somewhere on their phone. That figure... I imagine is way higher in reality. I mean, how would they know?
Hera: Yeah. This is just stuff that one manufacturer has tracked. So I have some good news though, which is that people are standing up and fighting, and I'm raising that in the research that we're doing. Eva Galperin, who is the director of cybersecurity at the Electronic Frontier Foundation, fondly referred to as the EFF, in the US... she has just released a TED talk, actually talking about the work that she's doing on standing up to these companies and these people creating spouseware or stalkerware. We hope to have Eva as one of our guests later this year.
Bex: That's very exciting.
Hera: Yes, she's amazing, and I think that it's time that our policy reflects the changing environment when it comes to technology. There's some interesting conversation happening around consent, not just about your spouse, but also about your children, which I think is where people... sometimes for them that’s a grey line? They’re like ‘Oh yeah, you should not spy on your partner, but it's fine if it's a kid’... so that's where it gets a bit murky. But I feel like... there is a direct link with some of the stuff around, like, state-sponsored or corporate espionage as well, because, you know, once we start thinking that it is okay to violate someone's privacy, whether they are an activist, a journalist, a citizen that you think may have nothing to hide, or a partner, you know, it's just not a legitimate argument to say that you have a right to someone's thoughts, their messages…
Bex: No. And I want to share a personal story with how easy I think it is, that someone can convince you this is a good thing. I had...this was 10 years ago, this is before stalkerware was even a thing...but if it was... I had a partner and I might have even allowed him to install this, like, willingly on my phone, because he was... he'd had a really bad relationship before me, she cheated on him, he was like in a really vulnerable place. He just wanted to make sure that I wasn't cheating on him, and that sounds really stupid, like saying out loud, but that was like after months and months and months he was, like, frightened to let me out and again... saying that out loud sounds really silly, but you all know me, I am not the sort of person that that would happen easily to, but...
Hera: There's no type of person that...
Bex: I know! And that's what I've learned from that, like it can happen to anyone. And I was sucked into this situation, so, you know, willingly. People will be in a really bad situation in a relationship and might let this happen willingly. And then also, how easy is it to get access to someone's phone even when it's not willingly!? So yeah, really dangerous software.
Hera: It's also so important, and this is one of the reasons why CHAYN released a project, which is like a micro-course platform, and one of the really simple hacks that we did is that when you sign up to those courses (because they're all around self-empowerment) we have fake headlines. So, you know, you might be looking at a course that's around ‘How do you collect evidence to prove that you're in a domestically abusive relationship’ for court, but you don't want your subject line to say ‘domestic abuse course evidence collection’, you know, because if your partner is most probably looking into what you're doing, you don't want them to see. So those headlines are then things like ‘Rihanna's new single’ or, like, you know, ‘40 ways you could use tinfoil for your hair care’, really random stuff like that. And, you know, of course it's not bulletproof security, because if someone opens that they'll see what it is, but it's just, like, one step….
Sacha: That’s great
Hera: ...and it’s ridiculous that we have to do this, but it is the state of play.
Sacha: Yeah, absolutely. And it's really good that that resource is becoming available for people, because I guess that's the thing, I think the people who are most, you know, vulnerable to this, they're gonna be least likely to know the mechanisms or be aware of resources that can help get this kind of stuff off their phone. That's really cool.
Charity News of the Week
Bex: On to Charity News of the week. So, there's a campaign by Solace Women's Aid, who are using Twitter’s hidden replies function to highlight those trapped in abusive relationships, so it links quite a lot to what we were talking about just then. Hera, can you tell me more about this?
Hera: Yes. Interestingly, I found this... of course, because I work in this gender and tech space I follow Solace Women's Aid… and I found their tweet! So that's why I wanted to include it. It’s a really interesting campaign. The tweet that kicks it off is like a selfie of a couple, right, everyone smiling. It looks good. Very normal picture on the street.
But then you start using the hidden replies... I think it's called the hidden replies function!? And you look at that and it starts telling you, you know, ‘is this as happy as it looks?’ And it starts helping you reconsider all the things that you may not see from the outside that could be going wrong with the relationship. Really, really important. It's like #hiddenabuse. It's the hidden replies function actually.
And I think it's so important, you know, when you have been... and this is the sad thing... when you've been working in the domestic abuse space for some time, I've been doing this for seven years, you just sometimes get a sense for it. And I have a friend who constantly...she posts on Instagram a lot, and I knew about her relationship, she was very close to me. She told me nothing despite the fact that I work in this, but from her Instagram, I just knew something was wrong because she was posting obsessively, happy pictures from her and her partner from old vacations. And none of them seemed recent. And there's just like a couple of things that just hit me.
And I was like, there's just something wrong. So I messaged her and talked to her, she wouldn't say anything. Called her again a few weeks later, and it just all came out.
Sacha: Wow
Hera: Yeah, she was in a very, like, controlling relationship. It wasn't, you know... again, it's like the hidden abuse, so it wasn't like she was being hit by her partner, it was things like her feelings, or things like, you know, making her do chores, restricting her economic opportunities. All those things that happen that are easy to miss, things like having a joint bank account and not letting her have her own bank account. You know, she was a very well educated person who had a thriving career when the relationship started. You know, she was very beautiful, that's what she heard, and then slowly that wasn't what she was hearing.
It's all those things that, when you're going out with your friends and they’re...you know, you're meeting someone at a party or like everything's good with that couple. There might be things that you don't see, so that's what the campaign is about. I think it's a really excellent use. And this is a partnership between Twitter and Solace, which is why it's verified [laughs] I checked it! I was like, it's a verified account and it just started two days ago, how is that possible!? That’s why!
[all laugh]
Bex: Skills.
Hera:Yeah!
Bex: But that's quite interesting, because, for me, I read “hidden abuse”, and there's two types of hidden as well, isn't there? One is kind of hidden to the outside world... but it also might initially be hidden to the person who is in the abusive relationship.
Sacha: Absolutely...
Hera: That’s true…
Sacha: Yeah. And even... that is it. And I think that's overlooked more than anything else, isn't it really, that element to it
Hera: And shout out to the agency Stack, who came up with this concept.
Bex: Yeah. Yes, absolutely. Thank you. We'll move on to the next charity news, which also isn't very nice! But Sacha... parents are begging Pornhub to remove explicit sex photos of underage teenagers…?
Sacha: Oh, my gosh. I mean, when I read this headline, I was just... I mean, face in hands kind of situation, like, I mean, I guess it's also not that surprising, unfortunately. But yes, parents are at the moment having to BEG... I know it's maybe kind of a tabloid-y headline, but “beg”, essentially, Pornhub to remove explicit sex photos of their children, or underage teenage children. Yeah, which is just crazy to me. Number one, why are they even having to beg? For starters. I mean, if you are an organisation, like any organisation, you have an obligation to protect people from abuse. I mean, we have harassment policies, we have anti-trafficking policies. We have safety at work policies. But they didn't have any kind of policies... you know? Why's there not an awareness? Like, there's obviously a gap there for abuse within the porn industry. That's well known. You know, there's high risk there actually for things to go wrong. It seems insane that their response is so slow to this.
Hera: And how are these ending up there, Sacha? How are these... like, they are not consented to...
Sacha: So the article is saying that basically, kids, you know, are not accessing Pornhub, they're completely unrelated to the website. They’re on Snapchat, they're on social media platforms like Snapchat, and hackers, who are able to scrape that data, are taking images, maybe they're sending them between each other in private, but somehow they managed to get a screenshot of it and upload it onto the site.
Hera: And there's another really important part here, which is around image-based abuse, which is also called revenge porn. But I beg everyone not to call it that, because that doesn't show you that there's a lack of consent. Image-based abuse is the correct terminology here, because it is not intended to be porn when it is filmed. So if you consent to having this filmed, or pictures with your partner, because you trust them, and then those are then leaked... and it's not always partners, that's the sad thing. And I want to give a shout out to a couple of really excellent organisations that are working on this, which is something that's so important, because what we're talking about is horrific.
So, Rose Kalemba is a survivor who was in the news a couple of weeks ago, and I want to shout out her bravery and her courage. She wrote a blog last year telling about her experience: she was raped as a 14 year old, and it was a very, very brutal attack, and it was videotaped. Then people started contacting her, saying that they had found videos of her on Pornhub, so she contacted them and they didn't reply. Then she, completely distraught, pretended to be a lawyer and emailed. So she made up an email address, and they replied!
Sacha: Wow
Hera: They still didn't take it down, I think, or it took some time. Really, really horrific, you know, and shout out to the BBC for really covering it, Megha Mohan did a really amazing story detailing this, and for Rose... what she says is that, you know, in this case, like, there's no blame. There's never blame when your images end up on the Internet when they're not intended to… but you know, what happened to her was... it was a crime! And the fact that Pornhub did not take it down. Once the story came out, a lot of people contacted her saying that the same thing had happened to them. And Pornhub have, like, a very typical statement saying ‘we don't agree with this’ and ‘We would always take it down…’
Sacha: Oh, all well and good! But why is there not a process?
Hera: Yes! And why...you know, like she said that when she emailed pretending to be a lawyer... it was removed within 48 hours.
And when she was emailing as a victim, there was no response. That is shocking!
Sacha: No, it's power, isn't it? It's like, this person has power, I'm gonna listen to them. This person is vulnerable and a victim, I don't give a shit... you know…
Hera: And a 14 year old! Even things like that, it’s illegal to have that! You know, I just don't understand! You know, if Pornhub wants to be a sustainable business in the long term... like, forget everything else, you would want to be on the right side of the law!
Bex: Yeah it's already like on dodgy ground, because porn, right? So at least, you know, don't do underage porn, that’s like, your first step…
Hera: And instead of campaign videos and getting a lot of credit for, like, talking about how Pornhub is very, very environmentally friendly, you know, invest in your customer support. In fact, I'm sure you don't even need to invest in it, just make it a priority. You have people who do customer support when it's a lawyer emailing you... respond to survivors! It's awful, you know, something that can tarnish someone's mental health for a very long time, especially when, you know, it was coming from children and is ending up there.
I just think it's something that needs strict regulation.
Sacha: Absolutely. There isn't enough light shone on this, I don't think.
Bex: No, you know what, we talk about how Facebook doesn't take down stuff, and not even sexually explicit or underage stuff. Like, we get mad at Facebook for not policing its content better. And actually, we don't shine the light on things like Pornhub, because it's a complicated issue, right? But surely, actually, Pornhub should be standing up to this sort of stuff way more than... we should be talking about that…
Sacha: They’re just kind of hiding in the background. They think ‘Well, no one's looking as yet, so we're just gonna keep our mouths shut’ but it's just looking awful. It's looking fucking awful. [laughs] I don’t know why I'm swearing so much today!
Bex: Because it’s FUCKING SHIT!
Hera: It’s fucking shit. Yeah let’s all swear.
We also want to give a shout out to Kate Isaacs, who has set up the #NotYourPorn campaign. You know, we'd love to have her, she would have loved to be on the show as well, but we couldn't make it work. She's been working with these parents about getting these images down. So there are amazing people out there working on this and like, you know, putting themselves in really vulnerable positions and talking about their stories. We should be, as Bex said... We need to talk about these, even though they're uncomfortable and it's brutal to talk about things like, image based abuse, or someone's personal tragedy. But people are doing that. So why aren't lawmakers doing something about this?
Bex: Absolutely. And I think part of the problem here is that it’s porn. We don't talk about porn, you know?
Sacha: Yeah
Bex: And actually, the porn industry has a lot to answer for. There's a lot of ethical things going on that need to be worked on within the porn industry itself. You know, even if we want to say, actually, we're not against porn as a concept... you know, some people might be, but, you know, I'm not really. And there are better places to find porn. You know, if you are someone who consumes porn, maybe don't consume it on Pornhub, because they are not being ethical…
Hera: Exactly, that's what #NotYourPorn says. And actually, Rose Kalemba, the survivor I was talking about, I was just so, like, you know, taken aback by her courage. And someone messaged her on Twitter, so it's public, saying ‘I am one of the frequent visitors of Pornhub and I'm so shocked by your story, I will not be going there anymore.’ And she said, you know what? You are the kind of person I want to talk to, because people who don't go to Pornhub... I don't need to convince them... who I need to convince, and what will make a difference to Pornhub, is the people that go there. And tell them, and not just, like, you know, a tweet if you're not comfortable, but please email them. Tell them this is an important issue to you as a customer of Pornhub, as a producer, someone who is in business with Pornhub. You know, the people who host Pornhub. Any contractor of Pornhub, raise your voice. This is a really important issue.
Bex: Yeah, move away. Go somewhere else. Take your business elsewhere, and I'm sure there are other places. I guess there's probably a list of more ethical places to consume your porn out there? I mean, I've just looked on Bustle. There is a list, I will tweet that, we will tweet that, of ethical porn providers. So, you know, you can go somewhere else! Which doesn't have underage pictures of children that, you know, they haven't consented to.
Hera: And you know what, it made me think of something... How much coverage have you seen of the fact that in India there are, like, rape videos sold in stores, and how much…
Sacha: What!?
Hera: There are! And there's a lot of coverage of that. And how much coverage have you seen of Pornhub and the fact that rape footage is ending up there? Not enough!
Sacha: None!
Bex: But you know, obviously, actual rape footage should not be on there. But they are also, this is a really complicated issue, because they're proliferating that as well..
Hera: Yes.
Bex: ...loads of stuff on Pornhub is underage BUT NOT REALLY UNDERAGE, and, like, rape BUT NOT REALLY RAPE... and I know we're allowed fantasies, but I don't know... It is a complicated message that we're putting out there with the types of porn that we're putting on, and actually, if some of it is actually rape footage... Jesus fucking Christ. Like, what!?
Hera: What kind of a human do you have to be to say I'm sorry I’m not going to remove that…
Sacha: How are they getting views?! I mean, that's really, really, really worrying that it even gets views, but a big part of this is that people shy away from it because it's porn, and we need to stop doing that, because atrocities are happening here.
Bex: Yeah. Okay. So Hera, next story! Bristol Post becomes one of the first newspapers to share abuse and threats made to Greta?
Hera: Yeah, this is a fascinating story, and, you know, we're continuing on the theme of...again, violence against….
Bex: Everyone is shit! The theme of ‘Everyone is Shit’ [laughs]
Hera: Yeah, and like, sexual assault. This is really, really upsetting. We've spoken about how Greta Thunberg has received a completely unwarranted amount of abuse,
Sacha: It’s unbelievable.
Hera: Mostly from middle aged people and mostly men. And she announces she's gonna come to Bristol and, you know, it created a lot of buzz in Bristol. And then, you know, like, things go, she received a lot of abuse. But this time around, there was a particularly upsetting, and grotesque, like, cruel, all those adjectives, picture of her drawn that showed her basically being raped.
Sacha: [audible gasp]
Hera: Yeah
Bex: Oh what?
Sacha: I didn't see that, oh my gosh!
Hera: Yes. And then it was being circulated everywhere. Within this, people were…
Sacha: [laughs] I hate this world!
Hera: Yeah it's really, really upsetting! And so a lot of people were enjoying that and thought that was very funny.
Sacha: Oh gosh, it’s so...
Hera: I just don't get it! I don’t understand men.
Sacha: Psychologically I just don't understand how this fits into people’s...like...how they can just hold that in their minds...
Hera: I just don't... it's so upsetting. So Bristol Live, and seriously, I just couldn't believe it when I saw this on Twitter. Bristol Live actually found people who were posting such content, who were from Bristol, and published that...
Sacha: I love that.
Hera: Right? It's just like it just was just amazing. So Tristan Cork, senior reporter at Bristol Live...you are our favourite person.
Sacha: Woo!
Hera: This is an excellent thing to do. It is quite bold because, you know, we are aware that you should not be piling on people, like average people especially. And we've had some discussion on the podcast before, around Jameela Jamil and how that's going to shape a lot of the arguments around inviting people to pile on. But in this case, I have to say…
Bex: Pile. On.
Hera: Pile on [laughs]. So... you know what really made me upset... One of the first people who was tweeting this, or putting it on Facebook, had a filter that said “Be Kind”! Oh my God!
Bex: [laughs] I’ve just seen that! I’m fuming! You be kind!
Hera: Yes. Like, what the...? And there were stars! You will never walk alone. What the hell? You were saying the most awful things about a teenager, a child!
Sacha: It's mad. So what, I mean, not to get too bogged down in details….What kind of things were people saying then? This is like the Facebook comments?
Hera: Yeah, it's really like obscene Facebook comments like talking about physical violence, sexual violence, just... yeah…
Sacha: It's so perplexing because you have someone like David Attenborough, who is talking about the exact same things and we all love David Attenborough, well most people do, and yet Greta is talking about the same things, and even from... not just even from the public, but from the tabloids....she is vilified!
Hera: Yeah
Sacha: So often, and it doesn't, you know... if you've got that kind of leniency, you're not agreeing with what she's thinking, and you want to do that, it does not appear that David gets the same kind of rap for it...
Hera: You know what it reminds me of…? Malala.
Sacha: Yeah
Hera: You know when Malala became this, like, world figure, like I saw the same reaction from Pakistan. As a young person from Pakistan, I was so ashamed. And there were a lot of people in my circles that were saying, you know, like, ‘we must be the only country to do this’... and well, to all my fellow Pakistanis, we're not! Something to be less embarrassed about! But shame on you, World! Like, seriously, shame on you! What is up with you putting all the weight, like, the responsibility on these young people who are trying to say something and do something good? Completely unwarranted!
Sacha: Yeah. It's so strange, isn't it? It's just like, what is going on psychologically there. It’s just like…
Hera: I think it's something about being in the spotlight. And I'm sure there's a lot of research being done on this. And if there is and someone from our community knows about it, tell us, you know, tweet us, share it with us on Slack. There must be something about how, when someone is in the spotlight and you’re constantly seeing them in like ‘Press Mode’...?
Sacha: Especially if it was a young girl, for some reason it just...infuriates people!
Hera: Or a person of colour! Again, I'm very surprised, like you know, you could have a lot of problems with Jameela Jamil, I'm kind of like, completely shocked by the kind of abuse she gets....and it's like, “well you're not talking about X Y and Z”....she's an actress. Who is trying to do something around body positivity, like she doesn't have to talk about every single thing that’s happening in the world. It's just really, really strange. So I think there must be something around our neural networks, which is like getting triggered by people being...or looking a certain way? Like, as if you think they need to be answerable to you? Same thing with Meghan Markle, how people thought….or even Kate Middleton…. around her pregnancy and her pictures... people being like, you know “well, we pay for their house or their palace and that means we deserve to have access to whatever we want”. No, you do not!
Bex: No! [laughs]
Hera: No, no, just very weird. Humans are shit.
Bex: They are shit
Sacha: Agreed
Bex: But someone's actually done something nice for a change, in Tech News of the Week...
Tech News of the Week
Bex: So a mega donor has bought a stake... so he's a Republican mega donor, and he's bought a stake in Twitter in order to make change.
Hera: Do you believe that?
Bex: I mean, yeah, maybe not. But let’s let people… I don’t know! I like the idea that maybe he will make change...
Hera: [laughs]
Bex: I do think...like...money corrupts, power corrupts, but...
Hera: This is our tech news of the week, right? And I think what I heard from some of my friends in the US is a lot of worry rather than celebration, because of the person who we're talking about... it's like, the companies in which they have bought stakes before, some of them are, you know, completely fine things like Waterstones, but there are some other more nefarious investments....
Bex: oh really…? Oh I didn’t look into it.
Hera: I think that's why people were a bit concerned. But to be fair, I haven't done a completely...big analysis. I'll read something funny from one of the articles from The Guardian which talks about this. Paul Singer, is the name of the person who's invested, and the former President of Argentina, Cristina Fernández de Kirchner, memorably described Singer as the “Vulture Lord”, a “Bloodsucker” and a “financial terrorist”
Bex: Oh, wow. [laughs]
Hera: So yeah, Tech For Good Live community...Tell us what you think about this news!
Bex: I mean, I just like that somebody's fucking with Jack. I mean, maybe it might not work out
[All laugh]
Hera: What if it makes matters worse!?
Bex: I don't know, maybe it will, but Jack will be mad. And that'll make me slightly happy, I suppose.
[All laugh]
Rant or Nice of the Week
Bex: OK, so we're going to do a Rant of the Week, but then we do have nice news after it. So, Sacha, the Vatican has joined forces with some tech giants. What is happening?
Sacha: Oh, my goodness. Yeah, this is a crazy headline. So essentially, exactly that, the Vatican has joined up with Microsoft and IBM to kind of raise awareness around AI and facial recognition and things…
Bex: aye ey, ey aye, eyup eyup! [laughs]
Sacha: [laughs] Yeah, absolutely,
Hera: what!? What is my response [laughs]
Sacha: Well, it's just like, yeah, it’s one of those things, isn't it, that you don't really know what to take from it.
Because obviously, the Vatican has come out, it comes out every now and then with its position on various things that are happening in the world, including things that are happening in the science community and the tech community.
Hera: and now AI? [laughs]
Sacha: And now it's decided to kind of focus attention on AI. And the reason behind this is… they've signed a document, it’s from the officials, this is including Microsoft as well... they've signed a document called “Rome Call for AI Ethics”...
Hera: [laughs] Another ethics list!
Sacha: Yeah, like I don't know how much it’s going to contribute in this space, but, you know, I think…okay, interesting…
Hera: Is this “Peak AI” - That's my question to us...
Bex: [laughs - repeats Hera’s question] Hopefully!
Hera: I think we may have peaked.
Sacha: I don't know! I mean, I think it's just an interesting one, isn't it? You know, it's great that they are again shining a light on the potential for abuses in this area. They have called AI “a potential new form of barbarianism”...
Bex: [laughs] What!?
Sacha: To quote…
Bex: that’s just weird!
Sacha: I know! So, yeah, the language that’s used is really interesting as well, so they're basically saying, kind of like, you know, the potential for disinformation campaigns is toxic because it can endanger institutions that aim to preserve peaceful co-existence. So it's kind of coming at the angle of like…
Bex: [sarcastically] You know what? Before the Pope said it...I didn't really think about that.
[All laugh]
Sacha: I know. We needed the pope to kind of clarify this. But I think, you know, it may take this topic to a wider audience that isn't engaged in this. Maybe, I don't know.
Hera: So we've discussed AI ethics a lot and we've talked about accountability and transparency in other pods. The interesting thing here is that the article says (This is the article from Fortune by Jeremy Kahn) - the article says that under transparency, the Rome Call [stutters], the Rome Call...
Bex: Which is weird, right?
Hera: Yeah!
Bex: Sounds like the roll call, I don’t get it!
Hera: which is why my reading was wrong! The Rome Call says... in principle, AI systems must be explainable. But it doesn't say how…. So, it doesn't specify what kind of transparency, and, you know, we've discussed it before in the pod, we talk about “who's made it?”, you know, “How is it developed?”, “What is the actual algorithm like?”, “How can you hold it accountable if you don't…” Those are the kind of things that you want, right?
Sacha: Yeah.
Hera: Like, even thinking in Biblical terms, and I'm not Christian, so, like, if someone else is, please help me out here.
Bex: Where’s Jonny when we need him? [laughs]
Hera: There must be….you want to have some language around, like judgement and like responsibility…
Bex: Just from a...I mean good branding right!? That’s just tone of voice!
[all laugh]
Hera: Exactly! So...then later on the article says... that neither company, so this is talking about Microsoft and IBM, bastions of transparency...Neither company, however, has said whether they will take steps to police the applications customers run on their cloud services. So what is this, empty words?
Sacha: Mmm. Probably.
Hera: What do you think?
Sacha: Yeah, I don't know. I think this is the kind of thing we'll see maybe down the line, whether it's had an effect. I think a lot of people will take note of this because it is quite an unusual thing as well. I think, in countries that are predominantly Christian as well, I think it could... I don't know. I think there's potential for it to at least be awareness-raising, but in terms of it actually giving some kind of practical guidance or intervening or something along those lines that maybe is more direct, I don't... I'm not too sure.
Hera: And I don't know if I'm wrong, but didn’t I read an article that said there are more than 200 AI and ethics statements?
Bex: Oh, yeah, that doesn’t surprise me, that’s probably true
Hera: Wow, this is like number 201! [laughs] There's probably 200 more that have come out since that article came out.
[all laugh]
Hera: I just... you know what, again, it's really hard to judge people or things before they come out. But I feel like we've been, we are just drained from having seen so much of this. Like, what would you call it? You know like “Stamping?”
Sacha: It’s like greenwashing, but it's like a different kind of…
Hera: It's open washing. I wonder how much money they spent on this? I keep thinking about the money because I'm like, why did they do this? And how much money have you spent on organising this, having that meeting, then doing the publicity for it... would it have been better to just invest in, like, better practices inside the institutions?
Bex: I think we already know by now what we should and shouldn't be doing with AI, right? Because, as you say, there's already 200 policies on that, one of which I think Microsoft already worked on, so I don't know why they've done this second one with the Pope, because it sounds cool or whatever, but yeah, why aren't we doing something now? We've got the principles, we have that... what's next?
Hera: And this is very similar to how I feel about hackathons for gender-based violence. Because I'm like, you know what? I've done so many of them, and I apologise to everyone that, you know, was partnered with me and CHAYN on that, we were learning, and I realised that the only projects in all the hacks that I've been in that have been successful... new ideas, not people with existing organisations coming there... have been the ones that CHAYN have incubated within the organisation. None of them, from the 10 or 12 hackathons I've organised, have ever made anything, and I think about all the time and effort we spent, because it's a volunteer organisation, it would have been way better if we had just kept volunteering on our existing projects. And then I was talking to a funder, who will not be named, and I was talking to them about some of the work that CHAYN’s doing, and they were like, “Oh, that's great! We've organised five hackathons with Bloomberg, X, Y and Z and we spent so much money”, and I'm guessing it's like hundreds of thousands of dollars, “and this is exactly the kind of work we wanted to do”. And, well, you know who doesn't have hundreds of thousands of dollars!? The actual organisations that are doing this! [laughs] So you have these “vanity hackathons”... like these “vanity AI lists”.
Bex: Ohh, “Vanity Hackathons”, that's a really good term for it.
Hera: And I think the AI lists are just the same. There are organisations actually, like, trying to hold these algorithms to account. FUND THEM! FUND ACTIVISM. FUND RESEARCH. Stop funding these vanity lists, policies, procedures that will go out of date very quickly, or if they don't go out of date, no one will use them.
Bex: Boom! Mic drop. Don't actually drop the mic. Thank you. Ermm… Block Party. This is good! So this is a good one! Hera, somebody is developing an app for blocking online harassment. That's good.
Hera: Yes. So, Tracy Chou, who is a very outspoken, you know, entrepreneur who worked in Silicon Valley and has invested in a lot of companies before... and she actually gets harassed A LOT online because she talks about diversity in tech.
Bex: Oh right, obviously…
Hera: She's amazing, really amazing. And she's created an organisation called Block Party. And what it's going to do is, it's going to allow you to sort people who reply to your tweets into those that are potential bots, or that are potential, like, harassers. It looks at things like… does this person have a profile photo? How many accounts does this account follow?
Sacha: Wow
Hera: It's things like that. You'd think that Twitter would make it, but no, Tracy had to go and do it. And... I'm lucky enough that I got in there, they have a waiting list, but they let me in, because I know Tracy and Jasmina, who also works there. Shout out to Jasmina, who actually did research on Twitter harassment when she was at Amnesty. So they are working on this, and it's just started its waiting list. So please, if you, or someone who you know, gets a lot of abuse online, which is basically anyone who talks about anything important, please sign up to the waiting list. It's https://www.blockpartyapp.com - there are FAQs that you can look at which explain what it's about.
So we have good news. Something good is happening.
Sacha: That is really good news actually
Hera: Right? Like, people standing up. And you know what? Interestingly, when Tracy tweeted about this, she got so much abuse online, she was like “Well, you're just proving the case for the startup that I have!”
Sacha: [laughs] Why would you abuse someone who's doing stuff like this? Why would you need to do that, even then? Like, this is ridiculous! Oh my gosh, it's like “yeah, test it on me!”... yeah, like, that's just...
Bex: I think Harry would give that, like, 10 out of 10 for tech for good. So we rate things. He's really good. He's got this rating system on whether what we've talked about is tech for good or tech for bad, and that was definitely tech for good.
Hera: Yeah, we try to do that because you know what? There is a lot of good happening in the general tech space, and I think, because we talk a lot about the problems, sometimes it's easy to forget that there is...the resistance is strong and it's both from within large organisations and government, but also from the outside from consumers, activists, survivors... and things are not as bleak.
And Finally…
Bex: No! And that leads very well into our ‘and finally…’, which is looking at a big organisation that has done a nice thing this week. So eBay has banned sellers from using coronavirus to profit from tragedies and disasters. That's good that they're taking those steps to ban sellers. What kind of profits are people... having?
Hera: So I don't know about the profits, but what they've said is that they're completely banning items, things like face masks, which DO NOT WORK, people! Those do not work...
Bex: [laughs]
Hera: ...hand sanitiser gels, again, they do not work. This is not a bacterial infection. This is, like... it's a virus. Or wipes. And interestingly, you know what people were doing…? Sellers were putting in the titles, they were putting “Coronavirus”... because they knew that people were gonna be searching for it, so that's how, like, eBay has taken action.
Sacha: Oh I see! Right, okay.
Hera: so they were putting COVID-19, and things like that.
Sacha: They were just typing that in? They were putting that in there, like, that was the subject title…
Hera: So they were searching Google saying, like, what can save me from coronavirus? How can I protect myself? And they were coming up with these. And there is a stat there which says that... I’ll read it from the article: “the move comes as coronavirus fears deepened in the UK, leading supermarket sales of hand sanitiser to soar 255% year on year, according to new figures from market research firm Kantar.”
Sacha: That's not even the best way to deal with that either, as well...
Hera: It doesn't at all! It's not “the best way”, it does not deal with it! And also, despite what people say, I have read such bogus advice... like, do not rub essential oils in your different cavities.
Bex: [gasp] Oh my god!
Hera: [laughs] it does not work! Do not buy crystals for your room, it’s not going to ward off...
Bex: At least they look nice.
Hera: They look nice, yes, but do not spend hundreds of pounds on it. Do not go to Goop for advice. Full stop.
Sacha: [laughs]
Bex: But thank you to eBay for your public service. That is a nice note to stop on. You made a little note... well, it isn't that nice. Yes, it is nice, Hera, thank you for that note at the bottom of the script. But yeah, we're outta time. Everyone happy with that? I think we ended on a high, even though there was some turmoil mid-podcast?
Hera: Yes, definitely.
Bex: Excellent. Listeners, what did you think? We’d love to hear your thoughts. Get in touch on twitter @techforgoodlive or Email at hello@techforgood.live
We’d love it if you gave us a nice iTunes review and told your mates about this podcast!
If you want to get more involved in our community, you can join our Slack channel. We also have a newsletter. Just visit the Tech for Good Live website to find out how.
Thanks to the wonderful podcast.co for hosting our podcast on their podcast space.
Hera: So much podcast!
Bex: All the podcasts! Podcast, podcast, podcast!
We’d also like to thank Happy Porch for their sponsorship which will allow us to have our episodes transcribed, helping us be more inclusive and accessible. Happy Porch provides strategy, technology and development for purpose driven organisations. You can find out more about them at happyporch.com Bye!
Sacha & Hera: Bye!