TFGL2021 - S2 - Ep 6 - More Carrot, Less Stick
Welcome to this episode of the Tech For Good Live podcast.
Joining host Bex on the show this week we have TFGL team members Greg Ashton and Ben Light.
And our special guest is Stuart Pearson, Chief Digital Officer at Citizens Advice Manchester.
Transcript
Bex: Hello fellow winners and losers, welcome to another episode of the Tech for Good Live podcast. If you're new here, it's a podcast all about using technology to have a positive impact on the world. Sometimes though, we talk about the opposite of that, you know, like Elon Musk's plan for a slave army on Mars, or Zuckerberg's plans for a slave army on Earth, or Tim Cook's plans for a beautifully branded slave army. Today's podcast will surely be upbeat though, because we're finally in the month of May. Pre-slaughter lambs are in the fields, flowers are blooming everywhere except in my flower beds. That's not a euphemism. We're going to have a quick update on Basecamp, where everything is absolutely fine. We'll delve into the numbers and statistics and whatnot about charities and their emissions and energy use in 2020. Also, absolutely fine. And in the tech news of the month, we'll talk about how police have been given the go-ahead for using facial recognition technology. I'm sure that will absolutely not lead to horror stories of racism. So yeah, uplifting content from beginning to end. And with that, let's get podcasting. Joining me today we have Greg Ashton back on the pod yet again. Greg, if you were sent back from the future, what would you do in the present day, or maybe anytime in the past?
Greg: Dunno. My usual thing is kill them all and start again. So probably that. With some huge tech.
Bex: Great start to the podcast. Very positive so far.
Ben: So, like Thanos, after the sort of revelation that half wasn't enough?
Greg: Yeah, yeah, why stop there? [laughs]
Bex: Ben Light is also back on the pod. Ben, if you were sent back from the future, what would you do other than undo whatever horrors Greg has previously done?
Ben: [laughs] That would be quite a mission.
Greg: [laughs]
Ben: I’d go for the head, ummm, so wait, am I going back in the past from now or I’ve been in the future?
Bex: You are going… either. Either. Either.
Greg: Or both.
Ben: Ooooh, Ahhh I dunno. I'll go back to the past. I'd like to go to I think Ancient Greece because you could very easily be a scientist in ancient Greece, because there was kind of no experimental method. So you could just, if you could write and communicate, you can make something up. You could say like, the sun lights the world by sending golden thread and wrapping it around everything which disintegrates at nighttime because it gets cold. And that will be science for like the next 400 years till someone said something different. So I like that idea.
Bex: Oh, you've all got great ideas. Hello, listeners. I'm Bex. And I just thought I'd go to Lovebox festival in 2009 because it was a really great lineup, and I couldn't go. Which was not very insightful or thoughtful or doing anything important in the world. But that was the first thing that came to mind. I mean, we're in lockdown.
Greg: The opportunity to time travel and you want to go to Lovebox [laughs]
Bex: Everybody wants to go dancing, right? We also have a guest with us today. Stuart Pearson, who is Chief Digital Officer at Citizens Advice Manchester. Hello, Stuart.
Stuart: Hi all.
Bex: Do you have an answer to today's time travel question?
Stuart: It possibly would be to go back in time twenty minutes and Google good answers if someone asks you about going back in time, what would you do. I think that would be my response.
Bex: That is an excellent answer. Thank you very much. So how is it being Chief Digital Officer at Citizens Advice Manchester.
Stuart: So it's a very grandiose title for a very small team. So Citizens Advice is a network of 266 independent charities across England and Wales that deliver free, confidential, independent, impartial advice to people. Wherever you are, whatever the problem, you know, Citizens Advice will try and help you. Citizens Advice Manchester is one of the larger ones. So we support over 60,000 people a year with their issues and obviously, we deliver that across many channels, face to face, telephone, digital channels, and my job is to make sure our teams have the digital tools to deliver that service well, and that we have the digital channels that clients can access. And it's been a very interesting twelve months, having to pivot the service to remote delivery in basically two weeks. So yeah, it's been fun times.
Bex: Yeah, it sounds it.
Greg: Yeah, I was gonna say, it feels like Citizens Advice is going to be one of them organisations that nobody realises is a charity and everybody uses on a regular basis.
Stuart: It's a really interesting thing, because Citizens Advice did some market research a few years ago around that issue. About not being seen as a charity. So we hardly get any money in donations. Most of our funding is from local authorities or some national contracts. But actually, when they asked people, most people thought we were part of the government. And when they learned we were a charity, what happened then was people's trust in the advice we gave them went down, thinking that it would just be kind of volunteers and amateurs giving ideas off the top of their head, which obviously it isn't. It's the kind of incredibly substantial information resource that we use. So yeah, it is hard, because we're not really a hearts and minds charity. You know, it's not a very emotive brand. But I think the last twelve months... You know, we were formed in 1939, at the outbreak of the war, and COVID has been kind of a bit of a replication of that, where we have been a go-to resource for a lot of people around new issues like furlough, and kind of, you know, redundancy and people's rights. So yeah, it's definitely kind of put us back on the map as an essential kind of service.
Bex: Thanks for joining us Stuart, in the midst of all of your digital transformation and advice giving. We'll move straight on to stat of the week. So it's a quick update on Basecamp. I wasn't here last week. I presume we talked about the whole Basecamp thing last time. But now a third of the staff have resigned. Greg, do you want to give us a quick update on what's happened and where we're at?
Greg: Yeah, we may have mentioned this. This absolute debacle. Yeah. So it felt like we had to kind of bring this up, because there was a lot of supposition and we were trying to work out what may have happened behind the scenes, and a lot more information's come out now. And in making this announcement, they gave their staff the opportunity to get out. And a third of the staff took them up on that offer. So they're surprisingly small, I thought they'd be bigger, but it was 57 staff, and 18, last time I saw, had said that they were going to leave.
Bex: Yeah, I think one of the things I learned this week was that Basecamp was much smaller than I thought it was. I don't know why I thought there would be thousands of staff, not 57. Yeah, interesting. And the blog post that was the start of all of this, for those who hadn't seen it or hadn't listened to our previous episodes, it was a controversial blog post talking about Basecamp's new philosophy and how they were going to act. But what we learned this week was that's how they announced it to all of their staff as well. The first thing anybody heard about it was this external blog post, which just doesn't feel too good.
Greg: Yeah, yeah. And the details behind it... Last time I said it sounds like they've kind of thrown the baby out with the bathwater, and that's exactly what it sounds like. They were trying to resolve some internal struggles around certain discourses happening internally, and the founders have just decided, nah, I'm not happy with this, so we're just going to stop all discourse of any kind. You're just going to do your job.
Stuart: Yeah, obviously it comes off the back of a lot of kind of pushback about workers trying to unionise, doesn't it? It's happened with Google and obviously Amazon. It's strange that they kind of badge themselves as cuddly Silicon Valley types and none of them are like that, really, when it comes down to it.
Greg: Yeah, it's alright being libertarian until they start being libertarian about their workplace and then suddenly the boss has decided it's not so good.
Bex: It's interesting, isn't it? Because like, it's really nice to see people, staff acting on their morals and ethics and actually leaving a job, which is a massive thing to do for some people. But does that leave just all of the other white supremacists who agreed with this guy and is it like siloing and diversifying, not diversifying, the opposite of that. You know, I mean, creating these groups of like, really, there's no one to push back. Yeah, no one, there’s no one to push back in that organisation now about bad decisions being made. But yeah.
Ben: This is the first time we're hearing about this story. It's kind of...
Greg: Wow.
Ben: Is it Jason Friday, Jason Fried?
Greg: Yeah.
Ben: He was always someone who was kind of, like, massively popular sort of eight, nine years ago, possibly more, you know, this idea about kind of like productivity and non-interruption by sort of managers and meetings, asynchronous communication, and empowering individual workers. Have I missed... it feels like something's happened between the lines here in terms of what this problematic discourse was about. So is there more detail as to what was being said, or not said, that kind of caused this blog post to arrive?
Greg: So basically, it sounds like it's all rooted from one point, but I feel like it's probably more complex than that. Internally they had a list of funny names, which had been around kind of from the start. Then some people had started mentioning that perhaps that list of funny names was potentially offensive to certain groups, because obviously you're gonna have certain minorities who would come up more often in the funny names list. Which, you know, from the UK, we're all like, well, what's wrong with funny names, because we've got some right funny names over here that have absolutely no kind of reference to the person's background. So that kind of triggered a series of groups and committees who were going to look at these things, and it sounds to me, based on what I've seen, that there was no easy answer to this. People were getting caught up, the different sides could never meet in the middle. And it sounds to me like they've just kind of gone, well, if you're not going to agree, we're just not going to let you do it anymore. So they cancelled everything. All the committees. All the other things. And said, right, if you're not going to get on, we're going to take away all your toys and you're just going to come to work and do your work, and you're not going to talk about anything outside of work.
Ben: It feels like to get that, small company like that, to get that kind of reaction, there must have been like a lot of entrenched positions. Like this must have been kind of like something they've been struggling with for some time. Because on the face of it sort of, you know, trying to have a reasonable policy about what kind of discourse is permissible and which isn’t, doesn't seem like it's a kind of resigning issue for a third of the company. But you feel that there's kind of like a massive battle going on prior to this.
Greg: Yeah. And I don't feel like it is going to be a load of white supremacists left there, Bex, because the impression I get is that there was unhappiness from all sides. So I think what you're going to end up with is a load of introverts who just really don't want to have any kind of discourse with anyone anyway. And yeah, they'll really, really love this whole thing where they can just quietly come in and not speak to anyone and just do their jobs.
Ben: I feel that was like the premise of Basecamp and a lot of the stuff that was said at the start of it was to kind of create that, that space where you didn't have to speak to them [laughs]
Bex: And also, I mean, yeah, there's so much in this. Since then, though, I don't think it is people who just don't want to talk about stuff in work time. Like, since then they had this big company meeting, and at that company meeting there were some really dodgy things expressed, I would say, suggesting that the founders don't have nice, like, opinions or thoughts. And also there were much wider things going on in this blog post, which everyone seems to have ditched, that I also think were, like, questionable. And I forget what they were. That was two weeks ago now. But I also think it's hilarious, though, that, like, rule one of introducing big company changes: like, consult your staff maybe before going public with it. And these guys have written like five books on how to manage teams. So yeah, don't buy those books.
Greg: Yeah, I mean, they're a few years old now, so I'm sure their sales have not been great for the last few years. But everybody's looking at it now.
Ben: I’ve got to check which ones I’ve bought. I've definitely got [laughs]
Greg: [laughs] Yeah, crazy. Crazy.
Stuart: Do you think they worked on the European Super League as well because it feels like the same PR firm was involved?
Greg: [laughs] Yeah. Oh, my God. Yeah. Let's check who did that press release. It could well be Jason Fried at the bottom there, having a bad week.
Bex: Anyway, we're all talking about Basecamp, but I bet most listeners don't even know what Basecamp is. It's a project management system. But, um, on that note, there's also loads of people who still don't even have the internet, never mind know what Basecamp is. So the next stat of the week is that Ofcom has found 1.5 million homes are still offline in the UK. Greg, do you want to tell us more?
Greg: Yeah, we see this every couple of years, it still comes up. And, you know, there are certain groups that kind of keep rattling on about this saying, look, we need to resolve this, and it never seems to go away. It just kind of continues there in the background. So I mean, you've got things like 20% of children did not always have access to a device for learning while the schools were closed. You've got the other end of the spectrum as well, where you've got people over 65 who won't or can't use the internet. But then you've got other things like 91% of 12 to 15 year olds having their own smartphones. But that still means you've got 9% that don't have access to those devices. So it just feels like this is a problem that we can't seem to deal with.
Bex: That won’t go away?
Greg: Yeah.
Bex: And I think these people as well are often people who need more help for whatever reason. And most of the charities that I've worked with during lockdown, you know, they're doing digital transformation as much as they can, but they really struggle to do it. Because in the back of their mind, they're like, what about all these people we can't reach? What about them? What about them, though? And I'm like, you know what, I don't think anybody's quite answered that. I don't know how you dealt with this, Stuart.
Stuart: So digital exclusion is a real kind of passion of mine. And it's such an issue. And at Citizens Advice we've been aware of it for a number of years, because more and more services have moved digital by default, particularly like Universal Credit, or kind of housing benefit. Council tax is all digital first in Manchester. And more and more people were coming to us where digital skills was the presenting issue rather than the actual advice issue. So what was interesting is, in like 2019, we were obviously working with the council on, you know, how can we address this. And the council's focus in terms of digital skills was the kind of higher end, you know, coding, programming, all kinds of cyber. They weren't really interested in, can they get an email address, you know, can they use a mouse. But COVID has really shone a light on that, because we've been working with the City Council, helping, you know, people do online shopping while we've had lockdown, for all these people where all these things we just take for granted. For this cohort of people it's a real challenge, you know, particularly the ones who've never had an interest in going online or have been scared about going online, because the only stories they ever hear about going online are what they see on the news, which is never good news stories about going online. So it has been about building up their trust and confidence to access digital. Because, like you say, if you're offline there are so many disadvantages, you know, everything's more expensive if you shop offline versus online, and kind of social exclusion, and even health, you know, getting in contact with the GP as well now. So it is a big issue, and something that I know Greater Manchester has set up a taskforce for, with the ambition of getting 100% of people, you know, digitally confident.
So we're part of that, and I know a lot of other charities and tech for good organisations are really involved in that. So it's a big issue. I'll get off my soapbox now.
Bex: But what do you think about those who don't want to be online, and that's okay? Like, is that ever allowed? I can't figure it out.
Stuart: I just think it's harder and harder to be offline. So I think there needs to be more carrot and less stick. That's the challenge. It is about really selling the benefits of being online. And I think too much of it has been stick, you know: to get your Universal Credit you need to be online; with COVID, to submit your results you need to be online; in order to get into a bar, you need to have a QR code on a smartphone. I think what's been missing has been selling the benefits of being online, in terms of kind of the offline premium, and social interaction. But it is a challenge.
Bex: Like, a caveat: I do totally agree with you, but I also, like, have this "the internet is an awful place and we should probably just delete it, actually" thing. And I can't swim and have no desire to swim. But people keep telling me, oh, you have to learn how to swim. No, you have to do it, you have to learn how to swim. And it's the same with the accountancy in my business. Like, I'm really bad at it. I don't like it. So I just got an accountant. But people kept saying, you know what, you want to go on this accountancy course, you should go on an accounting course. At what point do you just go, I don't want to? Like [laughs] I don't know, I'm just being flippant today, because people tried to force me on courses today.
Ben: But you’ll agree yourself Bex that those were some complex tax returns to do. If you were to ask for any help, then, you know, you'll probably rethink a lot of that.
Bex: [laughs] I suppose if the government flooded the UK, I would have to learn how to swim. And I suppose that's what we've done is services online. And now we have to learn how to use online. But that's not the problem of the user. That's the problem of the people designing services.
Ben: I suppose what you're talking about is the position of someone who's making a conscious choice that they don't necessarily want to have to do these things online, as opposed to a position where they haven't actually got the opportunity in a meaningful way. So there's two separate things there. You pulled out some statistics from this, Greg. I think the question of access is kind of a tricky and nuanced one. What jumps out at me is 20% of children did not have access to a device for online learning while schools were closed. Those are the kind of things that really infuriate me, because that's an absolute kind of open-and-shut case: for people's education to continue in some form or other, they need some form of infrastructure. So that has to be provided. We've made all kinds of concessions and grants and funds available throughout this pandemic. That has to be the most obvious one, because you're literally disadvantaging people for the rest of their lives, given the kind of importance of academic progress. It's like having a classroom where some students are bathed in darkness and can barely make out what the teacher is saying. So there's a big, more complex picture around the provision of services and sort of co-opting people into accessing them. But then there's a secondary one, which is just that we shouldn't be putting young people in a position where they haven't got access to education, for something as simple as buying a shitload of computers, basically. It's not a complex problem. But yeah, that was astonishing to me.
Bex: It kind of is, though. Like, I don't know why I'm being so difficult about this today, because normally I'm advocating in exactly the same way that you all are about this. But this idea of opportunity in a meaningful way really stuck with me when you said that, and I was like, oh yeah, of course. Like, I'm talking about stuff I don't want to do, but there's other people who literally just can't. Maybe there's some people who want to swim and don't have access to a swimming pool, for example, if we're gonna stick with that really tedious analogy. But actually, you know, there's some people who don't have the headspace to think about learning a new skill when they are desperately trying to think about where their next meal is gonna come from. So, you know, it is more complicated than just giving somebody a device or teaching people the skills. And I know that's not everybody, and there is, like, a low-hanging-fruit sort of user group where you could just give them a device and it would be fine. I was working with an organisation the other day that deals with young homeless people, and they offer them housing. And the housing doesn't have internet, the young people don't have data, and they're trying to figure out how to solve this. And they were like, you know what, we would happily put a device into every single house and we would nail it to the wall or whatever needs to be done. But with the type of people they were working with, and their complex lives, they will break it, they will lose it, they will sell it. So, like, I don't know, it just all of a sudden became, like, a super complicated issue. Even if they are willing to plough money and devices and time into these people, there's still a subset of people who are gonna struggle. But yeah, I dunno why I'm being so difficult about it today.
Ben: No it’s fair. I think we've both worked on projects where physical infrastructure changes have left communities behind. Like I remember speaking to you about this a couple years ago. So you do see it from that point of view, where suddenly the ground has shifted, and people have to then shift with it in a way that's like really burdensome, you know, or difficult, or else that was it. That's the plan, the buses have gone. There's simply no more connection unless you move out or you kind of make your own peace in the isolated place. So yeah, no, I don't disagree at all. I’m just furious about computers for children.
Bex: [laughs] Well, I think we all agree about computers for all children.
Ben: Yeah, that's like the Simpsons thing, isn't it? Where somebody says, won't somebody please think of the children. But in this case, I totally agree with the sentiment.
Greg: [laughs]
Bex: Well, we have spent far too long on the stats this week. So we'll move on to the charity news of the week. Almost a third of the top 100 UK charities did not comply with government regulations regarding emissions and energy use in 2020. This doesn't sound good, Greg.
Greg: Nope.
Bex: But I mean, does anybody?
Greg: Right. Well, yeah, I guess that's kind of the thing, a lot of people probably don't. But this is one of those reports which has come from an organisation who do this kind of stuff, so obviously they want to bang the drum about sustainability. Which makes it sound like I don't agree with this stuff, but, you know, we'll get onto that. But yeah, they estimated 593 UK charities are required to comply with the streamlined energy and carbon reporting legislation and disclose their carbon emissions. As you said, a third of the top 100 UK charities did not comply. Only seven of the assessed charities had set net zero emission targets in line with the UK Government's ambitions. And only 38% of the top 100 UK charities disclosed their organisation's emissions. So yeah, I'm not surprised, to be honest. But, not to put Stuart on the spot here, I'd be really interested to hear from somebody who works at a charity, when they hear about that, what they think about these stats. And there's no specification on what these charities do, but this idea that they, you know, they should be focusing on sustainability.
Ben: As the voice of all charities for the UK [laughs]
Greg: Yeah. Yeah.
Stuart: I think your heart sinks when you read another kind of "how terrible charities are" story. It's just obviously so difficult to run a charity, you know, in terms of you've got your core mission, and you've got all these regulations and compliance, and then you just have layer upon layer of things. So obviously things are going to slip, do you know what I mean? So that'd be my kind of thought. And again, because you're not sure what these charities are, you know, what their carbon footprints are, it's going to be more relevant to some charities than other charities. And it just feels like battering charities to me. So I, obviously, am completely for, you know, the environment, and, you know, we all need to do better, but it's just one of those headlines that I can see certain people liking.
Greg: Yeah, I feel like it’s, I think it's less of a bash, more of a come buy our services.
Bex: [laughs] Which is even worse.
Greg: Which is potentially worse. But yeah, I've been involved in conversations with charities where they're having really heated debates, at the highest levels, about the scope of that charity's focus. And, you know, even down to the slightest nuance in things around human rights or things around animal rights and stuff like that. They have really heated conversations about whether they should or shouldn't pay attention to particular things. So I think everybody should be acting in a sustainable and environmentally friendly way, but we're dealing with organisations that, you know, have to be really, really focused in the work that they do, and I do question it, like, you know, they can barely even improve things so that their own day-to-day lives are better. Getting them to think about sustainability in a much grander scheme just seems like a very big ask.
Ben: 2020 is also quite a difficult year to be, you know, kind of auditing and assessing people on things that aren't absolutely essential to staying in existence and helping people and continuing your mission in a totally different landscape. That being said, Greg, I take your overall point, which is that this is kind of fairly existentially important to all of us. We went to a conference that I wish to forget, several years ago, you'll know what I'm talking about. And there was a really amazing digital ethicist who was giving a headline talk about sort of design. His name will also come back to mind. But he did a fantastic talk, where he sat in front of a hall full of digital designers and said, okay, digital design, stop thinking about that. The planet is doomed and you need to address this before you put your digital skills to anything else. And I thought that was an absolutely brilliant statement.
Bex: This was Cennydd Bowles at Camp Digital. But yes.
Ben: Yes, exactly. You know, I think it's good, every now and again, to get that kind of kicking about this, because it is not an easy thing to keep front and centre when you're really passionate about a cause and your kind of whole time and expertise and experience is devoted to that. So I welcome the kicking that's just happened; it's certainly prompted me to think of it.
Bex: Yeah, and definitely those who can, should, and they should support those who can't. There are big inequalities, even within the charity sector, between those who have the space and headspace to think about this and those that don't. So maybe we can support each other on it, hey, and end on a positive. [laughs] It's an idea. And tech news of the week. Greg, we've got two of these. Do you want to do both, looking at the time?
Greg: Yeah, it's, it's interlinked. So the EU has laid out their recommendations for rules around AI. They've produced a 108 page document, basically outlining their harmonised rules, and it's probably the most detailed attempt at regulating AI that any government has done so far. There's a huge amount of detail. I tried reading it, but I'm gonna need a lot more time to get through it because it is very complex. But at a high level, what they've done is split it into two areas. So you've got what they classify as high-risk AI, which has more stringent rules and legislation around it. That might be things like autonomous vehicles or medical machinery. And then you have your lower-risk AI, which would have more of a voluntary system, though they could, if they wanted to show willing, adopt the high-risk rules. And that might be something like a filter for spam messages and things like that. Now, off the back of this, the new Commissioner for biometrics does not agree with certain aspects of the proposed rules. One of these was around a restriction on the use of facial recognition. The new Commissioner feels that this should be left up to law enforcement to make decisions around the use of facial recognition. So what do we think? Big complex things.
Bex: My answer is no [laughs]
Ben: [laughs] I think, as with most rules, the question here is: how would you possibly legislate for something like this? It's something we talk about all the time. Like, is legislation as it stands any kind of vehicle to, you know, get involved in contributing to this kind of conversation, when the technology literally moves so fast, and not only the interpretation but the understanding and eventual application of the law is never going to catch up to anything close to, you know, the mainstream adoption, never mind the cutting edge, up until the point where you're dealing with massive incumbents with hugely well-embedded systems? We're kind of seeing this in tech in general. Is there anything in there, Greg, thank you very much for reading some of it, is there anything in it that kind of makes you think this might be more successful?
Greg: I think, you know, it's kind of centred, much the same as GDPR was, around rights. So inalienable human rights. They kind of call back to GDPR in it, and they call back to, you know, the bill of human rights as a starting point, to kind of say, right, whatever happens, this is your focus, this is your centre point: always start from the person's human rights. And then, through descriptions of use, they try to describe, okay, here's an example, here's where the rules would apply. So I think it's more like a framework, and then, if you come up with a new, different use, you could still reference the rules. But I think you're right, I think there is still that limitation where we will have things that don't necessarily sit within that high and low risk, or maybe sit across both. But yeah, I certainly agree, and I've said it before on this, that the police shouldn't be the ones. There needs to be a conversation about this, and it shouldn't be the responsibility of law enforcement to make that decision, because it's not fair on them. These are big questions, and if we're putting it in the hands of the people that are trying to protect us, they're the ones that are going to come a cropper when things go tits up, because we've not properly had these discussions about what AI should and shouldn't do. And yeah, I just think it's unfair to put that responsibility on their shoulders.
Ben: I like the idea, as you stated there Greg, of working backwards from the human rights thing. It creates this weird grey area where it's like, there are no rules here, but if we think it's come to infringing human rights, that's when massive penalties will come into force. That kind of weird space almost creates enough of an incentive for people to properly consider this as these different use cases emerge and the technology advances. It doesn't feel 100% satisfactory, but maybe you can take a similar approach to GDPR. If it's suitably enforced, and I'm not going to say policed, specifically enforced, that might create some kind of framework that people have to be cognizant of working within.
Greg: That's exactly it. It's making sure that people take the time, and the real effort, to think these things through. You'll have seen loads of articles about the soon-to-be global trolley problem being developed in the world of autonomous vehicles: if a car is going to hit a person, and then swerves and will hit another person, what does the vehicle do? If one of these vehicles hits somebody, we're going to need the organisation that built it to be able to say, well, this is what we thought, and yes, somebody died, it's not the best outcome, but somebody was going to die anyway and this was our thought process. We're going to have to have those organisations thinking through these pretty horrible, bad, bad situations, but it needs to happen. And they need to have the space to do that, and then be able to evidence that they've had those thoughts and those conversations. So it's not just a thing for the tech industry. It's a cultural thing for society as well, to start having the space for these whoa, what if things do go wrong conversations.
Stuart: It's obviously such a complex space. But I think GDPR is a good example, where GDPR doesn't solve all the issues of data ownership and data controls, but at least there's a framework in place to build on. And I think this is the same kind of thing, isn't it? This isn't going to solve all the issues of AI or the potential future pitfalls, but at least it's a start. And it's a start by an independent group, rather than the tech companies doing it themselves.
Bex: I have no insight on AI, I'm tired of AI today, so I'm not gonna add anything. Everything you said was great. Let's move on to the rant of the week. So Stuart, this has come from you. You wanted to talk about pricing structures for the small charity sector within tech.
Stuart: Yeah, it's just something that's happened to me a few times over the last few months. As a small or medium-sized charity, when you engage with larger tech companies and they come back to you with their nonprofit pricing structure, it's often just completely unrealistic for a lot of them. It probably works for national charities, but we recently had some actual conversations with Okta and Auth0 around some solutions, and they come back and give you a price and you're just like, well, that's the end of the conversation pretty quickly. And I just think there's a lack of understanding, going back to the comments around the green issues for charities, that so much of a charity's time is spent on funding, on how do we survive? And when you try to do IT security and ensure your data's protected, and you engage with these organisations, the price they come back with is just a non-starter. How much can we do in terms of the different elements of tech if they can't understand that you just can't afford that? Any type of fee is a challenge for charities.
Bex: Yeah. I guess it does price you out of entire solutions that you could be working on for social good using these big tech products, and it's just not helping. I know there's a bunch of people who are like, well, obviously there are cheaper versions of some things. Some things there just aren't, some things there are. But then there's also this idea of, piece together a load of free or cheap technology to create a solution, and there you go, why don't you just do that? And it does work sometimes. But I think the challenge is that you end up with a broken, piecemeal solution that might not quite work how you need it to, and you need the knowledge and the skills, and you probably need a developer to piece it all together. So actually, is it much cheaper? Probably, yes. But it's also a bit shit. So what's the end point there? Something that cost a lot, took a lot of time, and is now not very good. Yeah.
Ben: Yeah, the cost can come later on, in the operation of it, and the friction it creates, and the headspace it takes to maintain and update, and all these things. And again, it is infuriating, because it's not like shipping two copies of a piece of software costs anything comparable. You're not shipping two different sets of books to two different addresses, you're literally pressing a button and altering a few digits. I know there's more to it than that, but there's not that much more to it than that. It should mean charities get exceptionally good pricing. The only thing preventing that is the classification of customers by who can spend money. That's basically the barrier there. So yeah, I feel your pain on that one.
Greg: I think for some smaller charities I work with, a big part of it is down to digital literacy and the understanding of how much some things do cost, like a new website, and some of the values I've seen thrown around for what people think a website costs to build. Yes, money is tight, but I think that understanding can help to prioritise where the money is spent.
Bex: Yeah, there is that on the flip side, isn't there? There's a couple of sides to this coin.
Greg: Yeah. And then we get the crazy sales emails [laughs], which were mentioned before we started recording.
Bex: Yeah. And then there are also enterprise solutions, or organisations that will overcharge for what they're doing, and charities who don't know better will buy into it, and you're left with it. At times, as an agency, I've picked up projects and been like, what have you done? It's too late now, you're tied into a five-year deal, we're just going to have to work with it. Again, that comes down to that literacy level. The digital literacy levels of the people purchasing the software.
Stuart: I do think the stuff the tech for good sector does is really important for charities, so they can engage with trusted partners. I've been in the charity sector long enough to have seen so many white elephants developed, where they get a big pot of money from Big Lottery to go away and build an app, or a referral system or whatever, and then, once the project funding ends, there's no additional funding for support or redevelopment or security patches, and it's just a waste of money. Whereas if they'd engaged with trusted partners and looked at a more sustainable model, it could have lasted. I think that's really key.
Greg: Yeah, that funding model's always been a big issue. I know the team at Reason Digital were trying to find different ways of getting around it, because it's fine getting that initial seed funding, but then that just disappears. And if you're not charging lots of money to keep that thing going, it dies a death, because one charity, or a few, can't keep it going. You need a huge number, and there's just no funding to get it to that point.
Ben: Yeah, you've proved the concept, but then you can't continue it. In any event, all you've got is an academic footnote of, hey, this thing probably works. End of story. Oh wait, we're doing it again. We're dipping. We need a positive.
Bex: Okay, and finally, we've got an AI that was developed to help people buy bread, but it actually helps spot cancer and loads of other stuff.
Greg: This is just amazing. I don't know if anybody's seen this.
Ben: I’ve seen this.
Greg: It's just so astounding. So this is one of them internet things that just appeared on Twitter, and there was an article about it, and it's like one of them myths where you're like, that can't be true, and it's actually true. Like the ending of Byker Grove, which I discovered the other day is just ridiculous. But we won't go into that now. So basically, this bakery in Japan knew that their customers wanted lots of different types of baked goods, and they didn't want to package them separately, because then they'd need preservatives, and apparently in Japan they don't like that kind of thing. They wanted people to know everything was really nice and fresh. But then there was the whole question of, where do we put the labels if there's no packaging? And also, we don't want people constantly touching them, things like that. So this guy raised the challenge and this developer started working on it, a few years ago now, and came up with a system where they could basically put the pastries in an area, the system would look at them, identify the different types, work out how much it all cost, and then just tell the cashier, right, this is how much they owe. So the cashier would just put the pastries wherever, have a conversation, and then go to the person, hey, this is how much you owe. It's like some kind of witchcraft. But then somebody looked at this and was like, oh, I wonder if this would work for cancer cells, because it's recognising these different shapes, so it could work really well on samples. Turns out, it works really well. And there's a whole bunch of other things they're going to use it for, all identified using this AI that was developed to identify baked goods. Amazing.
Bex: But aren't the best inventions always by mistake anyway? Like penicillin, X-rays. I heard champagne was a mistake. Maybe don't ever try and invent anything, maybe just see what happens [laughs]
Ben: We've all worked with, you know, visionary people and CEOs who would be like, hey guys, you know that thing we use for bread? It looks a bit like cancer cells. Can we just do something with that? And we would've all been like, oh my gosh.
Bex: Shut up, yeah [laughs] Oh my gosh.
Ben: [laughs] It's like, okay, well, maybe. The other alternative, which is horrifying, is that they never cracked it. They just snuck a scale in there, and an algorithm that randomly assigns different values roughly based on weight. I'm not going to consider that. I'm going to choose to believe that they cracked it algorithmically, and that pastries are very similar to developing cancer cells.
Bex: Awesome. Good news to end with. But that is all we have time for today. Thank you for listening. Stuart, how was that for you?
Stuart: It was fun.
Bex: Thanks for joining.
Stuart: No problem.
Bex: Where can people find you on the internet?
Stuart: Just on Twitter, at StooPearson. S, T, double O, Pearson. And obviously Citizens Advice Manchester dot co dot uk. But don't ask for me, because I don't give advice.
Bex: [laughs] Is there anything you want to plug or promote?
Stuart: No, no, apart from yourselves. I'd just say any charities who are looking for help should reach out, because for charities, digital is obviously such an important thing. Digital transformation. It would be good to reach out to trusted partners who've got your interests at heart.
Bex: Trusted partners. Good, good ending. Listeners. What did you think? We'd love to hear your thoughts. Get in touch on Twitter at Tech for good dot live or you can email us at hello at Tech for good dot live. We'd love it if you gave us a nice iTunes review and told your mates about this podcast. Thanks to Podcast.co for hosting us on their platform. Thanks to our editors. And also thanks to Greg for editing the outro as I was speaking because you know, I just read whatever is on the page. That's it. Thank you everyone.
Greg: Bye.
Stuart: Bye.
Ben: Bye.