TFGL2021 - S4 - Where does the agency lie? - NUI Galway Special

Welcome to this episode of the Tech For Good Live podcast.

The show is produced and hosted by Kristiana Zunde. She is joined by Bex Rae-Evans. Bex is a Tech For Good Live podcast regular and she ran the module for the students at NUI Galway.

And they’re joined by Sheila Malone. Sheila is a lecturer in Marketing at NUI Galway. 


Transcript

Kristiana: Hello and welcome to another episode of the Tech For Good Live podcast. A podcast all about how we might use tech for positive social impact. This is a special episode featuring recordings from the Tech For Good module run by Bex and Jonny from our team at the National University of Ireland, Galway. In this episode, we will hear the students' thoughts on dealing with propaganda, misinformation, data and privacy. Joining me on this episode, we've got Bex Rae-Evans.

Bex:  Hello.

Kristiana: We've got Sheila Malone with us, who is a lecturer at the university. Hi Sheila.

Sheila: Hi Kristiana.

Kristiana: And then you've got me. I'm Kristiana. And I usually do the behind the scenes stuff. I do the production, but I'm here. So hello. 

Bex: Hello, thanks for joining us Kristiana and hello, Sheila. Thanks for joining us too. I guess I just wanted to start by giving a little bit of context to the course and the students that were on the course. So I'm going to hand that over to Sheila. Do you want to tell us what was this module that happened and who were the students that you're going to hear from today?

Sheila: Thank you Bex. So we run a summer school every year for our post-graduate marketing students across two programs: the marketing management program and the international marketing and entrepreneurship program. The summer school is about advancing the students’ applied skills, thinking about what they have learned in the previous two semesters and bringing all of that together in a more applied form. And the week we decided to do it, we were trying to think of something that would drive positive change and make something of a contribution to society, or at least challenge the students to think about what that might look like. It's not often an industry or a sector that they pay a lot of attention to. So this is where we came together and thought, well, perhaps a good theme for the week would be to look at the intersection between sustainability and technology from a marketing perspective, and to get the students to really question that and step outside their comfort zone and think about how they might produce something that would work in that marketplace. Prior to this, the students had undertaken a module, social marketing and sustainability, which opened their eyes to ideas around systems thinking and design thinking and working with organisations that are set up for positive change. So they had a little bit of insight into it, but not in terms of the depth that we would have carried out in the summer school. And that's where it came along that Bex and Jonny had the skills to look at tech for good and how the students might be challenged in terms of their understanding, their learnings and their knowledge, and how they might apply that in terms of a prototype. So I don't know if you want me to talk about how the school went, or do you want to introduce that, Bex?

Bex: Oh yeah, yeah. And how they found it, because I think it's quite interesting. As I work with groups of students, some of them naturally have an interest in social good. It tends to happen as a group, weirdly. I don't know if it's because of who they came in contact with previously or what, but it's interesting that it's not something they were naturally exploring themselves as a group of students. So yeah, I guess they had a little bit of a taster of it earlier on in the previous module that they did, but how did they find it? How did it go?

Sheila: I think initially when I pitched it to the students, there was a degree of concern and questions around how this fits with marketing. They couldn't see perhaps the links that were there, or at least the links I was trying to display to them, and how we might use marketing in a more positive light and not always see it as the bad beast in the corner that's just trying to fuel the capitalist system with greater levels of consumption and production. But to see that there are very positive returns that can be had when it's looked at in different sectors. And yeah, they started off with a lot of, I suppose, trepidation, unsure, not quite understanding how this might play out, and particularly that they would end up with a prototype product at the end of five days. That, I suppose, is quite a challenge. And definitely, you put the challenge to them, and for some students, I think they felt that they had limitations, perhaps without having a lot of work experience prior to doing their masters. And our students are from all around the world. They come with different skill sets and they would have different knowledge bases and different cultural understandings. And we saw, as the week progressed, where that became so fruitful, these cultural insights that they had. Just to name one that really stood out to me and has stayed in my mind since the summer school: the forgotten children. I don't know if this stayed with you, Bex, but a group of our students, the majority of them from China, wanted to highlight the issues around children who are left with elderly relatives in more rural destinations because their parents have to go to larger cities to work and send money back to those families. But those children end up with quite significant mental, emotional and social wellbeing issues. And the students wanted to address that concern.
They felt it was quite a large concern for society, and they didn't necessarily feel that the Chinese government had provided the support that they thought would help that generation. And it was amazing to see that they came up with a fantastic prototype product about what they felt was a forgotten generation: how they would market it, how they would consider it could be financed, how they would brand it, the service elements that ran through it, thinking about the customer journey and the customer touch points. They drew upon such an array of modules and insights that they had learned in semester one and semester two in order to produce something that was of value, not only to them as individuals and as students, but to the marketplace that they were considering, and drew upon the sensitivities that were required. And that displays and demonstrates a very high level of social and emotional intelligence. To have the ability to do that, and I suppose foresee where it might go in the future, was really rewarding for them. And for that group of students, they felt very proud of what they had produced. They were delighted that the opportunity had presented itself, but they had never made the natural link that their knowledge of marketing is what brought them to that point: their sensitivities around consumers and customer bases, how we use servicescapes and how we use technology to drive social change. That was quite interesting.

Bex: Yeah, I think that one stuck with me as well. I guess because it was something that I'd never known about or never thought about before. So yeah, the international nature of the students was really eye-opening. I learnt loads. And the resilience of the students! So I'm going to say it: the week was really intense, deliberately. Well, I felt like it was anyway. Every time we were planning a day, I was like, oh, Sheila, I think we've got too much in this day. And you were like, no, they can handle it. And they did, and it was brilliant. I’d love to see the outputs of it.

Sheila: And I think even for them as individuals, see, they started with a blank canvas, more or less, and a theme that they were going to work on over a period of four days, and every day was done so well that when the final day came and they produced these prototypes, there was a sense of, I can't believe I achieved that. I can't believe I did that. I can't believe that we as a group managed to make this work, because there were lots of challenges. They did all of this online, across a range of different time zones, a range of different countries, and they pulled it off and pulled it off very well. And in the initial days, I mean, yourself and Jonny did a great job of teaching them about tech for good, talking about what that looks like, looking at social problems that are in society, thinking of why they matter, why we need to pay attention to them. How can we address social issues? How can we use technology to help address them? How can we challenge biases and privileges that are out there? And think about it as a marketplace, those imbalances. We have so many areas that are often, I suppose, not considered as marketplaces, like disability and mobility, you know, the things that are sometimes set aside or marginalised, but they're coming to the front, they're coming to the fore and they're being addressed. And this challenge for the students, yes, it was seriously challenging for them and really stretched them beyond what they thought were limitations. It showed them that they're not limitations actually. And if you work hard and work well as a group and come together with a collective mind, with shared ideas, bringing in your personal strengths and your knowledge and everything you can bring to the table, you can produce something really pretty amazing. Even when they'd go through the ideation and developing it and designing it, every day brought a major challenge, which I think they overcame fantastically.
I couldn't believe actually how well they did at the end of the week. It was really, really well done. 

Bex: Yeah. I was so pleased. And I think, you know, there's a question about why we should teach this social thing. I think bringing in an element of social good should be part of every course ever. I just think it's really important to consider those sorts of things. And at the end of the module, I was like, is anyone going to work in social good? And everyone went, no, because they were just done with it by that point. But I'm hoping on reflection, some of them might end up working in the charity sector or the social good sector. And I guess also, even if they don't, you know, this is an awareness now that they’ll take into their day jobs. Even if they end up working at a big corporation that maybe isn't tech for social good, they'll take that learning about wider societal needs into that role. And that's just as valuable.

Sheila: I think as well, their eyes were opened too, when you put the challenge to them about the Ethical OS, looking at the eight priority areas and suggesting to them that one was missing: what might that missing one be, and how would you fill that gap? And when they started to think about it, the knowledge base is there and the understandings are there, but in order to, I suppose, draw upon that, they have to be challenged to do so. But that was quite interesting when they had to look at the fact that, yes, the environment is glaringly obvious. It's out there and it fits so well with today's society: climate change, zero carbon, the sustainable development goals. It's what we're about. It's trying to address those grand challenges in society and play our part, even if it's only a small part. And I think the students learned that. I think it was a takeaway for them that we can do very positive things. We just have to maybe sometimes flip the coin, think about it a bit differently and look at marketing as a vehicle to drive sustainability, or to drive positive interventions, technological interventions, and social change. And you know, the ways in which you did that, I thought, were also very interesting to them. Most of them had not created a podcast before. Most of them had not thought about creating, or how they'd go about creating, a podcast. And then when they had done it and pulled it all together, and some of them had put some music in, intro music and exit music, they got so into it. They got so into it honestly [laughs]

Bex: [laughs]  

Sheila: We'll have all these pirate podcast channels coming out.

Bex: They did. It was so good how much thought they put into it. On that note, let's hear from the students. So a little bit more background. As we mentioned, they created a podcast. It was the hand-in on the first day. So every day there was a hand-in that they had to produce, and the first day's was a podcast, which they had never done before. I think they had a couple of hours to pull it together. For the podcast, we got them to explore potential areas you might want to be cautious of when developing tech solutions. So this is the Ethical OS that Sheila was talking about, which was a tool that we used to explore those potential harms. It includes things like truth, disinformation and propaganda; addiction and the dopamine economy; economic inequalities, all that sort of thing. Weirdly, as we said, it's missing the environment, but we talked about that on the course as well. And we used Clubhouse as an example to frame the discussions. So the assignment was to talk about the potential harms that Clubhouse could have, using the eight potential harm areas that the Ethical OS lays out. And then they created a podcast from that. So Kristiana, how did you find going through the clips?

Kristiana: They were all really interesting to listen to. I mean, firstly, I have to say that creating just a five minute podcast is quite difficult. I've done it myself. Trying to get everything into five minutes, yeah, you really have to think about what goes in and what doesn't. But yeah, it was quite interesting. Obviously, there were quite a few of them. I think there were seven altogether, and because they all had the same topic, they were quite repetitive. But yeah, I picked the topics that I thought would be good talking points. I mean, I think on the Tech For Good Live podcast we talk about data and privacy a lot. Propaganda and misinformation we talk about a lot as well. But Clubhouse, that's kind of different, because I guess there's just a lack of privacy. You kind of don't really have it, I suppose. And yeah, so I just picked those two points to talk about because I found them the most interesting. And those were, I guess, also the main talking points in the podcasts as well.

Bex: Great. Let's hear from the first clip. So I believe this one is data and privacy.

Clip 1: Unlike other apps, Clubhouse apparently does not take into account privacy by design. It lacks basic privacy safeguards required by the EU’s General Data Protection Regulation. Let's take an example. If you want to invite friends to use the app, you have to give up your entire contact list. Now here's the funny bit. Clubhouse says ‘‘people may choose to optionally grant access to their phone contacts’’, but a user cannot even invite anyone to join a call unless the whole contact list is shared. I think that Clubhouse could easily rework the invitation system. Perhaps use login usernames or unique links like Zoom does. Typically most users of apps do not read the lengthy terms and conditions, which creates space for their data and privacy to become compromised, especially with regards to surveillance and the online sharing of personal data. If it is shared who is in certain chatrooms and what they are discussing, this information could be used in a manner that negatively impacts that user. For instance, by revealing political affiliation or other views and opinions belonging to that user. This information could be used by organisations or political parties to influence voting and elections through unethical marketing practices.

Bex: Interesting stuff. That's the clip. So I’m just going to challenge you there, Sheila. Do you know who that was? 

Sheila: I do. I know who the second speaker was. Was it Reshaun? 

Bex: I don't actually know.  We'll check. We'll tag everybody in.

Sheila: I'm sure it was Reshaun. The first, I’m not sure. 

Bex: We'll double check, but very well presented by both of them and really great points. And the sharing of contact lists, that's really good research. So I think they did have two hours to go and create the content for this, figure out how to record a podcast and record it and the fact that they've managed to find out that you have to share your contact list in order to sign up to Clubhouse, I thought was quite well researched.

Sheila: I think what was interesting is that for their generation, which has grown up with social media, as the second speaker pointed out: who reads the small print? Who goes through the details of the small print? And yet the ramifications or the consequences of sharing your entire contact list for the sake of inviting another person into a conversation can be quite significant. And the students picked up on that. They were saying that this isn't ethical. This isn’t good. This is not the way it should be, because there are data and privacy issues there that are questionable. And I think that, you know, we all do it, don't we? I mean, you download an app and you just press consent, and you don't really go through the ins and outs of it until the next thing, somebody reads something somewhere that says, do you realise you've just actually given full access to all of what you own on your phone, or something to that effect. So I think for the students to pick up on that and to start thinking about that as a real issue is important. It's a major issue where we're constantly faced with data issues and privacy issues. And GDPR tried to stamp out a lot of that and those activities going on, but I think they did well to show that Clubhouse, as he said, does not take into account privacy by design. Quite the opposite; it's almost a manipulation of people's privacy, if you wish to be a member of Clubhouse. It's the opposite way round, isn't it? And you know, they picked up on that quite well, I thought.

Bex: Yeah. And linking it as well to the political stuff that's been happening lately, with all the political ads. Linking to that was very thoughtful. It's terrible. Working in this sector [laughs] I talk about the general public, and I just mean people who don't talk about tech day in and day out like I do. I assume that people are a bit pants at privacy stuff. And by that I mean, everyone's still on Facebook, despite the fact that there are like hundreds of articles every year that say Facebook is stealing all your data and using it to sell you ads. I guess I just had this assumption that everyone just doesn't care about privacy. But actually people do care about privacy. It's just a big leap to come off some of these platforms. I forget that I'm still on WhatsApp, which is a Facebook product. And, you know, I'm judging all of these people that don't care about privacy, in my mind. Why would they? They don't know enough about it. They don't think about tech the same way that I do, which is unfair, because clearly, you know, these students or the general public don't think about tech for good every day like I do. But they very quickly put two and two together and went, this isn't right. Which was really heartening for me, to think that people are thinking about privacy in this way. It's just tough to remove yourself from the platform sometimes. Kristiana, anything to add? Anything else that you thought was interesting about these clips?

Kristiana: I mean, yeah, I guess, well, speaking of wanting to go off these social media apps and not being able to, because you kind of feel stuck on them, I suppose. I mean, I did actually take a break from all social media for a while. And I'm still not back on Twitter, actually. But yeah, it is hard, especially, you know, with things like Facebook and WhatsApp when those are kind of the only ways people get in touch with you. Like my mom's still on WhatsApp. A lot of my friends back in Latvia are still on there. So I'm still on there just because of them. But yeah, with Clubhouse, it's kind of like, I dunno. I mean, I was never actually on Clubhouse. That was the one I avoided, only because it was like, oh my God, not another social media platform. But yeah. It's hard, but it's good to see that, you know, people are actually paying attention to what's going on. And yeah, the terms and conditions. I mean, yeah, no one reads them, but maybe there is something apps should do, like make them shorter and, you know, write them in a more understandable way, so that people would actually read them and think about, okay, do I want to join this app?

Bex: Yeah, really good point Kristiana. It's funny, I've recorded two podcasts today. It's a two podcast day for me. And on the previous podcast, I was talking about this link between data and security. So we all default to: data has to be secure. Right? That makes sense.

Kristiana: Yeah. 

Bex: That’s okay. I'm okay with that. And we should do that; especially when it's personal data and it's really important data, we should think about privacy and keeping it secure. However, in the instance I was talking about earlier, the users were having to retell their traumatic story to multiple different agencies over and over and over again. And they were active users. The clients were actively saying, please share my story with all the other agencies, and the agencies were going, we can’t, we want to keep your data secure. And there's this butting up against wanting to keep data secure, but also, you know, that person doesn't want to retell that traumatic story. They're very happy for that data to be shared within this group of people. And actually, what was the point? I think what I'm trying to say is, we default to things sometimes, and actually the default should just be the right answer. So yeah, we can make terms and conditions as easy as possible to read, but you still probably wouldn't read them or understand them or care, so actually you should just not have any nefarious stuff in there. So this thing about sharing contact lists, we should just not have it, and then people won't be so frightened of data and signing up to things. And the instance I was talking about earlier should just default to: this data is incredibly secure and we'll always share it with the right people at the right time. And then everybody should feel safe about that data. But no one feels safe about data, because no one knows what they've given away anymore, because it's not clear.

Sheila: I think as well with Clubhouse, if you go back a bit, Oprah Winfrey and Demi Lovato were all kind of promoting and reinforcing Clubhouse and using it to have mad conversations. And I think that we do fall upon celebrities, and when they endorse products, we assume a degree of conformity or a security: if it's good enough for these people, it should be good enough for the rest of us. Surely they've looked into the issues around data privacy. Or, you know, it's almost like the responsibility lies elsewhere, because, even from a marketing point of view, influencers have such clout in that respect that we don't always question the noise around it or the information around it. We fall back to, well, if it's good enough for them and it works well, surely it's good enough for me. So perhaps there are issues there from a marketing point of view as well, about celebrity endorsements of these products, or social influencers promoting them and suggesting that, you know, they have a place to be, without considering the consequences of sharing all your data.

Bex: Such a good point. Bloody celebrities. Let’s blame them. I’m okay with that.

Sheila: [laughs] And Oprah Winfrey [laughs]

Bex: [laughs] Cause I think she's quite politically on another level, isn't she? I mean.

Bex: On another planet.

Sheila: [laughs]

Bex: Let’s move on to clip 2. I believe this is misinformation and propaganda, so I will play it now.

Clip 2: As opposed to other kinds of platforms, they don't verify users, so it becomes unreliable information at times. The sources might not be backed up, or you don't know who you're talking to, or what information backs up the content, so how do you know whether it's correct or not? It could be very influential, and it could be fake news, disinformation or propaganda. And as an invitation-only platform, self-selected audiences may lead to biased conversations, affecting how people process information and make decisions. There's also a high-risk potential for people to spread fake news by impersonating influential people in society, using bots and other technology. And finally, bad actors can use the platform to construct a convincing panel to spread negative activism. For example, anti-vaxxers may be holding discussions against the COVID-19 vaccination, spreading false claims and negative information about the vaccines. There is a huge potential for Clubhouse to be used by like-minded people to get together and spread their views to a wider population. As an invite-only audio chat platform, there is a possibility that Clubhouse could be targeted by hateful actors as an ideal way to disseminate their distorted themes to their followers in an effortless way, with an assurance by Clubhouse that it will be kept to a particular cohort.

Bex: There you go. That was a great segue you made there, Sheila, from celebrities and influencers into misinformation and propaganda spread by influencers on Clubhouse. I guess there were a lot of people in there. Any voices you recognise?

Sheila: I do, I do. Amanda in particular, and Sean. Look, again, it's fantastic to see that this is something that's on the students' radar: their interpretation of misinformation and disinformation, the role that it plays in society and how it's influencing people without them questioning it. I mean, one of the outcomes of the masters program is to be a more critical thinker; critical understanding, you know, that demonstration of a more questioning mind, is really important. And I think that if the students are able to look at something like Clubhouse, pull it apart, see the challenges that exist, understand what they mean for society and think about elements of marketing that are influencing this... Now, this is looking at marketing in a negative way, isn't it? Because it's suggesting that we have these celebrities out there, we have these endorsers, or we have, you know, social influencers that have a serious following, who are effectively in some respect spreading this kind of misinformation, or a form of propaganda, or disinformation and false information, and maybe their following and their receivers are believing it. And for society to try and take on that challenge, I suppose, and to think about how we might traverse it: how do we go around this? How do we think about it? For the students to highlight that and to see Clubhouse in that way, to me, is very important, because it's a demonstration of their ability to critique it and to not just take it at face value, but in fact, you know, totally deconstruct it, find its finer parts and then say, well, is this good or is it bad? It's the questioning and evaluating of what that might mean for society, and that's the whole point, isn't it, looking at these technologies and thinking about their impact on society.

Bex: Yeah, really great discussion there. I loved everything about that clip. Lots in it and lots to talk about, but I think the key thing for me is what you were saying: just hearing students talk about it in this way is great. I guess it sounds, from what you're saying as well, like we're super negative about stuff. And you have to be. That’s step one. So it was day one: what are the problems? You have to be honest and realistic about the problems, picking them apart. So we do a lot of future thinking in social tech, and the reason why we do that is because short-term and long-term effects and impact are really, really complex. I think it's hard to predict: if I do this thing now, it might help this certain set of people now, but how's that going to affect them in 5, 10, 15, 20 years' time? We don't always know, and it's not always possible to predict that, but thinking as far into the future as possible can be helpful. I think you have to start with being negative. You have to start by just tearing it apart and saying, what are all the potential things that can go wrong with this thing? And then you can start thinking of solutions. So the next step is more positive. The next step is, okay, well, you know, there are celebrities spreading misinformation. What do we do about it? Well, actually, you know, we've already started to see people solutionising around this. So on Instagram, influencers now have to say, this is an ad. And it doesn't fix everything. Of course it doesn't. But, you know, someone has sat down and gone, oh, we've got all these influencers selling stuff. No one knows what's real and what is an ad. Get them to announce, this is an ad. So somebody sat down and thought about that problem and came up with a solution to it. And it's not a perfect solution, but it's step one. So, you know, in my view, I don't think I'm inherently negative, but I love to pick stuff apart.
And I think that is exciting because that's step one of, okay, well now how do we fix this? And I think they've done a really good job of picking apart Clubhouse.

Sheila: I don't think a lot of the students knew of or were active members of Clubhouse. I think that their testimonials displayed a bit of, oh my gosh, this actually exists and people can just say what they want and nobody questions it. But of course, all social media is like that to a certain degree, isn't it? It's not just Clubhouse, but I suppose it's the background actions of Clubhouse that make it questionable for them. The fact that all of their contacts have to be accessible and they have to give up that information to just be a member on there. How elite is that? I mean, would you really want it that badly?

Bex: [laughs] Kristiana, I feel like you've had something to say there as well. 

Kristiana: Yeah. Actually, in the podcasts there were quite a few people who had never even heard of Clubhouse or didn't have access to it. But I was going to say, I actually really enjoyed putting this clip together, because people across the different podcasts had different things to say in terms of misinformation and propaganda and all that stuff. Yeah, I dunno. I think Clubhouse as an app just really worries me. I'm kind of glad I never really got on it. But I mean, you know, going back to the thing about celebrities and influencers, I guess, if people are going to see, I don't know, their favourite celebrity on Clubhouse... because they don't verify their users, you can't really tell, you know, whether they're real or not. But because it might be your favourite celebrity, you're just gonna instantly join and believe it's them and kind of follow what they have to say, even though, you know, it might not be true.

Bex: I think what was quite interesting about Clubhouse more generally, for me, was that people were very quick to criticise it immediately about all of these things that the students picked up on, and then, well, they didn't do anything about it. And then, even worse, Twitter created whatever the Twitter version of this is. They created like a Twitter Clubhouse. What's it called? And they didn't fix any of the problems.

Kristiana: I think it's like Twitter Spaces or something like that. Something about spaces or audio spaces or something like that. 

Bex: I dunno. I was trying to look for it but yeah, so they didn't learn from any of the criticism. They were just like, oh yeah, we'll do that too. [laughs] I've just been whingeing today on Twitter. I don't know if anybody, Sheila or Kristiana, you've come across this, but there's this idea in the tech for good space where we talk about tech as just a tool. It's a neutral tool, just like a spanner. It's not inherently evil or good. It's neutral. So there's this idea that tech's neutral, but there's a competing idea, that tech for good people prefer saying tech isn't neutral, and we should never say it's neutral because it's never neutral. It's not a spanner sat there. Somebody's created it for a purpose. And that purpose, you know, takes on the morals and bias of the people that have created it, which I completely agree with. But I think the articles saying tech isn't neutral, and I read a lot of articles about it, that's where this came from, were very passive, saying things like, oh, Facebook is built in this way. Facebook has certain biases. Facebook has incentives that make us do certain things. Clubhouse is not inherently private. But that takes the human out of it. Somebody designed Clubhouse. Somebody decided that privacy wasn't important on Clubhouse. Somebody, a human, a person made that decision and maybe someone above them directed them to do it. And maybe someone above them created the business model that allowed for all of these decisions to be made. But I just feel like this idea of tech isn't neutral has taken the human out of the action of, you know, a person made this decision [laughs] And I guess this is why I'm really passionate about talking about this a lot and training people in this, because it's a complex space and there's not always an easy answer. 
But the more people that know about it, the more they'll go away and they might be that designer or that marketer in that process. And you know what, maybe they can't really challenge it because maybe they don't have enough power, but it will cross their mind and they'll go, oh, maybe we shouldn't implement that feature. Maybe we shouldn't send out that email in that way. And I think that's the first step for stopping this.

Sheila: You really want to think of the social power that's there with social media. When people collectively come together, communities of people who are unhappy with something not going in the right way, there's quite a strong force behind that, if it's collectively driven. And what you were saying there, Bex, about, you know, technology being neutral, or where does the agency lie, I think is the question. Agency does lie, and that locus of control does lie, with the person who designed it, but it equally lies with the person who is using it. You make the active decision to support it if you engage with it, which I think, Kristiana, is your point about, you know, distancing yourself from social media, or however you might describe that. I don't know. Or what they call it, digital detox therapy. Agency lies in lots of spaces and I think it's dangerous to say technology alone is neutral, because technology is a vehicle by which something else happens. And we talk about it in terms of intervention. So, you know, for the summer school, you were talking about it in terms of positive social change, how it can facilitate something, but it only happens that way if somebody triggers it, or if somebody designs it in a way that it can be triggered to help implement greater positive social change or positive interventions or whatever that might be. But likewise, it has the opposite effect, and that's what the students are picking up on with the misinformation and disinformation and the fact that it can be a vehicle by which a lot of negative or kind of propaganda-style biases are platformed and pivoted. And it's almost like a gramophone, or whatever way you want to describe it, to, I suppose, share those views. What's important is that we educate people enough to know to question it, to not accept it as it is. And I think that became really apparent, didn't it, in the American elections. 
And the question mark around whether social media was kind of played with or toyed with in terms of, you know, voting and political agendas, and it's a whole other area, isn't it? But it was a really good example of the question mark around how much we should believe what is reality and what's out there, and false news and fake news or however it's described. And at the end of the day, it is a marketing product. It is a marketing tool that's used, but it has to be triggered. So agency is in there. It's in there both for the people who create it but also those who use it.

Bex: And on that note, that's a really good point. We all have agency. Why have you been listening to this podcast when you could've been doing something different with your time? You chose this, listeners. It's all we have time for. I think we're going to have to wrap up, but I just wanted to say thanks for joining us, Sheila. And thanks to all of the students who created the podcasts and were on the course. It was great. I had a great time teaching them and working with them. 

Sheila: And on behalf of all the students, we thank you as well, because it challenged all of us to think differently. So thank you.

Bex: Thanks. Is there anything you want to plug or do you want to talk about where people can find you on the internet or anything like that?

Sheila: Oh, I'm so not a tech person. There's the irony [laughs]

Bex: Are you on Clubhouse?

Sheila: No, you know, something, I have a lot to learn about Clubhouse, but I've learned from my students. It's not a space I want to be in [laughs]

Bex: [laughs]

Sheila: No, but as you're saying, you know, from an education point of view, from an academic point of view, it's about challenging our preconceptions, it's about putting uncomfortable positions and thinking processes in the face of our comfort zones and getting us to think outside of what we are told. And that's hugely important. It's hugely important that we develop future very inquisitive, very critical-thinking marketers. 

Bex: Awesome. And I think we've done that. Yes, we’ve succeeded. Kristiana, thank you so much for going through all the podcasts and choosing the clips. Very interesting stuff. And I hope you enjoyed listening to the podcasts while you were doing it.

Kristiana: I did. It was really fun. 

Bex: Good. Do you want to outro us Kristiana? 

Kristiana: Yeah. Sure. Listeners, what did you think? We would love it if you gave us a cheeky iTunes review and/or told your mates about the podcast. You can also get in touch with us on Twitter at Tech For Good Live, or email us at hello at tech for good dot live. We are run by volunteers, believe it or not, so if you can spare the price of a coffee, we will make sure that all the episodes are transcribed. You can donate at tech for good dot live slash donate, or if you want, you can give us a shout and sponsor us. And that is it. Thank you. 

Bex: Thanks. Bye.

Sheila: Thank you.
