TFGL2022 - S5 - Ep4 - The Virtual Property Market
Welcome to this episode of the Tech For Good Live podcast.
TFGL Team members Hera Hussain and Harriet Pugh are joined by our special guest Zoë Firth, Project Lead at Chayn
Transcript
Hera: Hello, it is February, so there's only 11 months until Christmas. I'm joking. We don't do that here. We'll be talking about data, how we make it anonymous, privacy, ethics and mental health, and the growing world of NFTs. All that and more coming right up. Joining me in today's exploration into everything tech, and everything bad and good, we have Harriet, who is here with us. She is an expert in NFTs. Don't believe it if she tells you otherwise. Her real job of working as a service designer is just a front. She's all about the NFTs. And me. I'm Hera Hussain and I'll be your host today. I've been away from the podcast for a whole year, but now that I'm hosting, I'm back for good. I'm back for good on Tech For Good Live to talk about tech for good live. See what I did there? We have a very special guest with us today. Zoe Firth is a mental health expert. She is British American, and her background is in psychology, working in acute mental health. She's recently completed a clinical research MSc. As Chayn's Bloom project lead, she writes, edits and facilitates the Bloom courses and manages the course delivery. She also speaks English and French, and we work together. So hello everybody. How are you doing today? Especially you, Zoe. We gave you a whole month to prepare for this, didn't we?
Zoe: You gave me a whole month of, what, 30 minutes? So amazing.
Hera: 30 seconds.
Zoe: 30 seconds. No, I am super excited to be here. Not at all nervous about living up to my introduction. But yeah, excited to talk about all of this stuff today.
Hera: And what about you Harriet? Our resident NFT expert.
Harriet: Yeah. Well, I mean, it's interesting. I'd love to know how many so-called NFT experts there are in the world, because I probably know as much as quite a large proportion of people in the world. A big fat zero. Just watching this thing unfold is fascinating, isn't it?
Hera: That's the thing. There are so many things that just started blowing up, and there's such hype and smoke and mirrors. In fact, my friend wrote a book called Smoke and Mirrors about hype in technology. It's crazy. But let's talk about, you know, what brings you to this pod. Is there a particular reason why we are talking tech for good right now in February, second month of the year? Is it anything to do with the doom and gloom that we usually surround ourselves with? Are we starting to come out of the fear, do you think? Are there rays of hope? Is our world relaxing more? Is there more good news, or no? Should we just continue our usual trend of tech for good talking about tech for bad? How are we feeling about that?
Zoe: Having this conversation is so important, right? Because technology can get conflated with good and bad all too easily. And I think opening up a discussion about what is good, what is bad, what do we think? What are the different viewpoints here? That's just important to keep people critical about what's happening.
Hera: I could not have said it better. I mean, Zoe, you've done a lot of work in ethical tech, thinking about applying science and blending the kind of ethics that you would bring to science with technology. So tell us, when you think about the phrase tech for good, or the kind of topics that we're going to talk about today, what kind of feelings are you bringing to this conversation?
Zoe: I definitely love that we bring this kind of optimism. Also, side note, I'm bringing a load of optimism to this conversation because it's my birthday next week. So when you said, how are you feeling at the moment? I was thinking, February is the best time of year. But yeah. I mean, I think Harriet said it so well in terms of the point of tech for good. I think there's a lot of pessimism and not knowing where to start, feeling lost amongst all the information that is out there. And there's a lot of freedom and optimism in at least being able to identify the problem. I'm really interested in what's coming up today because I also have a bit of a background in clinical linguistics. So talking about the use of language, in particular the way language is mined and, in this case, potentially weaponised in a technological context. Maybe that's not the most optimistic conversation, but I've also seen firsthand through Chayn the power of the virtual community, and of engaging in these online spaces in a way that we might not be able to in real life, especially in the last couple of years.
Hera: Well, that brings us really nicely to our stat of the week. I wanted to stick with one, but Harriet insisted that we go with two. So there we are. We have two stats of the week. The first one: did you know that the amount of funding going to all-women teams of tech startups in Europe increased by nearly 80%? Eight zero. 80% in 2021. And this is according to a report by Atomico, which is a venture capital firm based in London. The report is called State of European Tech 2021. What do we think about that? I thought it was pretty amazing, and that's a huge jump.
Zoe: That's incredible. And also I love how it makes it so apparent how false all the narratives are about how hard it is to, you know, fund all-female tech teams or development teams, or find female developers. It's just so inaccurate, and it's really encouraging to see numbers like that, which just prove how false all of that is.
Harriet: Oh, I'm a bit more cynical about this.
Hera: Go for it.
Harriet: My first question is, 80% of what? Because if it's 80% of a really tiny number, then that's still not very good. And digging into it a little bit more, overall, only 1% of funding is going to all-female companies, and 8.8% to mixed-gender teams. So I think the real story here is: why is 90% of funding still going to all-male tech companies?
Hera: That's a really good point. And it's interesting, right? Cause these are the kinds of things that catch headlines, the fact that it's grown by 80%, but the money is still not going to those teams, and they're still struggling to grow. I remember in my early days in tech startups, in both London and Scotland, it was just so male. You'd go to your startup competition and you'd see lots of women, but they were never the popular teams. You had the choice to go and join a team, and the teams that were led by women never had enough people. They never had enough developers. And having also worked as a non-coding person in tech startups, you start noticing so much sexism embedded not just in the infrastructure of these spaces, but in people's attitudes towards women's progression, their ideas, and what kind of roles they're good for. It's just laced with sexism. So, yeah, thank you for digging into that. Let's go into our second stat of the week. Here's one for Zoe. How many public health experts have called on Spotify to take action against Joe Rogan, a US-based podcaster, over his COVID-19 misinformation. Do you know?
Harriet: I don't know. Tell me, Hera, put me out of my misery.
Hera: Two hundred and seventy.
Harriet: Wow.
Hera: If anyone has missed why people are cancelling their Spotify subscriptions, this is why. And for those of you who don't know who Joe Rogan is: he is one of the most popular, and most extreme, podcasters in the US, with an immense audience. Do people here have any thoughts about the impact of that level of misinformation? I had just heard the name come up from afar in the last couple of weeks. I've actually not listened to him; I don't know much about him. Do you?
Zoe: I have to be fully honest, I haven't listened to a single episode of Joe Rogan. I wish I could report back on that experience. Or do I wish I could report back on that? I don't know. But he's making headlines every other week, and it's just so unfortunate. I shouldn't say unfortunate; it's very irresponsible of him to be spreading this misinformation. But what I love about that figure is I had no idea that many public health experts had spoken up. I'd heard a little bit about the headlines, music artists moving their catalogues off Spotify, which is great, and people cancelling subscriptions. It's just the power of the consumer. There are a lot of potentially pessimistic, or just not hopeful, ideas about the powerlessness of individual consumer action in changing patterns like this, which is very valid, but here we've actually seen, oh wow, these actions on every single level have potential. Can I be the optimist of this tech for good pod and say it can take effect in some meaningful way, and not be pessimistic that nothing's actually going to change? But yeah, we'll see.
Harriet: There's another aspect of this, which is that subscription-based services are so common across the industry, and people were talking about how the fact that there's no way to export your Spotify library means people are locked into their Spotify accounts. So, you know, if they're going to take an ethical stance and cancel their subscription, they're going to lose all of their playlists, or they're going to have to move them manually, and with thousands of songs, that's not going to happen. So I think there's a design element there as well, in terms of the ethics of companies when they build subscription-based models: how they allow or don't allow consumers to opt out without losing the value that they've created on the platform.
Zoe: And the total transition of the entire model of music consumption right now to these monopolies of music, so Spotify, Apple Music, that kind of thing, means that I've seen a lot of people also saying, you know, I'm going to transfer to another subscription service. I'm going to go to Amazon Music. And then we also start to question, well, can we say that that's an ethical transition if there are also implications of subscribing to this new service? While it may not have Joe Rogan's podcast, there are, I'm sure, myriad other ethical considerations for us to make. So ultimately, what really is the effect of that action?
Ad: And now for a little interlude. Just to remind you that this podcast is run entirely by volunteers, and we survive on sponsorships and donations. So if you've ever tuned in to one of our podcasts, please consider chipping in for the price of a brew. You can find the details at tech for good dot live forward slash donate. We'll use the money to pay for our subscriptions and to get our podcasts transcribed, to make sure that we're being accessible. Thank you to everyone who's already chucked us a fiver. You're the absolute best, and we appreciate you so much.
Hera: While we talk about subscriptions and the ethics of design, I'm going to move us on to our charity news of the week. Last week, Crisis Text Line was in the news because the publication Politico reported that the US-based suicide and crisis helpline service was selling anonymised data from its conversations with people in crisis to train a customer service chatbot run by its for-profit spin-off, Loris.ai. For those who don't know, Crisis Text Line is one of the largest organisations doing crisis counselling. Since its launch in 2013, it has exchanged more than 219 million messages across more than 6.7 million conversations over text, Facebook Messenger and WhatsApp. Most of their audience are young people in crisis, and this story for me was really important because Crisis Text Line has been an inspiration of mine since the time it launched. It is an organisation I've looked up to, from its founder to its CEO. I have tracked what they've done. I have always given their example when people ask me who I'm inspired by. I have been so invested in their growth, and I really bought into the fact that I trusted its leadership to take things like ethics and, you know, trauma-informed design frameworks into consideration. I was completely bowled over and upset. This is not the first time they've been in the news, either. A couple of months ago, they were in the news for a toxic culture in their workplace. It's just a really sad story for me. And we have an update from this week: there was a lot of outrage, and they have now announced that they will no longer be sending data to Loris.ai. But what do our panellists think about this?
Zoe: I think it is completely inappropriate to be taking data from people in this circumstance for this purpose. It's for what, improving chatbots? Effectively customer service platforms. So I think there's a whole load....
Hera: To make it more empathetic. That's the term that they were using.
Zoe: Yeah. There are a whole load of more suitable scenarios in which you might look at data from conversations, but somebody who's having a suicidal moment in their life, which is obviously a moment of crisis, is unlikely to be consenting to that conversation being recorded and used for this purpose. And in any case, is it appropriate for a customer service chatbot to be using suicidal conversations to shape the design of its conversation? Like, no. They're just completely different scenarios.
Harriet: That's such a good point, and I think it's even quite sinister that they're using this, or considering this data purchase or data transfer, because like you said, it's not for the same purpose; it's customer service versus a crisis text line. And the thing is, there is no dearth of digital corpora of human language via text lines. There's every single kind of consumer service text line. I'm sure they could purchase another type of text line data, maybe not a suicide hotline. So we have to ask ourselves, well, why? Like you were saying, why are they going for this particular dataset? What is it that they're trying to learn from it, given that there are so many other corpora they could access? I mean, I don't know the answer. I think it's kind of sinister because, as you say, it's not necessarily empathy which you'll be extracting from this dataset. Not because the people on the crisis line aren't empathetic, but because it's such a complex situation. So yeah, I just found that whole idea quite sinister.
Zoe: Can you imagine if they were transparent about this data use as well? The position that puts you in as a potential user of this service. Like, oh, I need to reach out because I'm in a really awful position, thinking about suicide, and then being asked, oh, do you mind if we use your data to inform this customer service platform? I would assume that a lot of people would absolutely not be comfortable with that in such a highly vulnerable moment. And so then what are you doing? You're asking people to consider whether or not they're going to use a service that they need, in that moment.
Hera: Well, I also found it really interesting because Crisis Text Line, like many charities, uses a lot of volunteers to do its work. I also founded a volunteer-run organisation. Their volunteers did not know this was happening, and they are the ones who have been counselling people. They were horrified. Of those who did know, one person in particular raised it and was then dismissed when they wanted to write an opinion paper on it. And they started a website, reform crisis text line dot com, which has a timeline of the change in ethical standards. In 2016, the Crisis Text Line website said: we will never share data for commercial use. Are you selling this? Nope. Heck no, not going to happen. Yuck. Gross. And then in 2017, a year later, they incorporated Loris.ai, became a shareholder in it, shared the same CEO for a while, and said: how can we fundraise in a way that helps us further our purpose of putting more empathy in the world? The answer is a for-profit subsidiary called Loris.ai. I think there are two stories to tell here. One is how notoriously hard people make it to raise money for organisations that are doing this really critical work. Obviously, that does not excuse this kind of commercial deal, but I think it's important that we recognise how hard it is. And secondly, how swift that change was. That's one year. One year for that change. And then when volunteers started raising their voices, they were dismissed, they were told they're not a good fit. It's just really, really troubling. This is an organisation that has had funding from every major donor and has been backed by Silicon Valley companies. It has such a huge reach, and it just makes me so worried that we are trusting the support around mental health services to tech companies and to nonprofits which have these advisory boards, shareholders and trustees who are supposed to be holding things to account, and that's just not happening. It makes me feel really scared for the other crisis text lines that are out there. Zoe, when you think about the stuff that has come up about Crisis Text Line, does it make you think differently about some of the decisions you've made while designing a mental health service?
Zoe: Oh, wow. That's a really big question. I guess it comes down to how much data you're collecting and what kind. Throughout the service that we've designed, it's very much minimal data collection. A lot of it, I mean, it's designed to be anonymous. That's at the core of everything we do at Chayn and with this project, Bloom. Not to make every single process sound perfect and wonderful, but that truly is core to the decisions that we've been making. And this is a huge aspect of this whole issue, as many people have pointed out: the potential de-anonymisation of some of the data. It's something we talk about a lot in a clinical setting, and in clinical research, that things which on the face of it may appear to be anonymised data aren't really, because anonymity is so much more than a name, phone number and email address. Identifying details can be everything from geography to diagnosis. If you know that someone has a diagnosis and they're in a particular area and they attend this particular school or church, or whatever the case may be, that's already a de-anonymised bit of data. So some of the pushback to the pushback has been, well, of course they're going to strip the data of all of these details, so what does it even matter in the end? But data can still be de-anonymised in those ways, and that's a safety issue. The personal side, I think, is another really important consideration, particularly when you're talking about person-centred care in a digital context. It's so much more personal than something that can be quantified in that sense. These are people's real lives and their real stories. And going back to what Harriet was saying, I'm just truly feeling so sick at the thought of someone at the worst moment in their life coming to this text line, or this crisis line, or whatever it is, and being asked for informed consent. I think that's the other aspect of design that's really massively important.
Hera: Exactly. Because how can you have informed consent when someone is in a crisis? Even if someone gives consent, are they truly giving consent? Are they in the best state to give consent?
Zoe: And, you know, even if they have the capacity to understand it, and there are many people who could comprehend it, the thing is the amount of information you would have to give them to really argue, yes, this is informed consent. Yes, they know not only what data is being used, but where it's going and what this whole customer service platform is. By that point, you've been talking to them for ages and they've hardly talked about their problems. How long would it take?
Hera: And terms of service are usually 20, 30 pages long.
Zoe: Yeah. And you shouldn't be asking people to make a choice between accessing a service that they need and taking part in this piece of research to improve customer service platforms.
Hera: Absolutely.
Zoe: Which is all very vague anyway. It's just not appropriate to be using that data in a commercial sense.
Hera: Yeah, exactly. This story really takes me to a dark place when I think about the promise that they had, how much it meant to me as a social entrepreneur, and the fall that I've seen them take. But it's also a good thing for me to think about as an entrepreneur: these things can happen to any organisation. It's so important to bake in your principles and have multiple accountability measures everywhere. They were so hyped, you know. They're the ones that released that research saying they had five emojis that could tell them if someone was in a crisis situation, and one of them was the pill emoji. I mean, okay. Great. And then when people started leaving the organisation, they started seeing how that was really oversimplified so that it was newsworthy, and there were lots of complaints from the data team, who did not agree that it was an accurate result from that sample. So, yeah. Let's take our minds off that darkness and do something that is not dark, but is shrouded in mystery. Let's talk about our tech news of the week: NFTs. NFTs are all the rage. Do you know what they are? I don't think I know what they are. I made a joke on Twitter saying I don't get them, and then somebody sent me an article explaining what they are. And I actually do get what they are. I just don't get the point of them. So, Harriet, you've been looking into the definition of NFTs. Tell us, what does it stand for?
Harriet: I have not been looking at the definition of NFTs.
Hera: You're our resident expert.
Harriet: My very basic understanding of this is that you're buying a digital receipt essentially.
Hera: NFT stands for non-fungible. Wait, what's the T?
Zoe: Token.
Harriet: Okay.
Zoe: Yeah. So yeah, basically, you can't transfer it, you can't replace it. But it's just blown up, the way cryptocurrency did, and things are popping up everywhere. In fact, someone asked me recently whether Chayn was considering creating its own NFTs, and then someone asked one of our team members if they could draw her picture and put it up as an NFT. And she was like, no.
Hera: Oh my gosh.
Harriet: An activist, someone did that to her. They just took her picture from Twitter and made an NFT out of it without her consent, and someone now owns the rights to her face. But she's not a celebrity. It's so strange. So I've been really enjoying looking at stories about this, and I thought we could just visit a few of them. The first story that we have for you is from Vice, and I think the headline says it all: This NFT on OpenSea Will Steal Your IP Address. It is, yeah, pretty mind-blowing, by Joseph Cox. You know, if there's a story by Joseph Cox, it's going to be great. He actually did it to himself. Basically, when you upload an NFT, because it's a digital file, people can embed code into it. And of course, when you buy it and you download that file, you're downloading that code onto your device as well. So this really clever hacker had uploaded artwork and embedded code in it which would track someone's IP address. And apparently it was a little-known fact. Just after this article came out and started buzzing, I saw the hacker space just blow up on Twitter, being like, oh my God, this is great, now we can totally hack people. Did you guys have any idea we could do this? And I was just like, wow. So I thought it was really fascinating to see. In the hacker space it was like a celebration, like Christmas come early. They were going crazy, running through the kind of code they could write to affect people's devices. And meanwhile people think they're just hopping onto this trend. So many people have FOMO and they're going to buy into NFTs just because everybody else is doing it. They don't wanna miss out on the next Bitcoin.
Zoe: That's the risk, isn't it, of getting involved in things that very few people properly understand, let alone most people. I guess in some ways it's like the internet; I don't understand how the internet works. But when something like this is so new and emerging and ill-defined, it's just got to ring alarm bells. Don't go spending enormous amounts of money on something that hasn't yet built up trust. There is clearly a risk in doing that. So, public service announcement: don't do it.
Hera: Yep, don't do it. You've heard it here. And I've got another story that will blow your mind. So you know how we have virtual reality worlds, like Minecraft, and so much stuff in crypto land as well. Well, just like that, people are buying pretend virtual properties in these games and metaverses that people are developing, but they're essentially unregulated. So that means anybody can just create one, make loads of money and then disappear. And that's exactly what's happened to some people, where one provider just pulled the plug and people lost hundreds of thousands of dollars in virtual property. I don't get it. Help me understand. Why would you buy virtual property?
Zoe: I'm lost for words. I mean, it's interesting, scrolling through the article, that companies like Nike and Adidas have been mentioned. The German sports company Adidas bought land in The Sandbox last year, describing its acquisition of a plot on the platform as a way of expressing our excitement about the possibilities it holds.
Hera: Give that land back to indigenous people. Save the Amazon. I don't understand. How is this a good use of money?
Zoe: But also, when you've got companies like this, basically jumping on the bandwagon and sort of extolling the virtues and the excitement of the potential of this new world, I dunno, you can see why people are getting drawn into it.
Hera: Yeah. And for anybody else who is interested in this, the article is called Would You Buy a Home in the Metaverse? and it is pretty mind-blowing.
Harriet: Isn't the NFT story, I mean all of it really, about the concept of what ownership is, and the way that we've digitised ownership, monetised it? What does it mean to own something? I don't have the answer to that question. I also don't know how the internet works. But yeah, like you were saying earlier, Zoe, there's that lack of comprehension there really. A thing you hear a lot with these stories is people saying, well, how is it different to something like gambling? If someone wants to spend a hundred thousand dollars, or whatever it is, on this digital receipt, this digital bit of land, why can't they? Why do we object to it and not to more traditional forms of spending? But it really does come down to comprehension. Do people know what they're signing up for? There's a lot of hype, a lot of money, massively high stakes. It's the perfect breeding ground for these catastrophes.
Hera: Yeah, I'm struggling. But that's a really good analogy, thinking about gambling. And I saw that this one artist I follow, an activist from Pakistan who is based in the UK, started this Women Rise NFT series where she is selling something like 10,000 pictures of women. Some of them are real; she's an illustrator, so there are drawings of women activists, and then others are just put together, like a woman in an astronaut suit, that kind of thing. And from the way that she talks about it, she wants to spread women and their images, and show how anybody can achieve anything with NFTs. But I don't understand, because she's done this illustration book of a hundred amazing women, and that became a whole thing, and she doesn't have illustrations like that in it. I just couldn't connect the NFT thing to the kind of books that she produces. And then I was also thinking about consent in the same way, because obviously, with the people that are in her book, I don't know if she got consent from them. They're public activists and figures, and most people don't get consent for that kind of stuff; they just do it because they're in the public eye. But in this case, someone is buying the copyright to your image. It just felt weird. It actually reminded me of that really amazing essay that that model wrote.
Harriet: Emily Ratajkowski.
Hera: Yes. About the right to use your own image. It just reminded me of the arguments in that. So, yeah. Okay, well, it's time for us to go into our rant or nice of the week. I actually don't have a rant, I think we've done that bit. I have a nice of the week, which is that Google has been sued for its alleged use of deceptive designs. Attorneys general in Washington, DC, Texas and at least two other US states have brought lawsuits against the tech giant for repeatedly pressuring users into handing over their location data through unfair and deceptive practices. We found this out through an article in Gizmodo. So I thought that was quite nice. These things take a long time, but I think it's just good, cause they slap lawsuits on other people all the time. I think it's good that public authorities are doing the same.
Harriet: Yeah. There's also just the wider systemic issue of the law and tech. You know, at what point are we going to get a handle on new technology? And I think that's something people feel a lot of doom over now, because our basic rights are protected in law, right? And if the law can't keep up with tech, then where's your protection? I think a lot of people do look to the government, the EU, whoever, to protect that. I mean, whichever legal entity is relevant in your area. But yeah, it's a good sign. I agree.
Hera: Exactly. And finally, I have a sad, or happy-sad, story to end our podcast on, which is that Wordle, the game that Fay from the Tech for Good Live team and I are obsessed with, has been sold to the New York Times. I mean, the New York Times is pretty famous for its games and puzzles, but it was just kind of a letdown that something independent, started by one guy, is now owned by this big company. So that's where we're going to end, and that's all we have time for. So thank you so much, Harriet and Zoe. How was this episode for you? Do you know what NFTs are now?
Zoe: I don't think anyone does. I don't think anyone can say they understand this phenomenon.
Harriet: I want to know why all of them are monkeys. We failed to address that in our conversation today, so I'll have to look into that.
Hera: That might be a first for Tech for Good Live. How was this for you? Is there anything you'd like to plug from your work that you'd want the listeners of Tech for Good Live to know about?
Zoe: Yeah, I really enjoyed it. Learning and talking about NFTs, that was really fascinating. I would just plug, I mean, us: Bloom by Chayn. You can check us out. We're a remote trauma support service. You can find us at bloom dot chayn dot co. And yeah, thank you for having me.
Hera: Thank you for coming. And listeners, what did you think? We'd love to hear your thoughts. Get in touch with us on Twitter at tech for good live, or email hello at tech for good dot live. We'd love it if you'd give us a nice iTunes review and tell your pals about this podcast. We'd love to get more reviews. Thanks to Podcast.co for hosting our podcast, and thanks to all of our many awesome volunteers, who you can find on our team page at tech for good dot live.