Published: January 21, 2023
As we live in an increasingly internet-driven world, it should come as no surprise that online or cyberbullying is a problem of growing significance and impact, one that actually seems to have worsened during the pandemic. In 2022, nearly half of all teenagers experienced some form of online bullying or harassment. But what may be less well understood is that, beyond the mental and emotional impact, online harassment can also pose a risk to an individual’s personal data and financial security. Joining me to help explore this topic is Lisa Yeo, a professor with the University of California, Merced. We’ll talk about the work she has been doing to combat online bullying and harassment and how it fits into the larger discussion of security and privacy.
Interviewed this episode:
Lisa Yeo
University of California, Merced
Dr. Lisa Yeo is an Assistant Professor in the Ernest & Julio Management Program at UC Merced. She is a collaborative problem solver who helps decision makers understand their operational challenges and identify solutions; she helps organizations understand how to safely govern the data and information they need to compete.
Episode Transcript
Ashley Kilgore:
As we live in an increasingly internet-driven world, it should come as no surprise that online or cyberbullying is a problem of growing significance and impact, one that actually seems to have worsened during the pandemic. In 2022, nearly half of all teenagers experienced some form of online bullying or harassment. But what may be less well understood is that, beyond the mental and emotional impact, online harassment can also pose a risk to an individual’s personal data and financial security. Joining me to help explore this topic is Lisa Yeo, a professor with the University of California, Merced. We’ll talk about the work she’s been doing to combat online bullying and harassment and how this fits into the larger discussion of security and privacy. Lisa, thanks so much for joining me.
Lisa Yeo:
I’m really excited to be here. Thank you for asking.
Ashley Kilgore:
So Lisa, I’d love to start by having you share a little bit about yourself, specifically the work you’re doing, how you became interested in it and what you enjoy about it.
Lisa Yeo:
So, sure. I’m an assistant professor, of course, as you’ve already mentioned, and I typically work in the information security and privacy space. I really like to work in the area where we are building technology in a way that makes it secure and private, that protects privacy from the beginning instead of as an add-on. And that means potentially implementing safeguards, or not collecting certain data. But at the same time, we still have these systems, and they have an impact on people and people’s lives. So in my current work, one of the things I’m doing is collaborating with a colleague in Maryland and a graduate student here to look at reducing online harms. That is in the space of online harassment, and potentially also fraud, and really how we as end users, as opposed to companies, are interacting with these online spaces. And so I’m super excited.
Let’s see. You asked what got me interested in this specific area. So my personal touchpoint is really Gamergate. I’m not sure how many people are familiar with it, but it was essentially an online harassment campaign targeting women in the gaming industry. Specifically, there were three people: a journalist and media critic, Anita Sarkeesian, and two game developers, Zoë Quinn and Brianna Wu, who really took the brunt of this.
And for me, as a woman who’s worked in IT, it was really just this huge confluence of why women don’t work in IT. There’s this big concern that we’re underrepresented to begin with, and the way this harassment campaign exploded was very personally distressing, and it’s where I very much wanted to see what I could do. For my peer that I’m working with from Loyola University in Maryland, the touchpoint was when her daughter was being cyberbullied as a teenager.
And I remember her coming to me saying, “Is there anything we can do? How do we stop this?” And we were like, we’ve got to figure this out. So that was her starting point, and we’ve been exploring different avenues since then to really figure out the how.
Ashley Kilgore:
Wow, thank you for sharing that. So now, what type of behavior or activity do we categorize as online or cyberbullying?
Lisa Yeo:
I’m going to start from the bullying perspective and move on from there. The basic description of online bullying or cyberbullying is that it is bullying that takes place in online environments: on our phones, in text messaging, on social media, what have you. And a lot of it really does bleed into this idea of harassment. So I’m sending hateful messages. I’m posting things that are trying to embarrass you or harm you. A big thing that often happens is some form of cyberstalking, where I’m paying attention to where you’re going and traveling, intruding on you, and never leaving you alone in online spaces. Hate messages, threats to hurt you, messages saying you should hurt yourself.
And one of the things that leads to real trouble is this idea of doxing. That’s when there’s a release of personal information and documents about you: your address, your phone number, bank accounts, all the things that can lead to harm in the physical, real-world space too.
Ashley Kilgore:
So I think traditionally we might assume that bullying is something that happens among children and maybe young adults, but it sounds like that’s not necessarily the case, correct?
Lisa Yeo:
No. Well, it depends on how you want to phrase it. In the psychology literature, the way that bullying is defined in cyberbullying really is about children and young adolescents. And the historic interventions for how we address that, and for trying to teach children to be better social citizens, are different from what we’re seeing now. You and I might call it cyberbullying when it’s happening between us as adults, but it is clinically, technically different. And it is happening online, in these online spaces. Someone I talked with put it essentially this way: harassment can be targeted against anyone who doesn’t somehow fit whatever the social norm of a space is. If you want to go against the grain, you’re going to run into trouble. And that’s unfortunate, because we need everyone to be working on this.
Ashley Kilgore:
And now why can cyber bullying be so difficult to combat?
Lisa Yeo:
I’m going to take this again in two chunks. With cyberbullying, where the target is children and young adolescents, a lot of times it’s because it’s happening in spaces that we, the responsible adults, aren’t in. So we can’t see it happening, and we have to rely on our kids to tell us that it’s happening. It’s not like bullying in the physical world, where oftentimes you are in spaces where adults are at least peripherally involved and can sometimes notice what’s going on. So it can be really hard, and it might not even be people physically close to your children. It’s cyberspace. You could be playing games with somebody from, I don’t know, Germany, and you don’t really know who’s doing this. You’ll never see the in-person interactions. That’s one reason it’s hard to observe, but even when it is observable, it can be hard to figure out what to do about it or how to intervene.
And it can be hard for the people observing, the bystanders, to make decisions: “What should I do to help? Should I help?” Sometimes the bystanders actually make it worse; they pile on. So it’s complicated. There’s an additional challenge when we’re talking about online platforms taking action. There’s always this sense of, “they should stop this, right?”
“Facebook should stop this type of messaging; they should do something.” But where do you draw the line? What’s the difference between what, to me, might be harassment but, to you, is just a sarcastic message? And how should a particular platform or community draw that line in a way that doesn’t get them into legal trouble, into conversations about whether what they’re doing is censorship of free speech or removal of hate speech, when we have different jurisdictions? So it’s really challenging.
Ashley Kilgore:
It sounds like a complex topic to say the very least.
Lisa Yeo:
Yes, which is great because I’m in management of complex systems, so it’s wonderful. It’s what we do here.
Ashley Kilgore:
So now Lisa, what are some of the ways cyber bullying incidents can actually serve as potential security risks?
Lisa Yeo:
I think there are a few ways. One: with one of the activities of online harassment being this idea of doxing, releasing personal information, bank account information, and what have you, you have the potential for physical harm. So what’s gone on online can now spill over into somebody coming to my house and trying to, I don’t know, break in and beat me up or shoot me. We’ve seen cases of physical threats come that way. But there are also security incidents flowing from the ability to target your employees and potentially obtain some of their credentials, which would give an attacker potential access into your environment.
So this flags multiple things. But at the very least, you have employees who are extra stressed out, who are not going to be able to come to work fully engaged and fully comfortable, and that just leads to mistakes, whether they are cybersecurity and privacy mistakes or other mistakes. So personally, I just think we should care about people enough to want to do something about this, but there are also organizational benefits to making sure this isn’t taking place, and we certainly don’t want it taking place within our organization, employee to employee.
Ashley Kilgore:
So now Lisa, can you share some of the ways that analytics is being used or can be leveraged to help combat cyber bullying and the resulting security risks?
Lisa Yeo:
A lot of this is around machine learning: creating algorithms that can identify harassment in our data, typically. Whether that’s our posts or our images, we can do image processing to try to determine whether an image would be considered a non-consensual intimate photo. So we can essentially use machine learning and image processing to identify potential hate speech and potential harassment. There’s actually been quite a lot of work done in these spaces to train the machine learning and create the algorithms, and some of the platforms use algorithms to try to identify some of this. But there’s still a really big problem; we still have a lot to do. Machines don’t understand sarcasm. So if I’m having an interaction with a friend, and they know that I’m a super sarcastic person and I say something, they’ll understand, but the machine never will.
So we flag too much and, at the same time, we don’t flag enough, which means we still need human moderators to take a look. And it’s a terrible job to moderate this, because you’re looking at potential hate speech all the time and trying to make decisions. From my perspective, because some of my early work was related to online collaboration and crowdsourcing, there’s work being done to crowdsource some of that moderation work: to let us rely on the algorithms to flag potential problems and then let crowds make decisions or suggestions. Twitter has the Community Notes program, which started as a beta over the summer. It had a cooler name at the time, I thought: Canary, like the canary in the coal mine, totally linked to the bird. But it seems to just be Community Notes these days, where you can be asked, “This person flagged this. Do you think that it is?”
And that’s nice, because it’s bystander intervention. Does that help with analytics? I don’t know. One of our research projects is to look at ways the technology can be designed to encourage bystanders to intervene and do something, and also to try to optimize that point: how much content moderation do you want to have, and how much is too much, because it turns people off from participating? These are the sorts of things being explored right now. But anytime human behavior is involved, it’s really hard to just let the computer take care of it with our typical math and analytics approaches.
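[Editor’s note: as an illustration of the flag-then-review pipeline described above, here is a minimal sketch. The scoring function, thresholds, and quorum are all hypothetical stand-ins; nothing here reflects any real platform’s implementation.]

```python
# Hypothetical sketch: a machine-learning model flags borderline posts,
# then a crowd of bystander reviewers makes the final call.
# All names, thresholds, and the toy scorer are illustrative only.

FLAG_THRESHOLD = 0.7   # assumed cutoff for sending a post to reviewers
CROWD_QUORUM = 3       # assumed number of bystander votes needed

def toxicity_score(text: str) -> float:
    """Stand-in for a trained classifier; here a toy keyword heuristic."""
    hostile_terms = {"hate", "hurt yourself", "worthless"}
    hits = sum(term in text.lower() for term in hostile_terms)
    return min(1.0, hits / 2)

def moderate(text: str, crowd_votes: list[bool]) -> str:
    """Machine flags likely harassment; human reviewers decide."""
    score = toxicity_score(text)
    if score < FLAG_THRESHOLD:
        return "allow"              # model confident the post is fine
    if len(crowd_votes) < CROWD_QUORUM:
        return "pending review"     # wait for more bystander input
    # A majority of crowd reviewers decides, catching context
    # (like sarcasm) that the model cannot.
    return "remove" if sum(crowd_votes) > len(crowd_votes) / 2 else "allow"
```

The split mirrors the trade-off Yeo describes: the model alone both over-flags and under-flags, so human judgment sits at the end of the pipeline.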
Ashley Kilgore:
Lisa, thank you so much for joining me. Before we wrap up, is there anything else you’d like to share with our listeners? Maybe what you’re up to next?
Lisa Yeo:
Yeah, what’s up next. We want to expand this and look into how certain populations might be targeted for scams and fraud, and what that does. I’m thinking about how we teach, for example, our kids in school, K to 12; we have ways to teach them how to identify things online and be information literate, but we don’t necessarily do the same with our seniors. We don’t have the same spaces. So we’re looking at that. And one of the things that I’m really liking right now is that I’m exploring a new program in digital agriculture, and data governance in that space: how this data adds value, but also how it’s at risk.
Ashley Kilgore:
Oh wow. Digital agriculture, that sounds really interesting.
Lisa Yeo:
So people think of precision ag, and that’s usually the sensors. How we collect all this data is really the precision side. But then we have the data, and we have ecosystems of sensors, data, and analytics being done on it. And that entire ecosystem is what digital ag is.
Ashley Kilgore:
Want to learn more? Visit resoundinglyhuman.com for additional information on this week’s episode and guest. The podcast is also available for download or streaming from Apple Podcasts, Google Play, Stitcher, and Spotify. Wherever you listen, if you enjoy Resoundingly Human, please be sure to leave a review to help spread the word about the podcast. Until next time, I’m Ashley Kilgore and this is Resoundingly Human.
Want to learn more? Check out the additional resources and links listed below for more information about what was discussed in the episode.
References
Teens and Cyberbullying 2022, Pew Research Center
Cyberbullying and cybersecurity: how are they connected?, AT&T Business
Cyberbullying facts and statistics for 2018–2022, Comparitech