Podcast Episode: Finding the Joy in Digital Security
Jul. 16th, 2025 07:05
Many people approach digital security training with furrowed brows, as an obstacle to overcome. But what if learning to keep your tech safe and secure was consistently playful and fun? People react better to learning, and retain more knowledge, when they're having a good time. It doesn’t mean the topic isn’t serious – it’s just about intentionally approaching a serious topic with joy.

(You can also find this episode on the Internet Archive and on YouTube.)
That’s how Helen Andromedon approaches her work as a digital security trainer in East Africa. She teaches human rights defenders how to protect themselves online, creating open and welcoming spaces for activists, journalists, and others at risk to ask hard questions and learn how to protect themselves against online threats. She joins EFF’s Cindy Cohn and Jason Kelley to discuss making digital security less complicated, more relevant, and more joyful to real users, and encouraging all women and girls to take online safety into their own hands so that they can feel fully present and invested in the digital world.
In this episode you’ll learn about:
- How the Trump Administration’s shuttering of the United States Agency for International Development (USAID) has led to funding cuts for digital security programs in Africa and around the world, and why she’s still optimistic about the work
- The importance of helping women feel safe and confident about using online platforms to create positive change in their communities and countries
- Cultivating a mentorship model in digital security training and other training environments
- Why diverse input creates training models that are accessible to a wider audience
- How one size never fits all in digital security solutions, and how Dungeons & Dragons offers lessons to help people retain what they learn
Helen Andromedon – a moniker she uses to protect her own security – is a digital security trainer in East Africa who helps human rights defenders learn how to protect themselves and their data online and on their devices. She played a key role in developing the Safe Sisters project, which is a digital security training program for women. She’s also a UX researcher and educator who has worked as a consultant for many organizations across Africa, including the Association for Progressive Communications and the African Women’s Development Fund.
Resources:
- The Guardian: “Internet shutdowns at record high in Africa as access ‘weaponised’” (March 9, 2025)
- Atlantic Council AfricaSource: “Effective cybersecurity in Africa must start with the basics” (Oct. 7, 2024)
- Just Security: “The Next Step for USAID’s New Digital Policy: Account for Conflict Risks and Include Peacebuilding” (Sept. 27, 2024)
- Collaboration on International ICT Policy for East and Southern Africa: “How African States Are Undermining the Use of Encryption” (Oct. 21, 2021)
- Security Education Companion
What do you think of “How to Fix the Internet?” Share your feedback here.
Transcript
HELEN ANDROMEDON: I'll say it bluntly. Learning should be fun. Even if I'm learning about your tool, maybe you design a tutorial that is fun for me to read through, to look at. It seems like that helps with knowledge retention.
I've seen people responding to activities and trainings that are playful. And yet we are working on a serious issue. You know, we are developing an advocacy campaign, it's a serious issue, but we are also having fun.
CINDY COHN: That's Helen Andromedon talking about the importance of joy and play in all things, but especially when it comes to digital security training. I'm Cindy Cohn, the executive director of the Electronic Frontier Foundation.
JASON KELLEY: And I'm Jason Kelley, EFF's activism director. This is our podcast, How to Fix the Internet.
CINDY COHN: This show is all about envisioning a better digital world for everyone. Here at EFF, we often specialize in thinking about worst case scenarios and of course, jumping in to help when bad things happen. But the conversations we have here are an opportunity to envision the better world we can build if we start to get things right online.
JASON KELLEY: Our guest today is someone who takes a very active role in helping people take control of their digital lives and experiences.
CINDY COHN: Helen Andromedon - that's a pseudonym by the way, and a great one at that – is a digital security trainer in East Africa. She trains human rights defenders in how to protect themselves digitally. She's also a UX researcher and educator, and she's worked as a consultant for many organizations across Africa, including the Association for Progressive Communications and the African Women's Development Fund.
She also played a key role in developing the Safe Sisters project, which is a digital security training, especially designed for women. Welcome Helen. Thank you so much for joining us.
HELEN ANDROMEDON: Thanks for having me. I've been a huge fan of the tools that came out of EFF and working with Ford Foundation. So yeah, it's such a blast to be here.
CINDY COHN: Wonderful. So we're in a time when a lot of people around the world are thinking more seriously than ever about how to protect their privacy and security. And that's, you know, from companies, but increasingly from governments and many, many other potential bad actors.
You know, there's no one size fits all training, as we know. And the process of determining what you need to protect and from whom you need to protect it is different for everybody. But we're particularly excited to talk to you, Helen, because you know that's what you've been doing for a very long time. And we want to hear how you think about, you know, how to make the resources available to people and make sure that the trainings really fit them. So can you start by explaining what the Safe Sisters project is?
HELEN ANDROMEDON: It's a program that came out of a collaboration amongst friends, but friends who were also working in different organizations and also doing trainings. In the past, what would happen would be, we would send out an application: Hey, there's a training going on. But there was a different number of women that would actually apply to this fellowship.
It would always be very unequal. So what we decided to do – really kind of like experimenting – is say: what if we do a training but only invite women and people who are activists, people who are journalists, people who are really high risk, and give them a space to ask those hard questions? Because there are so many different things that come out of suffering online harassment and going through that in your life. When you need to share it, sometimes you do need a space where you don't feel judged, where you can kind of feel free to engage in really, really traumatic topics. So this fellowship was created, it had this unique percentage of people that would apply, and we started in East Africa.
I think now, because of what has happened in the last, I guess, three months, it has halted our ability to run the program in as many regions as need it. Um, but Safe Sister, I think what I see, it is a tech community of people who are able to train others or help others solve a problem.
What problems do I mean? So, for example: I, I think I left my, my phone in the taxi. So what do I do? Um, how do I find my phone? What happens to all my data? Or maybe it could be a case of online harassment where there's some sort of revenge from the other side, from the perpetrator, trying to make the life of the victim really, really difficult at the moment.
So we needed people to be able to have solutions available to talk about and not just say, okay, you are a victim of harassment. What should I do? There's nothing to do, just go offline. No, we need to respond, but many of us don't have the background in ICT, uh, for example, in my region. I think that it is possible now to get a, a good background in IT or ICT related courses, um, up to, um, you know, up to PhD level even.
But sometimes I've, in working with Safe Sister, I've noticed that even such people might not be aware of the dangers that they are facing. Even when they know OPSEC and they're very good at it. They might not necessarily understand the risks. So we decided to keep working on the content each year, every time we can run the program, work on the content: what are the issues, currently, that people are facing? How can we address them through an educational fellowship, which is very, very heavy on mentorship. So mentorship is also a thing that we put a lot of stress on because again, we know that people don't necessarily have the time to take a course or maybe learn about encryption, but they are interested in it. So we want to be able to serve all the different communities and the different threat models that we are seeing.
CINDY COHN: I think that's really great and I wanna, um, drill in on a couple of things. First thing you said: ICT – Information and Communications Technologies. Um, but what I think is really interesting about your approach is the way the fellowship works. You know, you're kind of each one teach one, right?
You're bringing in different people from communities. And, you know, for most of us, I think, as a model, finding a trusted person who can give you good information is a lot easier than going online and finding information all by yourself. So by kind of seeding these different communities with people who've had your advanced training, you're really able to grow who gets the information. Is that part of the strategy?
HELEN ANDROMEDON: It's kind of like two ways. So there is the way where we, we want people to have the information, but also we want people to have the correct information.
Because there is so much available, you can just type into your search bar, you know, "is this VPN trusted?" And maybe you'll find a result that isn't necessarily the best one.
We want people to be able to find the resources that are guaranteed by, you know, EFF or by an organization that really cares about digital rights.
CINDY COHN: I mean, that is one of the problems of the current internet. When I started out in the nineties, there just wasn't information. And now really the role of organizations like yours is sifting through the misinformation, the disinformation, just the bad information, to really lift up things that are more trustworthy. It sounds like that's a lot of what you're doing.
HELEN ANDROMEDON: Yeah, absolutely. How I think it's going, I think you, I mean, you mentioned that it's kind of this cascading wave of, you know, knowledge, you know, trickling down into the communities. I do hope that's where it's heading.
I do see people reaching out to me who have been at Safe Sisters, um, asking me, yo Helen, which training should I do? You know, I need content for this. And you can see that they're actively engaging still, even though they went through the fellowship like say four years ago. So that I think is like evidence that maybe it's kind of sustainable, yeah.
CINDY COHN: Yeah. I think so. I wanted to drill down on one other thing you said, which is, of course, what I think of as the funding cuts, right – the Trump administration cutting off money for a lot of the programs like Safe Sisters around the world. And I know there are other countries in Europe that are also cutting support for these kinds of programs.
Is that what you mean in terms of what's happened in the last few months?
HELEN ANDROMEDON: Yeah. Um, it's really turned around what our expectations for the next couple of years say, yeah, it's really done so, but also there's an opportunity for growth to recreate how, you know, what kind of proposals to develop. It's, yeah, it's always, you know, these things. Sometimes it's always just a way to change.
CINDY COHN: I wanna ask one more question. I really will let Jason ask some at some point, but, um, so what does the world look like if we get it right? Like if your work is successful, and more broadly, the internet is really supporting these kind of communities right now, what does it look like for the kind of women and human rights activists who you work with?
HELEN ANDROMEDON: I think that most of them would feel more confident to use those platforms for their work. So that gives it an extra boost because then they can be creative about their actions. Maybe it's something, maybe they want, you know, uh, they are, they are demonstrating against, uh, an illegal and inhumane act that has passed through parliament.
So online platforms. If they could, if it could be our right and if we could feel like the way we feel, you know, in the real world. So there's a virtual and a real world, you're walking on the road and you know you can touch things.
If we felt ownership of our online spaces, so that you feel confident to create something that maybe can change things. So in that ideal world, it would be that women can use online spaces to really, really boost change in their communities, and have others do so as well, because you can teach others and you inspire others to do so. So it, like, pops up everywhere and really makes things go and change.
I think also for my context, because I've worked with people in very repressive regimes where it is, the internet can be taken away from you. So it's things like the shutdowns, it's just ripped away from you. Uh, you can no longer search, oh, I have this, you know, funny thing on my dog. What should I do? Can I search for the information? Oh, you don't have the internet. What? It's taken away from you. So if we could have a way where the infrastructure of the internet was no longer something that was, like, in the hands of just a few people, then I think – So there's a way to do that, which I've recently learned from speaking to people who work on these things. It's maybe a way of connecting to the internet to go on the main highway, which doesn't require the government, um, the roadblocks and maybe it could be a kind of technology that we could use that could make that possible. So there is a way, and in that ideal world, it would be that, so that you can always find out, uh, what that color is and find out very important things for your life. Because the internet is for that, it's for information.
Online harassment – that one, I, I, yeah, I really would love to see the end of that. Um, just because – so, also acknowledging that it's something that has shown us, as human beings, something that we do, which is not being very kind to others. So it's a difficult thing. What I would like to see is that in this future, we have researched it, we have very good data, we know how to avoid it completely. And then we also draw the parameters, so that when something happens to you that doesn't make you feel good, which is like somebody harassing you, that also you are heard. Because in some contexts, uh, even when you go to report to the police and you say, look, this happened to me, sometimes they don't take it seriously. But because of what happens to you after, and the trauma – yes, it is important. It is important and we need to recognize that. So it would be a world where you can see it, you can stop it.
CINDY COHN: I hear you and what I hear is that, that the internet should be a place where it's, you know, always available, and not subject to the whims of the government or the companies. There's technologies that can help do that, but we need to make them better and more widely available. That speaking out online is something you can do. And organizing online is something you can do. Um, but also that you have real accountability for harassment that might come as a response. And that could be, you know, technically protecting people, but also I think that sounds more like a policy and legal thing where you actually have resources to fight back if somebody, you know, misuses technology to try to harass you.
HELEN ANDROMEDON: Yeah, absolutely. Because right now the cases get to a point where it seems like depending on the whim of the person in charge, maybe if they go to, to report it, the case can just be dropped or it's not taken seriously. And then people do harm to themselves also, which is on, like, the extreme end and which is something that's really not, uh, nice to happen and should, it shouldn't happen.
CINDY COHN: It shouldn't happen, and I think it is something that disproportionately affects women who are online or marginalized people. Your vision of an internet where people can freely gather together and organize and speak is actually available to a lot of people around the world, but, but some people really don't experience that without tremendous blowback.
And that's, um, you know, that's some of the space that we really need to clear out so that it's a safe space to organize and make your voice heard for everybody, not just, you know, a few people who are already in power or have the, you know, the technical ability to protect themselves.
JASON KELLEY: We really want to, I think, help talk to the people who listen to this podcast, who really understand and are building a better future and a better internet. You know, what kind of things have you seen when you train people? What are you thinking about when you're building these resources and these curriculums? What things come up, like, over and over that maybe people who aren't as familiar with the problems you've seen or the issues you've experienced wouldn't expect?
HELEN ANDROMEDON: Yeah, I mean, hmm, maybe there could be a couple of reasons. What would be my view is, the thing that comes up in trainings is, of course, you know, hesitation. There's this new thing and I'm supposed to download it. What is it going to do to my laptop?
My God, I share this laptop. What is it going to do? Now they tell me, do this, do this in 30 minutes, and then we have to break for lunch. So that's not enough time to actually learn, because then you have to practice – or you could throw in a practice session – but then you leave this person, and that person, as is normal, will forget. Very normal. It happens. So the issue sometimes is that kind of, like, hesitation to play with the tech toys. And I think that it's good, because we are cautious and we want to protect this device that was really expensive to get. Maybe it's borrowed, maybe it's secondhand.
I won't get into, you know, like so many things that come up in our day to day because of the cost of things.
JASON KELLEY: You mentioned like what do you do when you leave your phone in a taxi? And I'll say that, you know, a few days ago I couldn't find my phone after I went somewhere and I completely freaked out. I know what I'm doing usually, but I was like, okay, how do I turn this thing off?
And I'm wondering, like, that taxi scenario – is that a common one? Are there, you know, others that people experience there? I, I know you mentioned, you know, internet shutoffs, which happen far too frequently, but a lot of people probably aren't familiar with them. Is that a common scenario you have to figure out what to do about? Like, what are the things that pop up occasionally that people listening to this might not be as aware of?
HELEN ANDROMEDON: So losing a device or a device malfunctioning is, like, the top one, and internet shutdowns are down here, because they're periodic. Usually it's when there's an election cycle, that's when it happens. After that, you know, you sometimes have access almost a hundred percent back. So I think I would put losing a device, destroying a device.
Okay, now what do I do, for the case of the taxi? The phone in the taxi. First of all, the taxi is probably crowded. So you think that phone will most likely not be returned.
So maybe there's intimate photos. You know, there's a lot, there's a lot that, you know, can be there. So then if this person doesn't have a great password – which is usually the case, because when you buy a device there isn't so much emphasis on, hey, take time to make a strong password. Now it's better; now, obviously, there are better products available that teach you about device security as you are setting up the phone. But usually you buy it, you switch it on, so you don't really have the knowledge: this is a better password than that. Or don't forget to put a password, for example.
So that person responding to that case would be now asking if they had maybe the find my device app, if we could use that, if that could work. Like, as you were saying, there's a possibility that it might, uh, ping in another place and be noticed and for sure taken away. So it has to be kind of a backwards learning journey, to say, let's start from ground zero.
JASON KELLEY: Let's take a quick moment to say thank you to our sponsor. How to Fix the Internet is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology, enriching people's lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.
We also wanna thank EFF members and donors. You are the reason we exist.
You can become a member for just $25 and for a little more, you can get some great, very stylish gear. The more members we have, the more power we have in state houses, courthouses and on the streets.
EFF has been fighting for digital rights for decades, and that fight is bigger than ever. So please, if you like what we do, go to eff.org/pod to donate.
We also wanted to share that our friend Cory Doctorow has a new podcast. Listen to this. [Who Broke the Internet trailer]
And now back to our conversation with Helen Andromedon.
CINDY COHN: So how do you find the people who come and do the trainings? How do you identify people who would be good fellows or who need to come in to do the training? Because I think that's its own problem, especially, you know, the Safe Sisters is very spread out among multiple countries.
HELEN ANDROMEDON: Right now it has been a combination of partners saying, Hey, we have an idea, and then seeing where the issues are.
As you know, a fellowship needs resources. So if there is an interest because of the methodology – let's say it's a partner in Madagascar who is working on digital rights. They would like to make sure that their community, maybe staff and maybe people that they've given sub-grants to – so that entire community – they want to make sure that it is safe, they can communicate safely, nothing, you know, is leaked out, they can work well. And they're looking for: how do we do this? We need trainers, we need content, we need somebody who also understands learning, separate from the resources. So I think that the Safe Sister Fellowship also is something that, because you can pick it up and you can design it in whatever context you have.
I think that has made it, like, stronger. You take it, you make it your own. So it has happened like that: a partner has an interest, we have the methodology, we have the trainers, and then we have the tools as well. And then that's how it happens.
CINDY COHN: What I'm hearing here is that, you know, there's already a pretty strong network of partners across Africa and the communities you serve. There's groups – and, you know, we know this from EFF, 'cause we hear from them as well – that there are actually a pretty well developed set of groups doing digital activism, and human rights defenders using technology, already across, uh, Africa and the rest of the communities. And that you have this network, and you are the go-to people, uh, when people in the network realize they need a higher level of security thinking and training than they had. Does that sound right?
HELEN ANDROMEDON: Yeah, that sounds right. A higher level of being aware. And usually it comes down to: how do we keep this information safe? Because we are having incidents. Yeah.
CINDY COHN: Do you have an incident that you could explain?
HELEN ANDROMEDON: Oh, um, queer communities – say, an incident of an executive director being kidnapped. And it was, we think, probably got to do with how influential they were and what kind of message they were sending. So it, it's apparent. And then, shortly after that incident, there's a break-in into the office space. Now that one is actually quite common, um, especially in the civic space. So in that case, they were storing maybe case files, um, everything was in hard copy. All the information was there – receipts, checks, um, payment details. That is very, very tragic in that case.
So in that case, what we did, because this incident had happened in multiple places, we decided to run a program for all the staff that was, um, involved in their day to day. So we could do it like that and make sure that, as a response to what happened, everybody gets some education. We have some quizzes, we have some tests, we have some community. We keep engaged, and maybe that would help. And yeah, they'll be more prepared in case it happens again.
CINDY COHN: Oh yeah. And this is such an old, old issue. You know, when we were doing the encryption fight in the nineties, we had stories of people in El Salvador and Guatemala where the office gets raided and the information gets in the hands of the government, whoever the opposition is, and then other people start disappearing and getting targeted too, because their identities are revealed in the information that gets seized. And that sounds like the very same pattern that you're still seeing.
HELEN ANDROMEDON: Yeah, there's a lot to consider for that case. Uh, cloud saving – um, we have to see if there's somebody who can host their server. It's very, yeah, it's, it's interesting for that case.
CINDY COHN: Yeah. I think it's an ongoing issue and there are better tools than we had in the nineties, but people need to know about them and, and actually using them is not, it's not easy. It's, you, you have to actually think about it.
HELEN ANDROMEDON: Yeah, I, I don't know. I've seen a model that works – so if it's a tool, it's great, it's working well. I've seen it, uh, with, I think, the Tor Project, because the Tor Project has user communities. What it appears to be doing is engaging people with training, so doing safety trainings, and then they get value from using your tool, because they get to have all this information, not only about your tool, but about safety. So that's a good model to build user communities and then get your tool used. I think this is also a problem.
CINDY COHN: Yeah. I mean, this is a, another traditional problem is that the trainers will come in and they'll do a training, but then nobody really is trained well enough to continue to use the tool.
And I see you, you know, building networks and building community and also having, you know, enough time for people to get familiar with and use these tools so that they won't just drop it after the training's over. It sounds like you're really thinking hard about that.
HELEN ANDROMEDON: Yeah. Um, yeah, I think that we have many opportunities and because the learning is so difficult to cultivate and we don't have the resources to make it long term. Um, so yes, you do risk having all the information forgotten. Yes.
JASON KELLEY: I wanna just quickly emphasize some of the scenarios, Cindy, you've talked about, and Helen, you just mentioned: potential break-ins, harassment, kidnapping. It's really awful, but I think this is one of the things that makes this kind of training so necessary. I know that this seems obvious to many people listening, and to the folks here, but it just needs to be emphasized that these are serious issues. And that's why you can't make a one-size-fits-all training, because these are real problems that, you know, someone might not have to deal with in one country and they might have a regular problem with in another. Is there a kind of difference that you can just clarify about how you would train, for example, groups of women that are experiencing one thing when they, you know, need digital security advice or help, versus, let's say, human rights defenders? Is the training completely different when you do that, or is it really kind of emphasizing the same things, like protecting your privacy, protecting your data, using certain tools, things like that?
HELEN ANDROMEDON: Yeah. Jason, let me, let me first respond to your first comment about the tools. So one size fits all, obviously, is wrong. Maybe get more people of diversity working on that tool and they'll give you their opinion, because development is a process. You don't just develop a tool – you have time to change, modify, test. Would I use that? Like, if you had somebody like that in the room, they would tell you. If you had two, that would be great, because now you have two different points of evidence. And keep mixing. And then, um, I know it's, like, expensive. Like, you have to do it one way and then get feedback, then do it another way. But I, I think just do more of that. Um, yeah. Um, how do I train? So the training isn't that different. There are some core concepts that we keep. So if I had, like, five days, I would do, like, one or two days on the more technical, uh, concepts of digital safety, which everybody has to do, which is: look, this is my device, this is how it works, this is how I keep it safe. This is my account, this is how it works. This is how I keep it safe.
And then when you have more time, you can dive into the personas. Let's say it's a journalist – is there a resource for them? And this is how you then pull a resource and show it: is there a resource which identifies specific tools developed for journalists? Oh, maybe there is – there is something like a panic button, that one they need. So then you start to put all these things together, and in the remaining time you can kind of, like, hone in on those differences.
Now for women, um, it would be … So if it's HRDs and it's mixed, I still would cover cyber harassment, because it affects everyone. For women it would be slightly different, because maybe we could go into self-defense, we could go into how to deal with it, we could really hone in on the finer points of responding to online harassment. Because in their case it's more likely – because you did a threat model, it's more likely because of their agenda and because of the work that they do. So I think that would be how I would approach the two.
JASON KELLEY: And one, one quick thing that I just, I want to mention that you brought up earlier is, um, shared devices. There's a lot of, uh, solutionism in government, and especially right now with this sort of, assumption that if you just assume everyone has one device, if you just say everyone has their phone, everyone has their computer, you can, let's say, age verify people. You can say, well, kids who use this phone can't go to this website, and adults who use this other phone can go to this website. And this is a regular issue we've seen where there's not an awareness that people are buying secondhand devices a lot, people are sharing devices a lot.
HELEN ANDROMEDON: Yeah, absolutely. Shared devices is the assumption always. And then we do get a few people who have their own devices. So Jason, I just wanted to add one more factor that could be bad. Yeah. For the shared devices, because of the context, and the regions that I'm in, you have also the additional culture and religious norms, which sometimes makes it like you don't have liberty over your devices. So anybody at any one time, if they're your spouse or your parent, they can just take it from you, and demand that you let them in. So it's not necessarily that you could all have your own device, but the access to that device, it can be shared.
CINDY COHN: So as you look at the world of, kind of, tools that are available, where are the gaps? Where would you like to see better tools or different tools or tools at all, um, to help protect and empower the communities you work with?
HELEN ANDROMEDON: We need a solution for the internet shutdowns, because sometimes it can have health repercussions. You could have a serious need, and you don't have access to the internet. So I don't know, we need to figure that one out. Um, the technology is there, as you mentioned before, but, you know, it needs to be more developed and tested. It would be nice to have technology that responds or gives victims advice. Now, I've seen interventions, case by case. So many people are doing them now. Um, you know, they verify, then they help you with whatever. But that's a slow process.
Um, you're processing the information. It's very traumatic. So you need good advice: you need to stay calm, think through your options, make a plan, and then do the plan. So that's the kind of advice I mean. Now, maybe there are apps, but if I'm not using them, maybe that means they're not well known as of now.
Yeah. But that's technology I would like to see. Um, then also everything that is available, the good stuff, it's really good. It's really well written, and it's getting better – more visuals, more videos, more human-like interaction, not just text. And mind you, I'm a huge fan of text, um, like the GitHub text.
That's awesome. Um, but sometimes for just getting into the topic you need a different kind of, uh, ticket. So I don't know if we can invest in that, but the content is really good.
Practice would be nice. So we need practice. How do we get practice? That's a question I would leave to you. How do you practice a tool on your own? It's good for you, but how do you practice it on your own? So it's things like that: helping the person onboard, building resources to help that transition. You want people to use it at scale.
JASON KELLEY: I wonder if you can talk a bit about that moment when you're training someone and you realize that they really get it. Maybe it's because it's fun, or maybe it's because they just sort of finally understand, like, oh, that's how this works. I assume it's something you see a lot, because you're clearly an experienced and successful teacher, but it's just such a lovely moment when you're trying to teach someone something.
HELEN ANDROMEDON: Yeah, I mean, I can't speak for everybody, but I'll speak for myself. So there are some things that surprised me, sitting in a class or in a workshop room, or reading a tutorial: learning how the internet works, reading about the cables, but also reading about electromagnetism. All those things were so different from what we were talking about, which is the internet and civil society, all that stuff. But the science of it, the way it works – for me, I think that's enough, because it's really great.
But then, um, say we are doing a session on how the internet works in relation to internet shutdowns. Is it enough to just talk about it? Are we jumping from problem to solution, or can we give it some time? So that the person doesn't forget, can we give some time to explain the concept? Almost like moving their face away from the issue for a little bit – it's like a little deception.
So let's talk about electromagnetism in a way that you won't forget. Maybe you put two and two together about the fiber optic cables. Maybe you give the right answer to a question at a talk. So it's about trying to make connections, because we don't have that background. We don't have a tech background.
I just discovered Dungeons and Dragons at my age. So we don't have that background of liking tech and playing with it. We don't really have that, at least in my context. So get us there. Be sneaky, but get us there.
JASON KELLEY: You have to be a really good dungeon master. That's what I'm hearing. That's very good.
HELEN ANDROMEDON: Yes.
CINDY COHN: I think that's wonderful and, and I agree with you about, like, bringing the joy, making it fun, and making it interesting on multiple levels, right?
You know, learning about the science as well as, you know, just how to do things that just can add a layer of connection for people that helps keep them engaged and keeps them in it. And also when stuff goes wrong, if you actually understand how it works under the hood, I think you're in a better position to decide what to do next too.
So you've gotta, you know, it not only makes it fun and interesting, it actually gives people a deeper level of understanding that can help 'em down the road.
HELEN ANDROMEDON: Yeah, I agree. Absolutely.
JASON KELLEY: Yeah, Helen, thanks so much for joining us – this has been really helpful and really fun.
Well, that was really fun, and really useful, I think, for people who are thinking about digital security, and for people who don't spend much time thinking about digital security but maybe should start. Something she mentioned that you talked about, the Train the Trainer model, reminded me that we should mention our Surveillance Self-Defense guides, which are available at ssd.eff.org.
We talked about those a little bit. They're a great resource, as is the Security Education Companion website, which is at securityeducationcompanion.org.
Both of these are great things that came up and that people might want to check out.
CINDY COHN: Yeah, it's wonderful to hear someone like Helen, who's really out there in the field working with people, say that these guides help her. We try to be kind of the brain trust for people all over the world who are doing these trainings, but also make it easy: if you're someone who's interested in learning how to do trainings, we have materials that'll help you get started. And as we all know, we're in a time when more people are coming to us and other organizations seeking security help than ever before.
JASON KELLEY: Yeah, and unfortunately there are fewer resources now in terms of funding. So it's important that people have access to these kinds of guides. And that was something we talked about that kind of surprised me: Helen was really, I think, optimistic – not about the funding cuts themselves, obviously, but about what the opportunities for growth could be because of them.
CINDY COHN: Yeah, I think this really is what resilience sounds like, right? You get handed a situation in which you lose a lot of the funding support for the work you're doing, and she's used to pivoting, and she pivots toward, okay, these are the opportunities for us to grow, to build new baselines for the work that we do. And I really believe she's gonna do that. The attitude just shines through in the way she approaches adversity.
JASON KELLEY: Yeah. Yeah. And I really loved, while we're thinking about the, the parts that we're gonna take away from this, I really loved the way she brought up the need for people to feel ownership of the online world. Now, she was talking about infrastructure specifically in that moment, but this is something that's come up quite a bit in our conversations with people.
CINDY COHN: Yeah, her framing of how important the internet is to people all around the world, you know, the work that our friends at Access Now and others do with the #KeepItOn coalition to try to make sure that the internet doesn't go down. She really gave a feeling for just how vital and important the internet is for people all over the world.
JASON KELLEY: Yeah. And even though, you know, some of these conversations were a little bleak in the sense of, you know, protecting yourself from potentially bad things, I was really struck by how she sort of makes it fun in the training and sort of thinking about, you know, how to get people to memorize things. She mentioned magnetism and fiber optics, and just like the science behind it. And it really made me, uh, think more carefully about how I'm gonna talk about certain aspects of security and, and privacy, because she really gets, I think, after years of training what sticks in people's mind.
CINDY COHN: I think that's just so important. I think that people like Helen are this really important kind of connective tissue between the people who are deep in the technology and the people who need it. And you know that this is its own skill and she just, she embodies it. And of course, the joy she brings really makes it alive.
JASON KELLEY: And that's our episode for today. Thanks so much for joining us. If you have feedback or suggestions, we'd love to hear from you. Visit eff.org/podcast and click on listen or feedback. And while you're there, you can become a member and donate, maybe even pick up some of the merch, and just see what's happening in digital rights this week and every week.
Our theme music is by Nat Keefe of BeatMower with Reed Mathis, and How to Fix the Internet is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. We'll see you next time. I'm Jason Kelley.
CINDY COHN: And I'm Cindy Cohn.
MUSIC CREDITS: This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by its creators: Drops of H2O (The Filtered Water Treatment) by J.Lang. Sound design, additional music, and theme remixes by Gaëtan Harris.