Podcast Episode 14 – Right to Privacy: Physical and Digital Safety


CW/TW: This article discusses the implications that privacy and data can have in situations of domestic abuse and violence.

Zoë Rose is a cybersecurity consultant recognized as one of the 50 most influential women in cybersecurity. In this episode, Zoë explains why it is important for the average person to be aware of privacy and shares tips for staying safe.


Spotify: https://open.spotify.com/episode/6nLwLKkLJqx0CE8fdFSMTm?si=G9vDitx3Qf28diTxz-Du3Q
Stitcher: https://www.stitcher.com/podcast/the-tripwire-cybersecurity-podcast
RSS: https://tripwire.libsyn.com/rss
YouTube: https://www.youtube.com/playlist?list=PLgTfY3TXF9YKE9pUKp57pGSTaapTLpvC3

Tim Erlin: Welcome everyone to the Tripwire Cybersecurity Podcast. I’m Tim Erlin, vice president of product management and strategy at Tripwire. Today, I am joined by Zoë Rose, a cybersecurity consultant who has been recognized as one of the 50 most influential women in the United Kingdom’s cybersecurity industry for the past two years. Welcome, Zoë.

Zoë Rose: Cheers. Thanks. Yeah, I didn’t lose my coolness for the second year, which is good.

Framing the Right to Privacy

TE: I thought it would be good for us to talk about privacy. There’s a lot of debate around privacy from social media privacy to encryption back doors. These debates go back and forth and change in terms of prevalence. But I wanted to ask you: why is it important for the average person to have a right to privacy?

ZR: You’ll often hear the phrase, “If you’ve got nothing to hide, you’ve got nothing to fear.” But I have a drastic example. Looking back to WWII, I think of René Carmille, a French military officer and arguably one of the earlier ethical hackers, who sabotaged the Nazi census in France and saved tens of thousands of Jewish people. I know that’s a bit of a drastic example, but the reality is that your right to privacy can protect you. Things change, after all. What if there’s corruption? What if something not so positive happens? I think the average person looks at their life and says, “I want to live free, not harm anyone and retain my right to access the things that I need and want. I should be able to do that in a way that’s safe for me.” That does often mean being able to do it privately.

If we also look at innovation—that side of things of not only safety but also people making cool discoveries, making cool changes and solving problems—it actually flourishes when we feel safe to make those mistakes. So, privacy can actually help us be more innovative, as well.

I think overall the biggest concern from my perspective is being able to grow up in a world that doesn’t take everything you’ve said out of context and use it against you. And only privacy is going to help with that. I didn’t grow up with the Internet all around me all the time. And so, I’m quite happy that I don’t have baby pictures or foolish videos of me as a child floating around the Internet as much as some other people do. I think that helps me in my life.

If you’re okay with absolutely everything you do being public knowledge, you’re relying on nobody to take advantage of that. There’s that assumption that everybody has the best intentions. But I can promise you, that’s not the case.

TE: Yeah. This idea of abuse is one that I think is maybe hard for the average person to imagine or think about. But I think that that certainly happens.

ZR: Definitely. And it changes. Like when everybody moves off of a messenger app because its terms and conditions have changed. Lots of people originally might sign up saying, “Oh, it’s end-to-end encrypted. It’s safe. We’re okay.” What they didn’t realize is that can change. In the future, if somebody purchases it or if somebody gains access to that data, they will use it.

TE: This is an interesting angle, this idea that your data should remain yours regardless of changes in ownership over a company or an app. That’s a difficult one because in many ways, one of the reasons for one company to buy another is for access to the data that they have.

ZR: Exactly. And on the other side, maybe we’ve implemented privacy and security by design, but we’re human. Even if we’re the developers and have the best intentions, there will be times when we make mistakes. So, the typical user, the end user, needs to acknowledge that.

Privacy vs. Anonymity

TE: This is challenging because I think a right to privacy is one thing that we can write into law. But what does it mean in practice to keep your data private? Like, is it possible to use social media and actually maintain privacy?

ZR: I would say yes, but there are degrees of privacy.

I would say the average user probably doesn’t need to be 100,000 billion percent anonymous, and it’s quite expensive, and it’s quite difficult to do that. I have a background in intelligence and investigations using data like open-source intelligence, which is investigating publicly available data. The one thing that I would constantly get asked is, “I want to be a hundred percent anonymous. How do I do that?” And I would say to them, “You don’t necessarily need to be a hundred percent anonymous. You know, you want to be private.”

There are ways to do it in a way that works for you. Yes, you can use social media. Yes, you can use the Internet of Things. But you have to do it with a conscious decision. That means to be aware of what device or what platform you’re using. Where is that data being sent or stored? And who’s it being shared with?

If I’m a social media user, for example, and I have my community of friends and family, who is that audience? Do I know all of those people? One thing that often comes up when I speak to younger people is the acknowledgement that not everybody online has my best interests at heart. Maybe they’re not actually the person that they say they are.

I know it sounds very negative. But realizing that actually helps us design our privacy plan or our online platform plan so that we can maintain it day to day. Sometimes that means taking accounts offline that we don’t use or don’t actually get benefit from. And sometimes, that means sharing content but knowing what that impact is.

TE: Yeah. By posting a photo that’s perceived as completely innocent, you could give away a bunch of information that you might not want to give away, for instance.

The Privacy Challenges of Domestic Abuse

TE: I understand you’ve volunteered helping survivors of domestic violence protect their privacy and data. Tell me a little bit about that volunteer work that you’re doing.

ZR: Yeah, so I currently volunteer with an organization called “Operation Safe Escape.” The aim of that organization is to support survivors of domestic abuse. There’s a huge community of people who do things like open-source intelligence, such as investigating a survivor’s footprint online, and securing legal advice, as well as offering training on how to take back control of your accounts, retain that control and do it safely. The one thing that people forget is that in a situation of domestic abuse and violence, taking back control is dangerous.

I’ve been a survivor of domestic abuse and violence. I have left that situation, and that’s actually how I got into my current career. And I can promise you the unknown is absolutely terrifying at that time. I was not technical, and I actually had to be self-taught, and that’s how, again, I built this career. But it’s very scary. It’s very scary. And when you make that decision that you are going to leave, that’s actually the most dangerous part.

The reality is when you are taking back control, you are actually in a dangerous situation. Therefore, you have to do it in a way that’s safe. And that does often mean having a community that’s able to guide you through that safely.

TE: You brought up something there that seemed really important to me because if someone were to ask me for that advice, my first response would be, “Yeah, you should start locking down your accounts, making sure that people don’t have access.” But it sounds like you’re saying that that’s not necessarily the step you should take.

ZR: That’s right, yeah. That’s not the first step I would take at all. What happened in my situation is I didn’t do that. I left, and they retained access to my email at the time and I think a couple of my social media accounts. I eventually figured it out for some of them. The reason I did that was because I was scared of this person. They threatened me, and they had quite a bit of control over me, and I didn’t know the extent of what they would do, but also, I didn’t know the extent of what they could do. That is a common theme I have seen with many people: not understanding what impact or what control the abuser actually has, especially with IoT and smart devices.

When you make that choice that you’re leaving, that abuser is losing control, and they acknowledge that. And that’s a threat to them. If they can’t control you, you could leave, you could be happy, and it wouldn’t require them. And that sense of control is very critical to them. And so actually, by removing their access, you are potentially putting yourself further into danger. It’s an escalation. And so, it’s about making that balanced choice.

Making that choice needs to be done in a safe way. If you do need to get out of there quickly, you need a way of doing it. We recommend leaving the devices behind because there are different things that the abuser could use. Maybe they could claim you stole their device. Maybe they can still monitor it, and maybe they can track that device. So, it’s actually much safer to start fresh with brand new devices and accounts.

TE: It’s incredibly scary to think about. And of course, it’s not something that you generally do think about until you have to, really.

ZR: Oh, completely. I’m not saying I know everything, mind you. I’ve been through that situation, and therefore I’ve had to consider it, as you said. Open-source intelligence to me was very interesting because I had it used against me before I even knew the term. And so, I try to consider all of those different things, but there are situations that I haven’t considered. So, I can’t blame a developer, a company or a team of developers that maybe didn’t consider that threat model. But what I can say is if you’re going to create this solution, you do need to include a diverse team so that these situations are considered. Or you make it clear that there is a risk to the user.

So, I think there is the responsibility of the everyday user to protect themselves. Take back control, for sure. But there’s also the responsibility of the manufacturers, the producers, the people developing these solutions, to consider whom they’re marketing to and what that means for those users’ privacy and safety.

TE: So, tell us again the name of the organization that you volunteer for and that can help people if they have questions.

ZR: Yeah. Operation Safe Escape. You can learn more on their website.

TE: Excellent. Unfortunately, I think we’re out of time at this point. Zoë Rose, I want to thank you for spending time with us. It was an interesting conversation. Thank you for joining us.

ZR: Lovely. Thank you for having me. And hopefully, if anybody does have any questions or wants to learn more, they could feel free to reach out to me on social media or through Tripwire. I’ve written a few articles about taking back control. So, there are resources out there if they need them.

TE: Thank you so much. And for everyone listening, I hope it was interesting and enjoyable. We’ll look forward to you joining us for the next episode of the Tripwire Cybersecurity Podcast.




