What if the biggest vulnerability in your security system isn’t a line of code but a person? That’s the question driving one of cybersecurity’s most compelling thinkers today. In this exclusive interview, we sit down with Dr. Sheeba Armoogum, Associate Professor in Cybersecurity at the University of Mauritius: a researcher, author, and strategist whose work spans AI-driven threat detection, digital forensics, cyberpsychology, and quantum security.
With over two decades of experience across academia, research, and industry, Dr. Sheeba Armoogum has built a reputation for asking the questions others overlook.
What happens to your digital identity after you die? Why do technically sound systems still get breached? And why does cybersecurity still struggle to include the very diversity of thinking it desperately needs?
Her recently published book, Digital Afterlife: A Global Framework for Law, Technology and Victim Justice, is already reshaping conversations around digital legacy and governance, a field most security professionals haven’t even begun to map.
From the psychology behind cyberattacks to ethical AI design, from mentoring the next generation of women in cybersecurity to building systems that are not just intelligent but accountable — Dr. Sheeba Armoogum doesn’t just defend networks. She defends trust.
Read on for the full Dr. Sheeba Armoogum interview:

Dr. Sheeba Armoogum Interview on Women, Leadership, and Structural Change
TCE: You have worked across academia, research, and industry for over two decades. What first inspired you to pursue cybersecurity, and how has your journey evolved over the years?
Dr. Sheeba Armoogum: My journey into cybersecurity did not begin with a grand plan. It started with curiosity — an urge to understand how systems think, respond, and connect to the world. This fascination also led me to a realisation: as we became more interconnected, our vulnerability increased. I saw how easily systems could be compromised and how breaches affected not just data but people’s finances, privacy, and sense of security. Cybersecurity transformed from a technical field into a deeply human concern.
My doctoral research marked a significant turning point. It encouraged me to rethink not only how we block known threats but also how to build systems that can adapt, learn, and evolve. As my work progressed, I explored how AI could detect patterns humans might overlook, how digital forensics could protect justice, how cyberpsychology could explain why people become victims of manipulation, and how quantum cybersecurity could redefine what ‘secure’ truly means. Today, I no longer see cybersecurity merely as protecting infrastructure. I consider it as safeguarding trust.
TCE: Cybersecurity is constantly evolving with AI, quantum technologies, and digital forensics. Which emerging area do you believe will most reshape the future of cyber defense?
Dr. Sheeba Armoogum: Artificial Intelligence will transform cyber defense in ways we’re only beginning to understand. Historically, security has been reactive: an attack occurs, a signature is created, and a patch is released. We are now shifting towards an era where systems must anticipate threats proactively. What excites me is the ability of AI-driven systems to detect subtle behavioural changes — minor anomalies potentially indicating an early breach before any damage occurs.
At the same time, I remain cautious. When AI systems operate as black boxes, making decisions that even their creators can’t fully explain, we face a different kind of vulnerability. Security architectures should be intelligent, yet also auditable, transparent, and ethically aligned. I envision systems that safeguard not only networks but also public confidence. Ultimately, cyber defence revolves around maintaining trust within a digital society.
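To make the behavioural-anomaly idea concrete (this is an illustrative sketch, not Dr. Armoogum’s actual system), here is a minimal statistical baseline: flag activity that deviates sharply from a user’s historical pattern using a z-score. Real AI-driven detectors model far richer behaviour, but the intuition is the same.

```python
import statistics

def is_anomalous(history, new_value, threshold=3.0):
    """Flag an observation more than `threshold` standard deviations
    from the historical baseline. A toy stand-in for the behavioural
    models production systems use."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return new_value != mean
    z = abs(new_value - mean) / stdev
    return z > threshold

# Hypothetical example: one user's daily login counts
baseline = [4, 5, 6, 5, 4, 6, 5, 5, 4, 6]
print(is_anomalous(baseline, 5))    # a typical day
print(is_anomalous(baseline, 40))   # a sudden spike worth investigating
```

The design trade-off Dr. Armoogum raises applies even here: a simple statistical rule like this is fully auditable and explainable, whereas a black-box model may detect more but justify less.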
TCE: As a global advocate for innovation and research, what are the biggest challenges women still face in cybersecurity, especially in leadership and technical research roles?
Dr. Sheeba Armoogum: While progress is visible, it is not yet deeply rooted structurally. More women are joining the field, but just entering the profession doesn’t equate to having influence. Many women begin in operational or support roles, but fewer hold positions in advanced areas like algorithmic research, secure systems architecture, or strategic advisory roles where long-term security decisions are made.
A subtle issue lies in how credibility is perceived. Women often need to repeatedly demonstrate their expertise before receiving recognition. Addressing this cannot depend only on encouragement — it demands institutional maturity, with deliberate access to fair research funding, structured doctoral mentorship, and inclusion in international research consortia. Representation in patent development, standards committees, and strategic innovation boards shapes the future of the field. Cybersecurity depends on diverse thinking, and when leadership includes a variety of experiences, overall resilience improves.
TCE: This year’s Women’s Day theme focuses on “Give to Gain.” What does this idea mean to you in the context of mentoring and empowering the next generation of women in cybersecurity?
Dr. Sheeba Armoogum: For me, “Give to Gain” reflects how cybersecurity operates in reality. No system is completely secure by itself — resilience requires a collective effort, and sharing knowledge strengthens protection. I now see mentorship as more than generosity; it’s a strategic investment in future stability. When young researchers are entrusted with complex algorithmic challenges or guided in ethical AI design, they are not merely acquiring knowledge; they are becoming integral to the next line of defence.
When expertise is limited to a few individuals, systems become more fragile. When knowledge is shared thoughtfully, ecosystems are strengthened. In cybersecurity, giving is not a loss. It is multiplication.
TCE: You lead and mentor doctoral researchers through your CyberSecurity & Forensics Research initiatives. What are three practical steps organizations can take to encourage more women to enter advanced cybersecurity research?
Dr. Sheeba Armoogum: Our strategy should extend beyond motivational messages. First, organizations must establish well-defined, funded pathways into high-impact technical disciplines. Opportunities in AI-based intrusion detection, quantum-safe cryptography, or advanced digital forensics must be deliberately made accessible, so that women become integral contributors at the foundational level.
Second, exposure plays a crucial role. True confidence is gained through hands-on experience — working in AI labs, contributing to secure system designs, or analysing real forensic datasets builds both technical skills and intellectual authority. Third, visibility holds significant influence. When women lead keynote technical sessions, showcase new algorithms, or participate in standards committees, it signals that leadership is not exceptional; it is normal. Aspiration is shaped by what appears achievable.
TCE: Your recent book, Digital Afterlife: A Global Framework for Law, Technology and Victim Justice, explores an important emerging topic. What inspired you to write it, and why is digital legacy becoming a critical cybersecurity and policy concern?
Dr. Sheeba Armoogum: Digital Afterlife originated from a recurring question: what happens to our digital footprint when we’re gone? Cybersecurity conversations focus on breaches and encryption but often overlook what remains — digital identities, intellectual property, cloud storage, social media profiles, and AI models trained on personal information. Our legal and governance frameworks lag behind. When someone passes away, their digital footprints don’t disappear; they persist. Families are left managing passwords, privacy policies, and legal uncertainties during moments of grief.
Digital legacy has shifted from a philosophical concern to a practical security issue. Dormant accounts can be exploited for identity theft, unmanaged digital wallets are vulnerable, and research data may become compromised. The book provides a framework combining law, cybersecurity protocols, platform governance, and victim justice. Managing the digital afterlife is not optional; it is an increasingly important responsibility. Safeguarding dignity must extend beyond life itself.
TCE: From a cybersecurity and digital forensics perspective, what should individuals and organizations start doing today to better manage digital footprints and digital assets after death?
Dr. Sheeba Armoogum: A significant part of our value — personal, intellectual, or economic — resides digitally. Individuals must treat digital assets with the same importance as physical property: online accounts, intellectual property, research data, digital wallets, and professional platforms all need to be accounted for. Estate planning now needs to include digital credentials and instructions — documenting digital footprints, clarifying data intentions, and ensuring lawful, secure transfer of access.
Organizations share a similar responsibility. They should proactively create access procedures, developing structured data governance policies, clear transfer protocols, and memorialisation frameworks in advance. Without proactive planning, digital remnants can lead to identity theft, internal disputes, or legal issues. Cybersecurity must now focus on lifecycle management, understanding that digital systems outlive individuals, and governance should be structured to reflect this.
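One concrete starting point for the inventory Dr. Armoogum recommends is a structured record of digital assets with an intended disposition per item. The sketch below is purely illustrative; the field names and categories are assumptions, not a standard schema or one drawn from the book.

```python
from dataclasses import dataclass

@dataclass
class DigitalAsset:
    """One entry in a personal or organisational digital-asset
    inventory. Field names here are illustrative only."""
    name: str
    category: str     # e.g. "account", "wallet", "research-data"
    location: str     # service or storage location
    instruction: str  # intended disposition: transfer / memorialise / delete

def summarise(inventory):
    """Count assets per intended disposition, e.g. for an estate review."""
    counts = {}
    for asset in inventory:
        counts[asset.instruction] = counts.get(asset.instruction, 0) + 1
    return counts

inventory = [
    DigitalAsset("Email", "account", "mail provider", "delete"),
    DigitalAsset("Thesis data", "research-data", "cloud storage", "transfer"),
    DigitalAsset("Profile", "account", "social platform", "memorialise"),
]
print(summarise(inventory))
```

Even a simple register like this gives executors and governance teams something actionable instead of leaving families to reconstruct accounts during grief.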
TCE: You work at the intersection of AI, cybersecurity, and cyberpsychology. How do you see human behavior influencing future cyber threats and defense strategies?
Dr. Sheeba Armoogum: Cyber threats now mainly target individuals rather than systems. The key vulnerability is often psychological. Social engineering is about manipulating trust. AI-generated impersonations sound convincing because they replicate familiarity. Sextortion tactics rely on fear and shame, while misinformation campaigns exploit biases and emotional reactions. Attackers analyse behaviour as carefully as they inspect infrastructure.
This is why relying solely on technical security measures is insufficient. Cyber resilience must extend beyond architecture to include behavioural science, digital literacy, and psychological awareness. Analysing why people become victims reveals recurring patterns — emotional triggers, situational stress, and social influences. Recognising these patterns helps develop more effective training and awareness campaigns. Protecting systems depends on understanding people.
TCE: As both a CIO/CISO-level strategist and academic leader, how do you balance technical innovation with ethical responsibility, especially in AI-driven security environments?
Dr. Sheeba Armoogum: Every intelligent system makes decisions, but the key questions are whether those decisions are understandable, auditable, and justifiable. Bias auditing, explainability, and traceability are not mere administrative tasks; they are essential safeguards. Without them, there is a risk of embedding hidden biases or opaque processes into security systems. In high-stakes environments, there’s often a push toward speed — but prioritising quick results without ethical oversight causes long-term instability. A system that functions well but isn’t accountable will eventually erode trust.
I do not see ethics as a barrier to innovation but as a means of stabilising the structure. Responsible innovation guarantees that as our systems grow more intelligent, they stay fair, transparent, and justifiable.
TCE: What advice would you give to young women aspiring to build impactful careers in cybersecurity, particularly those who may feel intimidated by the technical depth of the field?
Dr. Sheeba Armoogum: Take both the field and yourself seriously. Begin by mastering the fundamentals — understand how data moves through networks, how encryption protects information, and how AI models learn. A strong foundation naturally boosts confidence. Start exploring early, even if it feels overwhelming. Genuine progress happens when you apply theory to real-world challenges: designing, building, testing, and sometimes failing before achieving success.
Do not let technical intimidation take over. True expertise is based on understanding, not volume. What truly matters in cybersecurity is competence, curiosity, and courage — the willingness to ask difficult questions and challenge assumptions. Your perspective is not just an addition to the field; it is vital to its development. Diversity in thinking improves architecture, refines threat modelling, and drives innovation. Your contribution is not minor; it is crucial.