When open science meets real-world cybersecurity

Scientific research environments are built for openness and collaboration, often prioritizing long-term discovery over traditional enterprise security.

In this Help Net Security interview, Matthew Kwiatkowski, CISO at Fermilab, America’s particle physics and accelerator laboratory, discusses where cybersecurity blind spots emerge, why availability can outweigh confidentiality, and how security teams protect complex, legacy-driven research infrastructure while supporting scientific progress.

What kinds of security blind spots emerge when infrastructure is designed by scientists rather than security engineers?

This has been improving over the last decade. For context, I’ve worked in the IT/cyber space across the Department of Energy (DOE) laboratory complex for the last 25 years. IT leaders and CIOs across the DOE complex have made great strides in consolidating commodity IT and scientific IT, which can have drastically different configurations.

Without that collaboration, the product that usually emerges is a great scientific specimen with a very risky implementation. That risk is usually caught by normal cyber processes and reduced accordingly. Scientists who see the value in IT/cyber collaboration, however, also end up with a great scientific specimen, but with managed risk in the implementation and almost no measurable negative impact or cost. We’ve seen that when collaboration is planned into a project very early on, cybersecurity can provide value.

When people hear “science infrastructure,” they often think of universities or national labs, but from a cybersecurity perspective, what types of environments are most misunderstood or overlooked?

At Fermilab, we host a lot of scientific information that is releasable to the public. The types of data can range from highly technical papers to terabytes of experimental data. We routinely get reports that we are leaking information that we shouldn’t be!

We get around one report a month from concerned members of the public about the amount of information available on our services, especially the complex high-energy physics data that is authorized for public release. Cybersecurity researchers hunting for issues on the internet often stumble onto the laboratory’s IT footprint and claim that we are leaking non-public information. We clearly label and denote information that is releasable to the public, but it always seems some folks are quicker to report than to read the dissemination labels.

What assumptions about trust, openness, or collaboration in scientific communities most frequently become security liabilities?

Fermilab has a great responsibility to make sure that our affiliates (partners from academia) and the people who come to our facilities to perform experiments are authorized for those activities.

In the world of high-energy physics, experiments can take 5-10 years of data collection and analysis. Early career scientists and researchers from academia usually move through multiple institutions during that time, and every move outside our organization requires re-vetting. Keeping collaborations active while continually re-vetting relationships and institutions can be daunting.
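
A hypothetical sketch of the trigger described above, in Python (all names and fields are invented for illustration; real vetting involves far more than a code check):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Affiliate:
    name: str
    current_institution: str
    vetted_institution: str   # institution on record at the last review
    vetting_expires: date

def needs_revetting(a: Affiliate, today: date) -> bool:
    # Any institutional move, or an expired review, forces re-vetting,
    # even in the middle of a multi-year experiment.
    return (a.current_institution != a.vetted_institution
            or today >= a.vetting_expires)

postdoc = Affiliate("A. Researcher", "University B", "University A",
                    date(2027, 1, 1))
print(needs_revetting(postdoc, date(2026, 6, 1)))  # True: moved institutions
```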

How should security teams recalibrate their approach when availability for research is treated as more mission critical than confidentiality?

Thankfully, in the C.I.A. model (confidentiality, integrity, and availability), when referencing the NIST frameworks, moderate integrity and/or availability uses the same control set as moderate confidentiality. So there does not have to be as much recalibration as you might think.
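
To make that concrete: under FIPS 199/200, a system’s overall categorization is the high-water mark across the three objectives, so a moderate rating on availability alone already selects the moderate control baseline. A minimal Python sketch of that rule (an illustration of the standard, not Fermilab tooling):

```python
# High-water mark rule from FIPS 199/200: the overall system impact
# level (which selects the NIST SP 800-53 baseline) is the maximum of
# the confidentiality, integrity, and availability impact levels.
LEVELS = {"low": 1, "moderate": 2, "high": 3}

def control_baseline(c: str, i: str, a: str) -> str:
    """Return the highest impact level across C, I, and A."""
    return max((c, i, a), key=LEVELS.__getitem__)

# An availability-driven research system and a confidentiality-driven
# business system land on the same moderate baseline.
print(control_baseline("low", "low", "moderate"))   # moderate
print(control_baseline("moderate", "low", "low"))   # moderate
```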

However, confidentiality is not a primary concern. For publicly releasable information, for example, local Authorizing Official acceptance exempts us from encrypting the large storage areas of data that are open to the public. Encryption at rest is really a control to prevent data loss when the storage medium is no longer in your control. So once data has been reviewed for public release, we don’t spend the extra time, effort, and money to apply a control that provides no value to either the implementation or to our cyber posture.
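
As a hypothetical sketch of that reasoning (the field names and policy logic here are invented for illustration, not Fermilab’s actual process), encryption at rest only pays off where there is confidentiality to protect:

```python
from dataclasses import dataclass

@dataclass
class DataStore:
    name: str
    publicly_releasable: bool  # reviewed and approved for public release
    confidentiality: str       # "low", "moderate", or "high"

def needs_encryption_at_rest(store: DataStore) -> bool:
    # Encryption at rest mitigates data loss when the storage medium
    # leaves your control; public data has no confidentiality to
    # protect, so the control adds cost without reducing risk.
    if store.publicly_releasable:
        return False
    return store.confidentiality in ("moderate", "high")

print(needs_encryption_at_rest(DataStore("hep-open-data", True, "low")))       # False
print(needs_encryption_at_rest(DataStore("proprietary-db", False, "moderate")))  # True
```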

The flip side is that Fermilab also has typical business systems and proprietary data stores that do fall into a moderate controls protection scheme for confidentiality, and those systems require a more standard cyber approach.

Many research environments rely on highly specialized, aging, or custom-built systems. From your experience, which legacy technologies pose the greatest long-term cyber risk, and why are they so hard to replace?

Fermilab runs a particle accelerator. That accelerator is built as a one-off precision machine. You can imagine there are many custom IT and OT parts that run that machine. The replacement of components is not on a typical IT replacement schedule.

This can mean longer-than-ideal technology refresh cycles. The risk is that integrating modern cyber technology into an older IT/OT technology stack has its challenges. Thankfully, the DOE has expertise in specialized OT systems and has partnered with agencies like DHS/CISA, which have released common-sense guidance and approaches to help with this challenge.

Additionally, the DOE recognizes this challenge internally and has created the Center of Excellence for Operational Technology (CoE4OT), which I co-chair with a representative from the National Nuclear Security Administration to help the entire DOE with this grand challenge. Our initial findings suggest that if resources are put toward the challenge, it can be overcome through workforce upskilling, architecture, configuration, and continuous monitoring.


