Enhancing firewall management with automation tools


In this Help Net Security interview, Raymond Brancato, CEO at Tufin, discusses the considerations organizations must weigh when selecting a next-generation firewall to effectively balance security needs with network performance.

What factors should organizations prioritize when selecting a next-generation firewall to balance security needs with network performance?

You already said the most important part – security needs, usability, and network performance must be balanced, and it’s not always easy to find that common ground.

When selecting a firewall, the first step an organization must take is a deep dive into the existing network infrastructure. Evaluating the current network architecture, security requirements, and any key assets or sensitive data will help a team determine the most suitable firewall type and configuration.

Be sure to focus on key variables such as which network access levels must be supported, the desired level of security, internal segmentation (e.g., a DMZ), regulatory and compliance requirements, the complexity of the network topology, and common and potential threat types. Emphasizing these factors as a firewall is selected will help ensure that the choice ultimately reflects the unique challenges and needs of the organization.

Engaging all relevant stakeholders early in the process is another important action, as it will help align goals and expectations across the organization. Failure to do this can result in distrust of the firewall and a feeling among developers, for example, that the firewall is a hurdle they must overcome in order to be productive.

Are there any common misconceptions or mistakes businesses make when using firewall management tools?

One of the biggest mistakes organizations (and security teams) can make is to think that firewalls are just ‘set it and forget it’ tools. Regularly revisiting and adjusting the organization’s firewall ruleset to adapt to evolving network demands – and threats – is essential. Failure to do so leaves the organization open to attack.

Many organizations also retain a dated, negative view of firewalls. They are still often equated by non-security professionals with something that can slow down network performance and limit the effectiveness of developers. While there may have been some truth to these concerns in the past, modern firewalls are designed to find the balance between security and performance. In fact, if there is any slowdown in performance, it’s usually a case of firewalls being configured incorrectly.

While firewalls do need to be regularly reviewed and updated, another common mistake is to manually manage and operate every action. Automation is an excellent tool for security professionals, and many aspects of a modern firewall can be automated – from checking and granting (or denying) access to deploying new connections. ‘Set it and forget it’ is a mistake, but so is skipping automation: firewall management that is not automated wherever possible can easily overburden security teams.
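To make that concrete, here is a minimal Python sketch of the kind of access check that lends itself to automation: a request is evaluated against a simplified ruleset and granted or denied automatically. The rule fields, addresses, and default-deny behavior are illustrative assumptions, not any particular vendor’s API or policy model.

```python
from dataclasses import dataclass
from ipaddress import ip_address, ip_network

@dataclass
class Rule:
    """A single, simplified firewall rule (illustrative fields only)."""
    source: str   # CIDR the traffic may originate from
    dest: str     # CIDR the traffic may reach
    port: int     # destination port
    action: str   # "allow" or "deny"

# Hypothetical ruleset for an app tier talking to an API tier.
RULESET = [
    Rule("10.0.1.0/24", "10.0.2.0/24", 443, "allow"),  # app tier -> API tier, HTTPS
    Rule("0.0.0.0/0",   "10.0.2.0/24", 22,  "deny"),   # no external SSH to API tier
]

def evaluate_request(src: str, dst: str, port: int) -> str:
    """Return the action of the first matching rule, defaulting to deny."""
    for rule in RULESET:
        if (ip_address(src) in ip_network(rule.source)
                and ip_address(dst) in ip_network(rule.dest)
                and port == rule.port):
            return rule.action
    return "deny"  # default-deny posture when nothing matches

if __name__ == "__main__":
    print(evaluate_request("10.0.1.15", "10.0.2.30", 443))   # allow
    print(evaluate_request("203.0.113.7", "10.0.2.30", 22))  # deny
```

A check like this can run against every incoming access request, leaving only the ambiguous or high-risk cases for a human to review.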

Poor firewall configurations have been linked to major data breaches. What steps should organizations take to ensure their firewalls are configured correctly and securely from the start?

An organization’s firewall policy sets the framework for inbound and outbound traffic management, administrative rights and access, which threats should be blocked, and how best to adhere to regulatory and organizational compliance standards. I’ve often heard firewalls referred to as the key tool for keeping security teams sane.

There’s always room for adjustments to be made as network environments expand and evolve, but when it comes to firewalls, even more than other solutions, it is critical to start the process correctly.

The configuration process should begin with a thorough examination of the organization’s existing network infrastructure. To operate optimally, firewall settings should be tailored to the specific needs and requirements of each organization.

When configuring a firewall to govern incoming and outgoing traffic, it is important to understand the elements and endpoints of the new firewall (and of the old one, if only adjustments are being made). These include firewall policies, NAT rules, VPN and/or VLAN settings, IP addresses, and any authentication, validation, or SSL processes connected to them.

Sometimes more access is warranted; sometimes tighter protections are needed. The process should identify key assets, sensitive data, and any potential vulnerabilities. Steps should also be taken to ensure that policies from an old firewall are carried over, as long as they are still applicable and relevant.
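As a rough illustration of that carry-over review, the Python sketch below screens rules exported from an old firewall before they are re-applied: it flags rules that are overly broad and drops rules that reference networks no longer present in the new environment. The rule format and the “live networks” inventory are assumptions made for the example.

```python
from ipaddress import ip_network

# Networks that actually exist in the new environment (hypothetical inventory).
LIVE_NETWORKS = [ip_network("10.0.1.0/24"), ip_network("10.0.2.0/24")]

# Rules exported from the old firewall, in a simplified form.
OLD_RULES = [
    {"name": "app-to-api", "source": "10.0.1.0/24", "dest": "10.0.2.0/24", "port": 443},
    {"name": "legacy-db",  "source": "10.0.9.0/24", "dest": "10.0.2.0/24", "port": 5432},
    {"name": "any-any",    "source": "0.0.0.0/0",   "dest": "0.0.0.0/0",   "port": 0},
]

def too_broad(rule: dict) -> bool:
    """Flag any-any style rules for manual review rather than silent migration."""
    return rule["source"] == "0.0.0.0/0" or rule["dest"] == "0.0.0.0/0"

def still_applicable(rule: dict) -> bool:
    """A rule is carried over only if both endpoints map to live networks."""
    src, dst = ip_network(rule["source"]), ip_network(rule["dest"])
    return (any(src.subnet_of(net) for net in LIVE_NETWORKS)
            and any(dst.subnet_of(net) for net in LIVE_NETWORKS))

for rule in OLD_RULES:
    if too_broad(rule):
        print(f"REVIEW  {rule['name']}: overly permissive, do not migrate as-is")
    elif not still_applicable(rule):
        print(f"DROP    {rule['name']}: references networks not in the new environment")
    else:
        print(f"MIGRATE {rule['name']}")
```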

It is important to test the firewall following the initial setup. Teams should verify that all rules have been applied correctly and that authorized traffic is getting through the firewall as planned. From here, ongoing review and management are a must. Establish the review and adjustment process from the start, so rules are adjusted and kept correctly configured in response to new threats and evolving needs.
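One simple way to spot-check the result from a host on the network is to attempt connections to endpoints that should and should not be reachable, as in the Python sketch below. The hosts, ports, and expected outcomes are placeholders; a real test plan would be derived from the organization’s own ruleset.

```python
import socket

# (host, port, expected_reachable) - placeholder values for illustration.
CHECKS = [
    ("10.0.2.30", 443, True),   # HTTPS to the API tier should be allowed
    ("10.0.2.30", 22,  False),  # SSH to the API tier should be blocked
]

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, unreachable, etc.
        return False

for host, port, expected in CHECKS:
    actual = is_reachable(host, port)
    status = "OK" if actual == expected else "MISMATCH"
    print(f"{status}: {host}:{port} reachable={actual}, expected={expected}")
```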

Misconfigurations can sometimes lead to legitimate traffic being blocked by firewalls. What are the best practices for minimizing false positives while maintaining security protocols?

If left unaddressed, false positives can quickly become more than a simple annoyance. Not only do they divert time and attention away from actual problems the security team needs to address, they can also create alert fatigue among both everyday employees and the security team itself. It’s easy to ignore a problem and hope it goes away, but when it comes to false positives, teams need to find the root cause of legitimate traffic being blocked.

More often than not, a false positive is the end result of a misconfiguration. To both identify existing issues and prevent new misconfigurations, organizations need to establish operational guardrails and embrace automation.

These guardrails are essentially boundaries that specific aspects of the system – IP addresses, user/app IDs, network traffic, etc. – are required to operate within. In other words, they are rules on who can talk to whom, and what can talk to what.

By automating the review of these guardrails, busy security teams can rest easy knowing that any deviations from the established rules will be identified – and that they will receive an immediate notification if there is an issue, so they can respond quickly. Organizations can take this a step further by automating specific responses to requests and rule deviations, removing a mundane, time-consuming process from the plates of their security teams altogether, while ensuring that networks continue to perform at a high level.
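As a rough sketch of what such an automated review could look like, the Python example below compares observed traffic flows against a small set of guardrails and logs a warning for each deviation. The zone names, guardrail format, and flow records are hypothetical, and a real deployment would pull flows from the firewall or flow logs rather than a hard-coded list.

```python
import logging
from ipaddress import ip_address, ip_network

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")

# Guardrails: which zone may talk to which zone, and on what port (hypothetical).
ZONES = {
    "web": ip_network("10.0.1.0/24"),
    "app": ip_network("10.0.2.0/24"),
    "db":  ip_network("10.0.3.0/24"),
}
ALLOWED = {("web", "app", 443), ("app", "db", 5432)}

def zone_of(ip: str):
    """Map an IP address to its zone name, or None if it is outside all zones."""
    return next((name for name, net in ZONES.items() if ip_address(ip) in net), None)

def review(flows):
    """Log a warning for every observed flow that falls outside the guardrails."""
    for src, dst, port in flows:
        key = (zone_of(src), zone_of(dst), port)
        if key not in ALLOWED:
            logging.warning("guardrail deviation: %s -> %s:%d", src, dst, port)
        else:
            logging.info("within guardrails: %s -> %s:%d", src, dst, port)

# Observed flows, e.g. exported from flow logs (placeholder data).
review([
    ("10.0.1.10", "10.0.2.20", 443),   # web -> app over HTTPS: allowed
    ("10.0.1.10", "10.0.3.30", 5432),  # web -> db directly: deviation
])
```

The same pattern extends naturally to automated responses, for example auto-approving requests that fall inside the guardrails and escalating only the deviations.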

The combination of well-defined guardrails and automated reviews and responses can make granting access seamless, keep networks secure, and make false positives a thing of the past.

Many firewall management tools offer visibility into rule sets and traffic patterns. How can organizations leverage these tools to optimize their firewalls and improve performance?

Firewall rule histories are an important tool for network and security teams. They essentially create a timestamped log of any and all changes to an organization’s environment, which can be immensely valuable in the forensic stage of remediating a breach, i.e., finding out what went wrong to ensure it doesn’t happen again. Rule histories can help pinpoint the entry point, the tools used, and how much or how little reach an attacker had within the network environment.

But rule histories can do more than answer questions after a breach or an attack. By regularly auditing rule histories, teams can identify risky behaviors, such as changing security policies or rules just to enable a request, or failing to update or remove outdated rules.

Without rule change logs, temporary changes or policy bypasses could be forgotten, creating the potential for a misconfiguration or an exploit by an attacker. Auditing also uncovers weaknesses in network segmentation, control, or governance processes, giving teams what they need to understand and address a potential issue before it’s too late.
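To illustrate the kind of audit this enables, the Python sketch below scans a hypothetical rule-change log for two of the patterns mentioned above: temporary changes that were never rolled back and changes that broadened a rule’s scope. The log format and thresholds are assumptions, not any particular firewall’s export.

```python
from datetime import datetime, timedelta

# Hypothetical rule-change log entries.
CHANGES = [
    {"when": "2024-01-10", "rule": "vendor-ssh", "change": "added",
     "temporary": True,  "note": "temporary access for vendor"},
    {"when": "2024-03-02", "rule": "web-to-app", "change": "broadened",
     "temporary": False, "note": "source widened from /24 to /16"},
    {"when": "2024-03-05", "rule": "app-to-db",  "change": "added",
     "temporary": False, "note": "new service"},
]

NOW = datetime(2024, 6, 1)
MAX_TEMP_AGE = timedelta(days=30)  # temporary rules older than this get flagged

def audit(changes):
    """Return human-readable findings for risky patterns in the change log."""
    findings = []
    for c in changes:
        age = NOW - datetime.fromisoformat(c["when"])
        if c["temporary"] and age > MAX_TEMP_AGE:
            findings.append(f"{c['rule']}: temporary change still in place after {age.days} days")
        if c["change"] == "broadened":
            findings.append(f"{c['rule']}: scope was broadened ({c['note']}), confirm it was approved")
    return findings

for finding in audit(CHANGES):
    print(finding)
```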

In short, security teams can use firewall rule histories to identify both one-off issues and recurring patterns beyond what is immediately visible. This knowledge helps teams become proactive, both in preventing vulnerabilities and in improving the security, and therefore the performance, of their networks.
