The lonely castle: Insights into the evolution of cybersecurity defense

Eric Vanderburg

This technological era changes rapidly, and defense strategies change with it.  In the ancient world, combat strategies might have stayed the same for generations, but today the strategies of attackers and defenders evolve at an ever-increasing pace.

A decade ago, the prevailing model was the castle approach.  Information systems were walled up behind castle fortifications, which looked outward for danger and trusted everyone inside.  Firewalls and intrusion detection systems focused on what was coming into the network, assuming that systems would be secure if bad actors could just be kept outside the walls.  Phishing and malware proved this assumption wrong.

Some organizations reacted by trying to hold onto information even tighter.  More restrictions were placed on data and systems in an effort to preserve a dying paradigm.  Some lost creative and intelligent talent who did not want to work in a draconian, micromanaged environment, and over the next few years the castle model continued to erode.

The collapse of walls was also a collapse of cybersecurity illusions.

Workplace environments became more open and organizational boundaries crumbled just like the walls of an old castle.  Data moved with employees, across a myriad of devices, and through trusted and untrusted networks.  Defense strategies also evolved.  When key systems were decoupled from the perceived safety of the corporate network, secure methods of transmitting data between them had to be developed. Such methods also had to be easy for enterprises to adopt.

The cloud was a major driver for such efforts.  It was a tipping point that resulted in large-scale cybersecurity changes.  Companies learned that cloud vendors should not have access to back-end data, so they encrypted the data and distributed keys such that cloud providers could not access the data they hosted.  Robust APIs were created to integrate systems while providing only the minimum required service access.  Likewise, communications between system components such as databases and web services were also encrypted.  Shortcuts like advertising services and ports, allowing back-end components to communicate unrestricted, and giving IT the keys to the kingdom may have been overlooked before, but they are now universally recognized as bad practice.
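
The key-distribution idea above can be sketched in a few lines: data is encrypted on the client, only ciphertext is handed to the provider, and the key never leaves the client's control.  This is a toy stream cipher built from a hash for illustration only; a production system would use a vetted cipher such as AES-GCM.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream from the key and a block counter.
    # Illustration only -- real systems use vetted ciphers (e.g., AES-GCM).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR stream ciphers are symmetric

# The key stays with the customer; the cloud provider stores only ciphertext.
key = secrets.token_bytes(32)
stored_in_cloud = encrypt(key, b"customer record")
assert decrypt(key, stored_in_cloud) == b"customer record"
```

Because the provider never holds the key, it can host the ciphertext without being able to read it, which is exactly the separation the paragraph describes.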

Since then, cloud systems have been used to plug best-of-breed security technologies into information systems through trusted APIs.  Organizations leveraged monitoring and control, identity and access management (IAM), data loss prevention (DLP), and many other technologies per resource and classification type, resulting in targeted, risk-based decisions.  The associations between data, users, and acceptable use were independent of the application, yet tightly coupled to security requirements.
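
The per-classification pairing of controls described above can be sketched as a simple policy lookup.  The classification labels and control names here are hypothetical, not taken from any particular product:

```python
# Hypothetical policy table: which controls apply at each data classification.
POLICY = {
    "public":       {"dlp": False, "mfa": False, "encrypt": False},
    "internal":     {"dlp": True,  "mfa": False, "encrypt": True},
    "confidential": {"dlp": True,  "mfa": True,  "encrypt": True},
}

def controls_for(classification: str) -> dict:
    # Fail closed: unknown classifications get the strictest profile.
    return POLICY.get(classification, POLICY["confidential"])
```

Keeping the policy in a table like this is what decouples it from any one application while keeping it tightly bound to security requirements.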

The collapse of data barriers has refined our concept of trust.  Trust is now earned, not given.  Computers and users that want to authenticate should demonstrate that they meet security requirements before being granted access.  Likewise, data is not trusted until it is screened for the presence of malicious code.
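
A minimal sketch of "trust is earned, not given": access is granted only when credentials are valid *and* the device demonstrates a healthy security posture.  The posture attributes here are illustrative examples, not a standard checklist.

```python
from dataclasses import dataclass

@dataclass
class Device:
    patched: bool          # OS and applications up to date
    disk_encrypted: bool   # full-disk encryption enabled
    av_running: bool       # endpoint protection active

def grant_access(device: Device, credentials_valid: bool) -> bool:
    # Valid credentials alone are not enough; the device must also
    # prove it meets security requirements before access is granted.
    posture_ok = device.patched and device.disk_encrypted and device.av_running
    return credentials_valid and posture_ok
```

An unpatched laptop with a correct password is still refused, which is the behavioral difference between earned trust and the old castle model's implicit trust.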

These ideals have made their way back into the enterprise through cloud integration and increased awareness.  Cybersecurity-conscious companies continue to apply the same security skepticism to innovate better ways of securing data, turning challenges into opportunities.

This article was sponsored by TCDI, a cybersecurity, computer forensics, and eDiscovery company.