Implementing Data Protection by Design

By: Dr. Edward Amoroso
July 30, 2020

Optimizing prevention, detection, and response for the cybersecurity challenges of tomorrow

Data protection used to be relatively easy. Traditional firewalls would keep most external threats out of the company network, while antivirus software installed on endpoints would usually take care of those that did make it through. Additional layers of security, such as intrusion detection and prevention systems and web application firewalls, would secure any external connections. Enterprises would establish a perimeter around their information assets, much like medieval castles kept invaders out with walls, ditches, and moats.

In the age of cloud computing and virtualization, this traditional approach to perimeter security has become woefully insufficient. Digital assets no longer live only on in-house servers but across a multitude of systems and devices, often spread over a huge geographic area with varying physical security measures in place. Today, the enterprise perimeter is full of different endpoints, from in-house systems to cloud-hosted virtual computing resources to employee-owned mobile devices. Every single one of those endpoints, and the connections between them, is a potential entry point for attackers.

What is protection by design, and why does it matter?

First, it's worth reinforcing the importance of conventional perimeter security. It remains a core component of any robust infrastructure and serves as a crucial backup if something does go wrong. After all, there's no such thing as a perfect security infrastructure.

However, traditional measures like firewalls and antivirus alone are no longer enough. As the threat landscape continues to evolve, conventional controls have a hard time keeping up with new risks. At the same time, most organizations recognize the need to embrace new technology and innovate continuously to remain competitive. In the rush to modernize their infrastructures, though, they often end up leaving sensitive data exposed. Attackers, with a practically unlimited range of tools and processes at their disposal, are always on standby to exploit these opportunities.

When developing and implementing new digital systems and services, the traditional approach is to tack on security later; in other words, it ends up being an afterthought. Security by design and default aims to prevent attacks from an early stage. It's a proactive approach that keeps the attack surface small at a time when the physical perimeter can quite literally span the entire globe. It's about embedding security from the very start.

Rule 1: Retrofitting is less secure

An all-too-common tendency in software development is to tack on security later, typically by patching newly discovered vulnerabilities. The same applies to hardware, as was the case when the Spectre and Meltdown vulnerabilities were found to affect virtually every CPU manufactured before 2019.

There will always be cases where retrofitting is necessary, simply because nothing is going to be perfect from the start. But it's also a hit-and-miss process, which is why it's important to get things right from the outset as often as possible.

Rule 2: Simpler systems are more secure

The increasing complexity of computing networks over the past decade has proven extremely difficult to keep up with. Many companies don't know exactly where their data lives or which controls are in place to protect it. Hardware and software systems have diversified, and every vendor has its own features, solutions, and interfaces to work with.

Often, the best way to secure a system is to simplify it. After all, every node and communication path is a potential attack vector. Thus, the most secure line of code is the one you remove.
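
To make the idea concrete, here is a minimal sketch in Python, assuming the third-party psutil package is installed: it enumerates listening TCP sockets so you can see exactly which services make up a machine's local attack surface and decide which ones can simply be switched off or uninstalled.

```python
# Minimal sketch (assumes: pip install psutil). Lists listening TCP sockets
# so unneeded services stand out. On some platforms, seeing process names
# for all sockets may require elevated privileges.
import psutil

def listening_services():
    """Return sorted (port, process name) pairs for listening TCP sockets."""
    results = set()
    for conn in psutil.net_connections(kind="tcp"):
        if conn.status != psutil.CONN_LISTEN:
            continue
        name = "unknown"
        if conn.pid:
            try:
                name = psutil.Process(conn.pid).name()
            except psutil.Error:
                pass  # process exited or access denied
        results.add((conn.laddr.port, name))
    return sorted(results)

if __name__ == "__main__":
    for port, name in listening_services():
        print(f"port {port:<5} -> {name}")
```

Anything on that list that nobody can justify is a candidate for removal rather than for yet another protective layer.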

Rule 3: Defense in depth is more secure

No security layer is 100% secure. Antivirus software requires regular definition updates. Zero-day exploits target software before developers have a chance to patch it, often slipping past traditional security controls. Even the most effective and cutting-edge security layers, such as AI-powered threat detection, aren't perfect.

While adding more layers of security might conflict with the need to simplify systems, this is one exception that every cybersecurity leader must make. Successive security layers should also be sufficiently diverse to tackle an increasingly diverse range of attack types.
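
As a rough illustration, the Python sketch below chains several independent and deliberately different checks; the layer names, thresholds, and blocked ranges are illustrative assumptions rather than any particular product's behavior. The point is that a request only gets through if every layer agrees, so the failure of one control doesn't sink the whole defense.

```python
# Hypothetical defense-in-depth sketch: each layer uses a different technique,
# and any single layer can reject a request on its own.
from dataclasses import dataclass

@dataclass
class Request:
    source_ip: str
    token_valid: bool
    payload: str
    requests_last_minute: int

BLOCKED_NETWORKS = ("203.0.113.",)  # example range from RFC 5737
RATE_LIMIT = 100                    # illustrative threshold

def layer_network(req):  return not req.source_ip.startswith(BLOCKED_NETWORKS)
def layer_identity(req): return req.token_valid
def layer_rate(req):     return req.requests_last_minute <= RATE_LIMIT
def layer_input(req):    return "<script" not in req.payload.lower()

LAYERS = [layer_network, layer_identity, layer_rate, layer_input]

def allow(req: Request) -> bool:
    # Every layer must agree; one failure blocks the request.
    return all(layer(req) for layer in LAYERS)

if __name__ == "__main__":
    ok = Request("198.51.100.7", True, "hello", 3)
    bad = Request("203.0.113.9", True, "hello", 3)
    print(allow(ok), allow(bad))  # True False
```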

Rule 4: Protection must apply at the concept phase

When new technology is being conceptualized, security often doesn't factor into the conversation. For example, a software developer might draw up a storyboard outlining how end users interact with certain features, while business leaders are primarily concerned with economics and the impact on productivity that rolling out new technology might bring.

If security measures aren't included in the concept phase, they risk becoming an afterthought. The opportunities to build in security by design and default also diminish as the technology moves through its lifecycle.

Rule 5: Automation empowers better security

We’re all prone to making misjudgments. In the context of cybersecurity, where almost every attack involves a human element, these mistakes can be enormously costly. Even the most skilled and well-informed security teams cannot keep up with the speed of attacks if relying on manual means alone.

Automation works because it isn't subject to human error, and it accelerates repeatable tasks like threat hunting. It also empowers security teams to do what they do best by providing data-driven insights that fuel smarter decision-making.
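
As a small example of what accelerating a repeatable task can look like in practice, the sketch below counts failed SSH logins per source IP and flags the noisy ones. The log path, line format, and threshold are assumptions based on a typical Linux auth log, so adjust all three for your own environment.

```python
# Minimal sketch: automate one repeatable hunt (brute-force login attempts).
import re
from collections import Counter

FAILED_LOGIN = re.compile(r"Failed password for .* from (\d+\.\d+\.\d+\.\d+)")
THRESHOLD = 10  # alert once an IP crosses this many failures

def suspicious_ips(log_path="/var/log/auth.log", threshold=THRESHOLD):
    counts = Counter()
    with open(log_path, errors="ignore") as log:
        for line in log:
            match = FAILED_LOGIN.search(line)
            if match:
                counts[match.group(1)] += 1
    return {ip: n for ip, n in counts.items() if n >= threshold}

if __name__ == "__main__":
    for ip, failures in suspicious_ips().items():
        print(f"ALERT: {failures} failed logins from {ip}")
```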

Rule 6: Manual is less secure at scale

Manual intervention is sometimes necessary, which is exactly why security teams should use automation to free up time to focus on matters that demand a human touch. However, reliance on manual tasks greatly limits scalability. There’s a very real risk of a company’s computing infrastructure outgrowing its ability to defend it.

Security teams should approach automation as a way to augment their capabilities, especially when it comes to implementing measures at scale.
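
The sketch below shows the scaling side of that argument: the same control applied to hundreds of hosts concurrently instead of one at a time. The apply_policy function is a hypothetical placeholder for whatever the real work is in your environment.

```python
# Minimal sketch of rolling out one control across many hosts in parallel.
from concurrent.futures import ThreadPoolExecutor, as_completed

HOSTS = [f"host-{n:03d}.example.internal" for n in range(250)]

def apply_policy(host: str) -> str:
    # Hypothetical placeholder: in practice this would call your
    # configuration or orchestration tooling for the given host.
    return f"{host}: policy applied"

def rollout(hosts, workers=32):
    results = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(apply_policy, h) for h in hosts]
        for future in as_completed(futures):
            results.append(future.result())
    return results

if __name__ == "__main__":
    print(len(rollout(HOSTS)), "hosts processed")
```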

Rule 7: Virtualization enhances security

Bare-metal computing lacks flexibility. With operating systems and software directly tied to a specific hardware configuration, attackers have more opportunities to find a way in. If the system is exploited, it might also be lost for good: it takes time to provision a new one when you have to securely wipe away any traces of malware and reinstall everything from scratch.

Virtualization offers the flexibility security leaders need to enforce policies and apply defense measures without being limited by the capabilities of the underlying hardware or host operating system. Moreover, they can provision encrypted virtual machines faster and fully automate disaster recovery.
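
One way to make that concrete is to describe virtual machines declaratively, so that encryption and snapshots are the default rather than something bolted on later. The sketch below is hypothetical; the field names are illustrative and not tied to any particular hypervisor's or cloud provider's schema.

```python
# Hypothetical sketch: a declarative VM spec that an orchestration pipeline
# could consume, with secure defaults baked in.
import json
from dataclasses import dataclass, asdict

@dataclass
class VMSpec:
    name: str
    image: str = "hardened-base-2020.07"
    cpus: int = 2
    memory_gb: int = 8
    disk_encrypted: bool = True          # secure by default, not opt-in
    snapshot_schedule: str = "hourly"    # feeds automated disaster recovery

def render_spec(spec: VMSpec) -> str:
    """Serialize the spec for whatever provisioning pipeline consumes it."""
    return json.dumps(asdict(spec), indent=2)

if __name__ == "__main__":
    print(render_spec(VMSpec(name="web-frontend-01")))
```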

Cybrary helps organizations close the cybersecurity skills gap and build a workforce capable of tackling the challenges of today and tomorrow. Request your demo of Cybrary for Teams to get started.
