Why “Less is More” When It Comes to Network Security

 

Last week, Dark Reading published an analysis of 2017’s data breaches, and the results were quite bleak. 

If you follow the news at all, it shouldn’t surprise you that the number of breaches and leaks last year set a new record. What is surprising, however, is that over two-thirds of the exposed records were literally available on the Internet.

That’s not a joke. On the Internet.

To quote the study: “68.7% of exposed records came at the hands of unintentional Web-borne exposure, due to accidental leaking online and misconfigured services & portals.”

That is staggering.

Problems as simple as basic configuration errors are placing sensitive data within the reach of thieves. And not just thieves: search engines (yes, search engines) have been able to access leaked information through exposed data records as well.

With all the work we do as security professionals to build advanced cybersecurity systems, leaving data or backups unprotected is like building a house and forgetting to lock the doors. (At least with a house, the risk is limited to the value of its contents.)

Leaking customer or employee data can destroy a company’s reputation, trust, and relationships with customers and employees. Many businesses never recover. 

RedLock CSI reported that 53% of Amazon S3 (Simple Storage Service) customers have at least one cloud service exposed to the Internet. That’s more than half. The app developers I’ve worked with are generally quite competent at their craft. It’s hard to imagine so many people being incapable of configuring S3 correctly and safely.

If over half of a platform’s customer base makes the same kind of error, it seems likely there is a complexity problem. Hopefully Amazon will examine these issues and find ways to redesign S3 configuration to default to a safer state, reducing the room for human error. But perhaps what we need is not more complexity, but less, to make our organizations more secure.
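To give a sense of how small a “safer default” can be, here is a minimal sketch that turns on S3’s Block Public Access settings for a bucket using the standard boto3 client. The bucket name is hypothetical, and this is just one way to lock a bucket down, not a claim about how Amazon should redesign its defaults:

```python
# A minimal sketch, assuming boto3 is installed and AWS credentials are
# already configured. The bucket name "example-records" is hypothetical.
import boto3

s3 = boto3.client("s3")

# Enable all four Block Public Access settings on the bucket, so that a
# misconfigured ACL or bucket policy cannot accidentally expose its contents.
s3.put_public_access_block(
    Bucket="example-records",
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```

A handful of lines like these would have kept most of those “accidentally on the Internet” records off the Internet.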

While it may be difficult to predict which subnets or IP addresses a server may need to communicate with, it’s often easy to know how close, in network terms, a machine’s legitimate peers must be. That is what we do at HOPZERO: we leverage proximity to simplify network security and reduce configuration errors.
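One concrete way to express proximity is to cap how many router hops a connection’s packets may travel. The sketch below illustrates the idea with the standard Python socket API (it is not HOPZERO product code); the hop limit, address, and port are hypothetical:

```python
# A minimal sketch of the proximity idea: lower the socket's IP TTL so packets
# expire after a small number of hops. HOP_LIMIT, the address, and the port
# are hypothetical; a real limit would come from measuring how far legitimate
# peers actually sit. IP_TTL is available on most platforms (e.g. Linux).
import socket

HOP_LIMIT = 4  # packets expire after 4 router hops instead of the usual 64+

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TTL, HOP_LIMIT)

# Anything farther than HOP_LIMIT routers away is simply unreachable,
# because the packets are dropped in transit before they arrive.
sock.connect(("10.0.12.34", 5432))
```

A machine constrained this way can still talk freely to its neighbors, but its data cannot wander across the Internet, no matter which firewall rule someone fat-fingers.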

Here are two questions to consider:

  • Should a machine be accessible beyond the data center?
  • Can that accessibility be tightened further, to restrict communication to a specific zone, cluster, or rack?

The most critical information typically resides on database servers, and those servers should generally be accessible only to adjacent application servers.

Servers like these rarely need to accept connections from outside a zone or compartment within the data center. Limiting their default reach to the local zone, and carving out an exception for backups, eliminates the threat of remote access.
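That “local zone plus one backup exception” rule fits in a few lines. Here is a minimal sketch using only Python’s standard ipaddress module; the subnet and backup address are hypothetical, and a real deployment would enforce the check in a firewall or the network itself rather than in application code:

```python
# A minimal sketch of a "local zone plus backup exception" policy.
# The subnet and backup host below are hypothetical examples.
from ipaddress import ip_address, ip_network

LOCAL_ZONE = ip_network("10.20.0.0/24")   # application servers adjacent to the database
BACKUP_HOST = ip_address("10.80.5.10")    # the one permitted remote peer

def connection_allowed(peer_ip: str) -> bool:
    """Allow peers inside the local zone, plus the designated backup host."""
    peer = ip_address(peer_ip)
    return peer in LOCAL_ZONE or peer == BACKUP_HOST

# An application server in the same zone is allowed; an arbitrary
# Internet address is not.
assert connection_allowed("10.20.0.15")
assert not connection_allowed("203.0.113.7")
```

There is nothing exotic here, which is exactly the point: the policy is small enough to read, reason about, and audit.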

These rules are easy to understand and set, and a few simple configuration changes can reduce the possibility of human error and significantly improve security.

As you grow your network, how can you incorporate security by design and security by default? Perhaps, by doing less, you can protect more.

[VIDEO] Discover How HOPZERO Works

 
