YMCA Twin Cities Takes a Next-Gen Approach to Data Loss Protection

The Y connects with youth, adults, families and seniors of all backgrounds to explore and enjoy opportunities to learn, grow and thrive. Strengthening the community is our cause, so it’s important that we make it easy for our employees and volunteers to do their work in supporting our programs and services — and data security plays a vital role.

The importance of data security for us lies in our ability to keep our data safe while enabling our users to get their jobs done quickly and efficiently, without hindering what they’re trying to do. If our users aren’t able to access their data, it impedes their ability to accomplish the mission of the YMCA of the Greater Twin Cities. Specifically, data loss means time wasted in redoing work; it means time spent researching where that data went; it means determining whether that data movement created a new risk for the organization; and ultimately, it means not being able to serve our community so all can thrive.

People want to embrace technology and expect that it will allow them to get their jobs done more quickly. As a security director, it is my responsibility to layer in security in a way that enables employees to use technology the way they want to. That’s critical, because if we don’t, they’ll stop using organization-sponsored technology entirely. Providing this flexibility requires strong governance and faster detection and response to data loss incidents.

I don’t think traditional data loss prevention (DLP) works. Policy sets with traditional DLP are hard to tune, and it takes months or maybe even a year or two to get to the point where you can enforce policy rather than just monitor. I am not willing to accept the blind spots that come with imperfect policies. Instead, to enhance the security of the YMCA of the Greater Twin Cities, I prioritize faster detection and response.

When our existing DLP solution was due for an upgrade, we took a cloud-first approach to looking for a replacement. We also wanted to get away from the burden that traditional DLP places on user productivity when policies block the movement of data for legitimate workflows. Considering this, we found that it made sense fiscally, strategically and technologically to replace our legacy DLP solution with Code42 Next-Gen Data Loss Protection.

Code42 Next-Gen Data Loss Protection gives us the visibility we need across our endpoints and cloud applications — visibility that I haven’t had through other tools. We can create alerts to help us find data exfiltration attempts so we can quickly take action in the event of insider threats. It also helps us detect, respond to and recover from incidents in which a departing employee takes data.

“The simplicity of the Code42 deployment was amazing. It’s been invaluable for us to be able to deploy efficiently and in such a short time because it freed us to work on other projects.”

And we were able to replace more than 10 on-premises servers with a cloud deployment, bringing financial savings. Code42 Next-Gen Data Loss Protection accelerates our detection and response to data loss and leaks, at a fraction of the cost of alternatives, all without impeding users from accomplishing the YMCA of the Greater Twin Cities’ mission.

From advocacy to aquatics, child care to camps, mentoring to multicultural experiences, sports to safe spaces, water safety to wellness, the Y strengthens the community with life-changing programs and services. With Code42, we’ve been able to advance our data security program to support these efforts.

Tips From the Trenches: Multi-Tier Logging

Here’s a stat to make your head spin: Gartner says that a medium-sized enterprise creates 20,000 messages of operational data in activity logs every second. That adds up to 500 million messages — more than 150 GB of data — every day. In other words, as security professionals, we all have logs. A lot of logs. So, how do we know if our log collection strategy is effectively meeting our logging requirements? Unfortunately, a one-size-fits-all logging solution doesn’t exist, so many leading security teams have adopted a multi-tier logging approach. There are three steps to implementing a multi-tier logging strategy:

“A one-size-fits-all logging solution doesn’t exist, so many leading security teams have adopted a multi-tier logging approach.”

1. Analyze your logging requirements

A multi-tier logging strategy starts with analyzing your logging requirements. Here’s a simple checklist that I’ve used for this:

Who requires access to the organization’s logs?

  • Which teams require access?
  • Is there unnecessary duplication of logs?
  • Can we consolidate logs and logging budgets across departments?

What logging solutions do we currently have in place?

  • What is the current health of our logging systems?
  • Are we receiving all required logs?
  • Have we included all required log source types?
    • Do we need public cloud, private cloud, hybrid cloud and/or SaaS logs?
  • How many events per second (EPS) are we receiving?
  • How much log storage (in gigabytes) are we using now?
  • What are our logs of interest?
    • Create alerts and/or reports to monitor for each.

What time zone strategy will we use for logging?

  • How many locations are in different time zones across the organization?
  • Will we use a single-time-zone or a multiple-time-zone logging strategy?

How much storage capacity will be needed for logging for the next 3-5 years? (A rough sizing sketch follows this checklist.)

Do we have a log baseline in place?

  • Where are our logs stored now?
  • Where should they be stored in the future?

Are we collecting logs for troubleshooting, security analysis and/or compliance?

  • What are our compliance requirements?
    • Do we have log storage redundancy requirements?
    • What are our log retention requirements?
    • Do we have log retention requirements defined in official policy?
  • What logs do we really need to keep?
    • Identify those that are useful.
    • Drop those that are not.
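
To make the events-per-second and storage questions above concrete, here is a minimal sizing sketch in Python. Every input (event rate, average event size, growth rate, compression ratio) is an illustrative placeholder rather than a measured figure; swap in your own baseline numbers from the checklist.

```python
# Rough log-storage sizing sketch. All inputs below are example
# figures only; replace them with your measured baseline.

SECONDS_PER_DAY = 86_400

def daily_volume_gb(events_per_second: float, avg_event_bytes: float) -> float:
    """Estimate raw log volume per day, in gigabytes."""
    bytes_per_day = events_per_second * avg_event_bytes * SECONDS_PER_DAY
    return bytes_per_day / 1024 ** 3

def retention_capacity_gb(daily_gb: float, retention_days: int,
                          annual_growth: float = 0.15,
                          compression_ratio: float = 0.25) -> float:
    """Estimate stored capacity over a retention window, assuming
    compounded yearly growth and an average compression ratio."""
    total = 0.0
    for day in range(retention_days):
        total += daily_gb * (1 + annual_growth) ** (day / 365)
    return total * compression_ratio

if __name__ == "__main__":
    daily = daily_volume_gb(events_per_second=5_000, avg_event_bytes=400)
    print(f"~{daily:.0f} GB of raw logs per day")
    # Five-year horizon, matching the 3-5 year planning question above
    print(f"~{retention_capacity_gb(daily, 5 * 365):,.0f} GB stored over 5 years")
```

Even a back-of-the-envelope estimate like this makes it far easier to weigh hot-search tiers against long-term archival tiers in the next two steps.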

2. Digest log information

After all of this information is gathered, it’s time to digest it. It’s important to align your logging infrastructure to log type and retention needs — so that, for example, you don’t end up loading a large amount of unstructured data that has to be searched quickly into an SQL database. Most organizations have multiple clouds, many different devices that generate different log types, and different required analysis methods. In other words, one solution usually does not meet all logging needs.
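
One lightweight way to capture the outcome of this digestion step is a simple source-to-store map. The sources, stores and retention periods below are hypothetical examples, not recommendations; the point is to record, in one place, which tier each log type belongs in and why.

```python
from dataclasses import dataclass

@dataclass
class LogRoute:
    source: str          # where the logs originate
    store: str           # which repository/tier they land in
    retention_days: int  # how long they must be kept
    structured: bool     # structured events vs. free-form text

# Hypothetical mapping only; align each source with the store that
# matches its structure, search and retention needs.
ROUTES = [
    LogRoute("firewall",         "SIEM",           90,      structured=True),
    LogRoute("web application",  "ELK stack",      60,      structured=False),
    LogRoute("linux syslog",     "syslog archive", 7 * 365, structured=False),
    LogRoute("SaaS audit trail", "cloud storage",  3 * 365, structured=True),
]

for route in ROUTES:
    print(f"{route.source:<16} -> {route.store:<14} "
          f"({route.retention_days} days, structured={route.structured})")
```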

3. Implement multi-tier logging

If, after analyzing your logging requirements, you find that a single logging solution does not meet all of them, consider this tiered logging flow:

Code42 Tiered Logging Flow Example

In this example logging flow, there are three logging flow types and five log repositories. The flow types are SIEM logs, application logs and system logs. The repositories are the SIEM database, an ELK (Elasticsearch, Logstash and Kibana) stack, two long-term syslog archival servers and cloud storage. Each repository has a unique role:

  • The SIEM correlates logs with known threats.
  • The ELK stack retains approximately 30-60 days of logs for very fast searching capabilities.
  • The two syslog archival servers store the last three to seven years of syslog and application logs for historical and regulatory purposes. One syslog archival server is used for processing logs; the other is a limited-touch master log repository.
  • Cloud storage also stores the last three to seven years of logs for historical and regulatory purposes.
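
Below is a minimal Python sketch of how such a fan-out could be wired, assuming placeholder sink functions standing in for the SIEM forwarder, the ELK stack and the archival tiers. None of these function names come from a specific product API, and the wiring shown is just one possible interpretation of the example flow.

```python
import json
import time

# Placeholder sinks. In a real deployment these would be a SIEM
# forwarder, an Elasticsearch indexer, a syslog archival writer and
# a cloud-storage uploader.
def send_to_siem(event):
    print("SIEM           <-", event["msg"])

def index_in_elk(event):
    print("ELK stack      <-", event["msg"])

def append_to_syslog_archive(event):
    print("syslog archive <-", event["msg"])

def upload_to_cloud_storage(event):
    print("cloud storage  <-", event["msg"])

def route(event: dict) -> None:
    """Fan a single event out to every tier that needs a copy."""
    if event.get("flow") == "siem":
        send_to_siem(event)              # correlation against known threats
    index_in_elk(event)                  # ~30-60 days of fast searching
    append_to_syslog_archive(event)      # 3-7 year historical/regulatory copy
    upload_to_cloud_storage(event)       # redundant long-term copy

if __name__ == "__main__":
    route({"flow": "siem", "ts": time.time(),
           "msg": json.dumps({"action": "failed_login", "user": "jdoe"})})
```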

Simplify your log activity

This is just one quick example of an innovative solution to simplifying log activity. Regardless of whether multi-tier logging is the right solution for your organization, the most critical step is making sure you have a clearly defined logging strategy and an accurate baseline of your current logging state. This basic analysis gives you the understanding and insights you need to simplify log activity — making it easier to accomplish the complex logging goals of your organization.