Code42 Blog

Improved Risk Management Through Better Data Insights

Let’s face it: security professionals are overrun with data. Their logs are brimming with it. Their security tools are continually alerting them to potential anomalies, attacks, new vulnerabilities, changes in system configurations and all of the other things that could put enterprise data at risk. It’s safe to say that when it comes to data, security analysts and administrators are beyond overwhelmed. However, when it comes to business executives, the opposite is true: they often aren’t getting the information they need to assess what type of risk their organization’s data is under. 

The problem is, without the right data — data specific to their roles in the organization — neither security analysts nor business leaders can make effective risk management decisions regarding their corporate data. With version 7 of our Code42® Next-Gen Data Loss Protection solution, we’re tackling that challenge head-on. The goal is to get the right information, in the right amounts, at just the right time to those who need it, so they can make the best decisions for their roles.

“ The problem is, without the right data — data specific to their roles in the organization — neither security analysts nor business leaders can make effective risk management decisions regarding their corporate data. ”

What do I mean, exactly, when I say security professionals get too much data and business executives not enough? I’m talking about a signal-to-noise problem: security pros typically get flooded with so much data that they have a hard time finding the risks they need to focus on, while business executives get so little relevant security information that they can’t make effective data-driven decisions.

This can, of course, have profound deleterious effects on security. Bad decision making driven by poor access to the right information will negatively impact regulatory compliance and the protection of intellectual property, business plans and confidential customer data. As for security analysts, if they can’t see the data they need to take immediate steps to mitigate danger, then breaches will go unnoticed until it’s too late. It’s one of the reasons enterprise data breaches, more often than not, go undetected for months. To be specific, the latest research tells us it takes an average of 49.6 days to detect a breach, which is up year-over-year.

Code42 is taking steps to eliminate these barriers to effective security. At Evolution19, we are announcing a series of enhancements to the alerts, reports and dashboards within our Next-Gen DLP solution.

“ At Evolution19, we are announcing a series of enhancements to the alerts, reports and dashboards within our Next-Gen DLP solution. ”

These improvements will help business leaders get the precise information they need about data risks lurking within their organization. Of course, we will also be providing numerous enhancements needed by front-line analysts to do their jobs more effectively. 

These efforts tightly align with Code42’s belief that a security team’s success is directly tied to its ability to quickly detect and respond to data threats. As such, our goal is to demonstrate that security products can be both powerful and easy to use. That’s why we designed our Next-Gen Data Loss Protection solution with ease of use in mind. Customers don’t have to spend their time writing complex DLP rules and policies to reduce data risk as they do with traditional DLP — and now we are making it just as easy to get actionable information, whether one is a security analyst or a business leader.

What do I mean when talking about security analytics for business leaders? I’m talking about providing them with the insights they need to understand where the data-related risks hide within their organization. This includes where their data resides, where it may be inadvertently exposed, and how and where users are moving that data around the organization. We will also provide other high-level views of their data so they can make better decisions about managing their data, determining their risk level and investing in security defenses more effectively.

“ I’m talking about providing business leaders with the insights they need to understand where the data-related risks hide within their organization. ”

I’ll give you some examples. With these enhancements, business leaders will be able to see not only how many files are shared outside of the organization, but also what kinds of data are being shared. They will see how many file exfiltration events are occurring within their environment, along with the trends and patterns in data movement they should know about.

Let’s consider insider risks. Often when we think of insider risks, the first thing that comes to mind is the nefarious insider: the one stealing data to sell to competitors, or taking intellectual property to their next job. Employees acting maliciously aren’t the only cause for concern, though. Sometimes employees are simply careless, or make unintentional or uneducated mistakes. They may not follow the rules around data protection because the rules aren’t convenient, or they may not even be aware of what the rules are. In all cases, it’s crucial that the organization is aware of trends in data usage and movement so that corrective and mitigating actions can be taken.

Of course, we are prioritizing enhancements that also will help security admins get a better signal when it comes to data visibility. This includes improved alerting, so security analysts and managers are sure to see the security-related situations they need to investigate. While we have always provided security managers information about where their data resides within their environment and how that data travels, in the future we will provide them with alerts that bring potentially risky situations to their immediate attention (a simple sketch of this kind of alert logic follows the list). Situations like:

  • When a file has a shared link that allows public access to an internal file.
  • When a file is shared publicly and indexed on the internet.
  • When a user copies files to removable media.
  • When a user syncs a file to a cloud service.
  • When a browser or other application reads a file from a device.
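
To make this concrete, here is a minimal sketch of how alert logic like the above might evaluate a stream of file events. The event fields and messages are hypothetical illustrations, not Code42’s actual event schema:

```python
from dataclasses import dataclass

@dataclass
class FileEvent:
    """A simplified, hypothetical file-activity event."""
    file_name: str
    exposure: str     # e.g. "public_link", "public_indexed" or "none"
    destination: str  # e.g. "removable_media", "cloud_sync", "browser_read" or "local"

def alerts_for(event: FileEvent) -> list[str]:
    """Map one file event to zero or more alert messages, mirroring the list above."""
    alerts = []
    if event.exposure == "public_link":
        alerts.append(f"Shared link allows public access: {event.file_name}")
    if event.exposure == "public_indexed":
        alerts.append(f"File shared publicly and indexed on the internet: {event.file_name}")
    if event.destination == "removable_media":
        alerts.append(f"File copied to removable media: {event.file_name}")
    if event.destination == "cloud_sync":
        alerts.append(f"File synced to a cloud service: {event.file_name}")
    if event.destination == "browser_read":
        alerts.append(f"File read by a browser or application: {event.file_name}")
    return alerts

# A file synced to a personal cloud folder raises exactly one alert.
print(alerts_for(FileEvent("roadmap.pptx", "none", "cloud_sync")))
```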

That’s a lot of powerful information, and it will go a long way toward helping organizations reduce their data security risks.

This is an exciting time for us at Code42; we continue to evolve our Next-Gen Data Loss Protection solution. It’s so rewarding to see all of our efforts come to fruition and I can’t wait to see how our customers put these new capabilities to use.

Code42 Talks DLP with Dark Reading

After unveiling our Next-Gen Data Loss Protection solution at the RSA Conference 2019 in San Francisco, just about every visitor to the Code42 booth asked: How is data loss protection different from data loss prevention?

To answer this question, I sat down with Dark Reading’s Terry Sweeney for a video interview. You’ll find the highlights of our conversation in a short video below — and you can watch the full interview at Dark Reading.

The home security analogy

I like to start with a simple analogy everyone can identify with: Let’s say a would-be burglar comes to your door while you’re at work. In theory, you can rest assured that the person will not break into your house — because you have locks on your doors, right? But we all know locks aren’t failsafe, so what if this individual does find a way in? You won’t know about any of this until you get home — hours later — or until you realize something is missing, perhaps days later. By then, it’s much harder to figure out what all was taken, who took it and when it was taken. That’s the problem with the traditional data loss prevention model: it’s focused on prevention — but if that fails, you’re not left with much.

Now, imagine you have Nest cams inside and outside your house. Your front-door Nest cam notifies you immediately, via smartphone, to activity at your front door. With real-time visibility, if you don’t recognize the face of the visitor and/or are concerned with the actions he takes next (e.g., picking the lock, breaking a window, etc.), you can take action right now. Even if you discover something missing later in the day, you have video logs that will help you figure out when that article was taken and how. Just like the Nest cams, Code42 Next-Gen Data Loss Protection shows you exactly what’s happening, when it’s happening — so you can decide if it’s important and take action now.

Paradigm shift: all data matters

Another major difference in approach between legacy data loss prevention and Code42 Next-Gen Data Loss Protection: how the tools define the value of data. Traditional DLP tools require an organization to decide which data and files are valuable or sensitive — and then figure out how to configure the tool with rules and policies. But today’s knowledge workers are constantly creating data — and it all matters. From developing new software, to innovating manufacturing processes, to providing consulting services, more and more businesses across every sector are ultimately in the business of making new ideas. For these “progressive makers,” as we call them at Code42, every file and every piece of data holds value in the chain of idea creation. And the value of any given piece of data can skyrocket in an instant — when a project turns from theoretical tinkering into tangible innovation.

Finally, while traditional forms of protected data, such as PCI, PII and HIPAA-regulated health information, tend to follow predictable formats and patterns that can be recognized through rules, “idea data” is largely unstructured. The data relating to a software product launch, for example, might span source code files, Word documents containing marketing plans, Excel spreadsheets with revenue forecasts and production budgets, and CRM data on target prospects. There’s no way to create a blanket “rule” defining the structure or pattern of data relating to a valuable product launch.
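
To see why rule-based recognition works for regulated data but not for idea data, consider the difference in shape: a credit card number follows a fixed pattern a rule can match, while a marketing plan looks like any other prose. A minimal sketch (the regex is a simplified illustration, not a production PCI detector, which would also validate card prefixes and the Luhn checksum):

```python
import re

# Simplified sketch: 16 digits, optionally grouped in fours.
CARD_PATTERN = re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b")

structured = "Customer paid with card 4111 1111 1111 1111."
unstructured = "Q3 plan: launch the new sensor line in EMEA ahead of CES."

print(bool(CARD_PATTERN.search(structured)))    # True: a rule can find this
print(bool(CARD_PATTERN.search(unstructured)))  # False: nothing in the text's
                                                # structure marks it as valuable
```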

“ In this new reality of endpoints and cloud where all data matters, Code42 offers an unmatched core capability: We’ve gotten really good at collecting and saving every file, from every user, on every device. ”

In this new reality of endpoints and cloud where all data matters, Code42 offers an unmatched core capability: We’ve gotten really good at collecting and saving every file, from every user, on every device. More importantly, we’ve gotten really good at doing it in near-real time, doing it cost-effectively and doing it without inhibiting users as they’re working. This means organizations no longer have to define, at the outset, what data matters. And this complete data collection unlocks the kind of immediate, comprehensive visibility that creates the foundation of data loss protection — and sets it apart from data loss prevention.

Two critical questions DLP buyers need to ask

One of my favorite questions from Terry Sweeney was, “What should a DLP buyer look for as they’re evaluating a solution?” My answer is simple:

  1. How soon does the tool show you that something is going wrong?
  2. How soon does the tool let you take action?

The most consistent and concerning finding from annual infosecurity reports like Verizon’s Data Breach Investigations Report and the Ponemon Institute’s Cost of a Data Breach Study is that most organizations aren’t discovering incidents for weeks — or months. In fact, the Ponemon Institute’s 2018 research showed the average breach took 197 days for an organization to discover. That’s six months before the investigation even begins — and even longer until the organization can attempt to take some remedial action. That’s a lot of time for data to be lost, tracks to get covered and stolen IP to do damage to a business.

Code42 Next-Gen Data Loss Protection cuts that time-to-awareness from months to minutes. Take the common example of a departing employee: You’ll know if they’ve taken data before they even leave the building — not months later when a rival launches a competing product. Moreover, you’re getting immediate and full visibility around the context of the departing employee’s data removal — you can look at the exact file(s) and see if it’s valuable and/or sensitive — so you can make decisions and take action quickly and confidently.

Enabling infosec automation

My discussion with Terry ended with a look at perhaps the most important factor driving infosecurity forward: the expanding role of automation in helping organizations manage and protect ever-increasing volumes of data. Many organizations fight expanding data security threats with a small handful of infosecurity staff — half of whom are “on loan” from IT. Automation and orchestration platforms pull together and make sense of all the alerts, reports and other data from various infosecurity tools — fighting false positives and alert fatigue, and allowing teams to see more and do more with fewer human eyes.

But these platforms are only as good as the inputs they’re fed. They rely on comprehensive data feeds so you can create the customized reports and alerts you need to reliably bolster your security automation. The complete security insights gathered by Code42 Next-Gen Data Loss Protection ensure there are no blind spots in that strategy. That’s why we’re focused on making sure all our tools plug into automation and orchestration platforms, and support the workflow automation capabilities you already have in place. All Code42 tools are available through APIs. If you want us to integrate data and alerts to be automatically provisioned in your SIEM or orchestration tool, we can do that. If you want us to automatically raise an email alert to your ticketing system, we can do that, too. Furthermore, Code42’s Next-Gen DLP allows you to take a more proactive “data-hunting” approach to data security, much like you would with threat hunting to deal with external malware and attacks.
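
As a concrete illustration, here is a minimal sketch of what pushing DLP alerts into a SIEM might look like. The URLs, token and payload fields are hypothetical placeholders, not Code42’s actual API or any particular SIEM’s collector:

```python
import json
import urllib.request

# Hypothetical endpoints; a real integration would use the vendor's
# documented alert API and your SIEM's HTTP event collector.
DLP_ALERTS_URL = "https://dlp.example.com/api/v1/alerts?state=open"
SIEM_INGEST_URL = "https://siem.example.com/services/collector/event"
SIEM_TOKEN = "REPLACE_WITH_TOKEN"

def fetch_open_alerts() -> list[dict]:
    """Pull open alerts from the DLP tool's (hypothetical) REST API."""
    with urllib.request.urlopen(DLP_ALERTS_URL) as resp:
        return json.load(resp)["alerts"]

def forward_to_siem(alert: dict) -> None:
    """Post one alert to the SIEM so it joins the rest of the security signal."""
    req = urllib.request.Request(
        SIEM_INGEST_URL,
        data=json.dumps({"event": alert, "sourcetype": "dlp:alert"}).encode(),
        headers={
            "Authorization": f"Bearer {SIEM_TOKEN}",
            "Content-Type": "application/json",
        },
    )
    urllib.request.urlopen(req)

if __name__ == "__main__":
    for alert in fetch_open_alerts():
        forward_to_siem(alert)
```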

This is where the value of Code42 Next-Gen Data Loss Protection gets really exciting. Our tool gives you incredible off-the-shelf value; it does things no other tool can. We’re seeing organizations integrating our tool with advanced automation and orchestration platforms — using our tool in ways we hadn’t even considered — and really amplifying the value and driving up their return on investment.

Watch the video highlights of the Dark Reading interview, or watch the full interview at Dark Reading.

Finally, a DLP for Macs


It’s time to face the facts: Macs are everywhere in the enterprise. In fact, a 2018 survey from Jamf found that more than half of enterprise organizations (52%) offer their employees a choice of device. Not surprisingly, 72% of those employees choose Mac. The Apple wave within business environments has begun and only promises to grow over time.

“ Legacy Data Loss Prevention (DLP) solutions don’t account for the Mac phenomenon and were not designed with Macs in mind. ”

The problem is that legacy Data Loss Prevention (DLP) solutions don’t account for the Mac phenomenon and were not designed with Macs in mind. As a result, legacy DLPs often approach Macs as an afterthought rather than a core strategy. Customer opinions of their DLP for Macs continue to be unfavorable. In fact, last year at Jamf’s JNUC event in Minneapolis, Mac users quickly revealed their sheer frustration with DLP and how it wasn’t built for Macs. Code42 customers currently using legacy DLP vendors vented about their Mac DLP experience, saying, “It just sucks!”

Naturally, we asked why.

  1. No Support – Mac updates can be fast and furious. Unfortunately, DLP has traditionally struggled to keep up with those updates. The result? Errors, kernel panics and increased risk of data loss.
  2. No OS Consistency – We often forget that today’s businesses often use both Mac and Windows. DLP has traditionally maintained a very Windows-centric approach that has made the Mac experience secondary and inconsistent with Windows. Having two sets of users with varying levels of data risk is never good.
  3. It’s Slow – The number one issue often stems from performance-sucking agents that bring the productivity of Mac users to a screeching halt.
  4. Kernel Panics – This is worth reiterating. Macs are sensitive to software they perceive as a threat, so when DLP software is seen as unsanctioned, the result is kernel panics, reboots and an increased risk of downtime.
  5. It’s Complicated – Traditional DLP still relies on legacy hardware and manual updates, which is time consuming and expensive.

Recently, Code42 unveiled its Next-Gen Data Loss Protection Solution at the RSA Conference 2019. One of the reasons our 50,000+ customers love us is precisely because of the superior Mac experience we deliver. Our next-gen DLP solution was built with the Mac user in mind. Learn more about our trusted and proven take on DLP for Mac.


Product Spotlight: Identify Risk to Data Using Advanced Exfiltration Detection

When it comes to data loss protection, there are fundamental security questions that every organization needs to answer. These include, “Who has access to what files?” and “When and how are those files leaving my organization?”

Code42 Next-Gen Data Loss Protection helps you get answers to these questions in seconds by monitoring and investigating file activity across endpoints and cloud services. And now, Code42 has expanded its investigation capabilities to provide greater visibility into removable media, personal cloud and web browser usage by allowing security analysts to search file activity such as:

  • Files synced to personal cloud services. Code42 monitors files that exist in a folder used for syncing with cloud services, including iCloud, Box, Dropbox, Google Drive and Microsoft OneDrive.
  • Use of removable media. Code42 monitors file activity on external devices, such as an external drive or memory card.
  • Files read by browsers and apps. Code42 monitors files opened in an app that is commonly used for uploading files, such as a web browser, Slack, FTP client or curl.

Advanced Exfiltration Detection can be applied to proactively monitor risky user activity — such as the use of USBs across an organization — as well as to eliminate blind spots during security investigations. For example, imagine you’ve just learned that a confidential roadmap presentation was accidentally sent to the wrong email distribution list. Sure, it can later be deleted from the email server. But did anyone download it? Has anyone shared it? By using Code42 to perform a quick search of the file name, you can answer those questions in seconds. You’ll not only see which users have downloaded the attachment, but also that one has since saved the file to a personal Dropbox account. With this information in hand, you can quickly take action against this risky data exposure.
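
To picture what that investigation might look like programmatically, here is a minimal sketch of a file-name search against a file-event API. The endpoint and query shape are hypothetical stand-ins, not Code42’s actual search API:

```python
import json
import urllib.request

# Hypothetical endpoint; a real search would use the vendor's documented API.
SEARCH_URL = "https://dlp.example.com/api/v1/file-events/search"

def search_file_events(file_name: str) -> list[dict]:
    """Return recorded activity (downloads, syncs, copies) for a file name."""
    query = {"filter": {"fileName": file_name}, "pageSize": 100}
    req = urllib.request.Request(
        SEARCH_URL,
        data=json.dumps(query).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["events"]

# Who touched the misdirected roadmap deck, and where did copies end up?
for event in search_file_events("confidential_roadmap.pptx"):
    print(event["user"], event["eventType"], event["destination"])
```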

See Advanced Exfiltration Detection in action.



Product Spotlight: Using Delayed Client Updates to Test the Code42 App

One of the benefits of selecting a Code42 cloud deployment is that you don’t need to manage software upgrades. Code42 manages all infrastructure, and the Code42 app installed on endpoints is automatically updated when new versions are released. This process ensures your organization always has the latest security updates and newest functionality.

However, some customers have told us their change management process requires them to test new versions of the Code42 app with internal groups prior to distributing to the entire organization. Today we’re excited to announce new functionality that allows you to do just that.

With the new delayed client updates functionality, Code42 cloud deployment customers have up to thirty days to test new versions of the Code42 app before all endpoints are updated. In most cases, you will be notified one week prior to the release date so that you can prepare for the start of the testing period.

How to use delayed client updates

First, you must opt into this functionality by setting a global delay for all Code42 app updates. This delay can be set for up to thirty days and determines the date on which all endpoints receive a new version of the Code42 app after its release. Customers who do not set a global delay will continue to receive new versions of the Code42 app automatically on the release date.

Once you’ve selected your global delay, you can specify organizations as “exceptions” to the delay date. These will become your test organizations. For example, if you’ve set your global delay to the thirty-day maximum, you can arrange for the IT organization to receive the update on the general availability date, and for the marketing organization to receive the new app ten days after the release. This allows for sequenced testing with multiple test groups. If needed, you can also deploy to individual devices for targeted testing.
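
The scheduling logic behind this is easy to reason about: each organization receives the new version on the release date plus the global delay, unless it has an exception. A small sketch of that logic (the field names are illustrative, not the actual console settings):

```python
from datetime import date, timedelta

MAX_DELAY_DAYS = 30  # the maximum global delay described above

def update_date(release: date, global_delay_days: int,
                exceptions: dict[str, int], org: str) -> date:
    """Date on which an organization's endpoints receive the new app version.

    exceptions maps org name -> days after release (0 = general availability).
    Orgs without an exception wait out the full global delay.
    """
    delay = exceptions.get(org, global_delay_days)
    return release + timedelta(days=min(delay, MAX_DELAY_DAYS))

release = date(2019, 6, 3)               # hypothetical release date
exceptions = {"IT": 0, "Marketing": 10}  # the test orgs from the example above
for org in ("IT", "Marketing", "Sales"):
    print(org, update_date(release, 30, exceptions, org))
# IT updates on release day, Marketing ten days later, Sales after the full thirty.
```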

Once you’ve completed any desired testing, all Code42 apps will update automatically according to your global delay setting.

We hope this process allows you to follow your established change management process while still benefiting from the automatic updates that come with a cloud deployment. Happy testing!




Product Spotlight: Saved Searches

A Simple Way to Streamline Investigations

While every organization wants to protect its data, some files are more critical than others. You need to know where these “crown jewels” exist in your organization, and you don’t want to reinvent the wheel every time you need to find them. Fortunately, Code42 Next-Gen Data Loss Protection (DLP) can help you quickly and accurately locate these files — and save your search criteria so you can easily find them again in the future.

Code42 Next-Gen DLP protects your intellectual property from loss, leak, misuse and theft by showing you where critical files live and move. With Code42 Next-Gen DLP, you can quickly search for data using file hash, date range, file type, file path and more — to get a complete inventory of where important files reside on your endpoints and cloud services.

For example, suppose your organization has “secret sauce recipes” that are vital to your company’s success. These critical files should only be accessible to select employees — but how can you verify that is indeed the case? You can use Code42 Next-Gen DLP to see if your company’s secret sauce recipes are saved anywhere they shouldn’t be. Simply use Code42’s investigation capabilities to search for the SHA256 hashes of your most critical files.

Once you’ve built a search to identify the location of those special files, you can save the search criteria so you can quickly re-run the search in the future. These saved searches can be named and edited as needed. Saved searches pre-populate queries so that routine searches can be run more frequently.
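
For instance, the SHA256 hashes to search for are easy to compute, and a saved search is essentially named, reusable query criteria. A minimal sketch (the saved-search structure is illustrative, not Code42’s internal format):

```python
import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large files never need to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Build search criteria for the "secret sauce" files, then persist them so
# the same investigation can be re-run later without rebuilding the query.
recipes = [Path("recipes/original.docx"), Path("recipes/extra_spicy.docx")]
saved_search = {
    "name": "Secret sauce recipes",
    "criteria": {"sha256": [sha256_of(p) for p in recipes]},
}
Path("saved_searches.json").write_text(json.dumps(saved_search, indent=2))
```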

Keeping your crown jewels safe is at the heart of a good data loss protection strategy. And now, Code42 makes this even easier using saved searches.

Gene Kim on DevOps, Part 3: DevSecOps and Why It’s More Important Than Ever (Video)

We at Code42 were fortunate to have our good friend Gene Kim, author of The Phoenix Project and the leading expert on DevOps, stop by our office for a conversation about DevOps and its implications for security. One of the best parts of the visit for me was talking with Gene about our DevSecOps experiences here at Code42 and how we have brought security into the DevOps model.

Here at Code42, we are on a mission to secure our customers’ ideas. That’s why our DevOps journey included bringing security into the DevOps model. I’m proud to say that we’ve been profoundly successful at bringing security risk controls into our engineering process.

Security is often viewed — especially by engineering — as the department of “No.” Yet, in the DevOps model, you’re trying to embody self-service and autonomy, which can be difficult to square with accountability.

As our DevSecOps model has come together, our security team has been taking the time to establish the expectations for the engineering side of the house, and we’ve been able to implement those controls. One of the most gratifying outcomes for me is, instead of an after-the-fact security scan, we’re now proactively dealing with security as we design and build our software.

Now, engineering has the freedom to do what they need to do, because they’re able to talk more openly and collegially with the security team. A lot of the answers that were “No” before, when explained in the right context, become “Yes,” because the security team can enable the engineers to move forward.

During our interview, Gene echoed the advantages of bringing security to the DevOps table. “It’s been really gratifying to see organizations … call it not DevOps but DevSecOps,” said Gene. “Truly integrating all the information security objectives into everyone’s daily work.”

Hear what else Gene had to say about DevOps and its implications for security.

If you haven’t already, be sure to check out the previous two installments in our three-part blog and video series with Gene where he talks about what it takes to become a DevOps organization and the role of culture.

Gene Kim on DevOps, Part 1: How Do You Become a DevOps Organization?

Gene Kim on DevOps, Part 2: The Cultural Impact of Becoming a DevOps Org

Gene Kim on DevOps, Part 2: The Cultural Impact of Becoming a DevOps Org (Video)

Gene Kim, author of The Phoenix Project and one of the most vocal thought leaders for DevOps, spent a day at Code42 headquarters in Minneapolis. During his visit, Gene talked about the optimal cultural conditions that must be in place for companies that embark on a DevOps journey and the advantages of bringing security to the table. This is the second installment in our three-part blog and video series, capturing our conversations with Gene.

As we’ve embarked on our own DevOps journey at Code42, we’ve experienced firsthand that the transformation must be embraced from a cultural perspective in order to make it happen. The core principles of DevOps require systems thinking, coming together, gaining feedback and, at the same time, constant experimentation. For DevOps to work, it’s critical to have cultural norms that allow people to provide honest feedback without repercussions.

DevOps is not just for the engineering team. There’s something in DevOps that affects everybody, from the systems architects to the operations teams to the very way in which QA is administered. In fact, the current focus on privacy and security makes the cultural perspective of DevOps more important than ever, because it brings the security and engineering teams together in a very real way. That’s one of the things we at Code42 really appreciate about DevOps: the cultural norms start to propagate around the organization, so you find groups collaborating across the company.

During my conversation with Gene, he reinforced the importance of teamwork. He said, “Without a doubt, there has to be a sense of collegiality between information security and the engineering teams — that we are fellow team members working toward a common objective. It’s so counter-intuitive how much more effective this is than the traditional high-ceremony and adversarial nature between infosec and everyone else!”

Listen to part two of my interview with Gene to hear what else he had to say about cultural norms, the absence of fear and empowering security.

“ Without a doubt, there has to be a sense of collegiality between information security and the engineering teams — that we are fellow team members working toward a common objective. ”

Check out the first part of our blog and video series with Gene for insights on how to become a DevOps org and part three — why DevSecOps is more important than ever.





Gene Kim on DevOps, Part 1: How Do You Become a DevOps Organization? (Video)

Gene Kim, author of The Phoenix Project, stopped by our offices. Gene, who is regarded in the industry as one of — if not the — most vocal enthusiasts of DevOps, is a friend of Code42 and a personal mentor of mine. I was thrilled to sit down and interview him. As a result of our visit, we created a three-part blog and video series, where we explore his views on DevOps — particularly security’s growing role. Our first installment opens with his thoughts on what goes into becoming a DevOps company.

The books Gene has written and his perspective on DevOps have changed the way we at Code42 think about our process. After going through our own DevOps journey, we’ve been optimizing our teams to improve our speed of delivery, ensuring we get our customers the features they need faster.

We are not the only ones to find ourselves on this transformational path. Many of our customers are on DevOps journeys of their own — or thinking about starting one — so we wanted to share our experiences and Gene’s best practices on becoming a DevOps organization.

When I was talking to Gene, I asked him what it means to be a DevOps company, particularly in this day and age when security is such a top concern for businesses and consumers. We hope this video will help companies understand some of the implications and real advantages of adopting a DevOps model.

“ One of the biggest surprises coming off The Phoenix Project is just to see how much DevOps can dramatically improve the lives of not only developers, but also QA, operations and security. ”

During our conversation, Gene said, “One of the biggest surprises coming off The Phoenix Project is just to see how much DevOps can dramatically improve the lives of not only developers, but also QA, operations and security.”

Be sure to check out the next two installments in our three-part blog and video series with Gene, where he talks about the role of culture in becoming a DevOps org and why DevSecOps is more important than ever.

Gene Kim on DevOps, Part 2: The Cultural Impact of Becoming a DevOps Org

Gene Kim on DevOps, Part 3: DevSecOps and Why It’s More Important Than Ever (Video)

Managing User Authentication in the Cloud

How do you manage user identities and permissions in your organization?

If you’re like 95 percent of enterprise companies, you’re using Microsoft’s Active Directory Domain Services, otherwise known as Active Directory or simply AD. This is the system that allows your employees to use the same username and password to access any domain-bound internal system, and allows your administrators to manage user identities, rights and permissions at scale. Since its introduction in the late ‘90s, AD has become the most robust, dominant and ubiquitous directory service utility in the technology world.

Before the advent of the cloud, AD was all most companies needed for identity management and authentication. AD managed the services, tools and data stores employees needed on-premises. To access these services with their AD credentials, employees needed direct local network access via an on-site device or a virtual private network.

Today, cloud-based Software as a Service (SaaS) solutions are replacing on-premises solutions of all kinds, including tools for collaboration and data sharing, office productivity, creative production work and data security.

As companies transition their data security to the cloud, identity management and authentication architectures have to transition, too. It can be difficult to keep track of where AD data lives as it moves between clouds, data centers and endpoints. It can be hard to answer the “who, what, when, where and how” of data movement, so determining “why” can feel next to impossible.

As a long-time data security solutions provider, we’ve worked with hundreds of organizations as they make this journey. From those experiences, we’ve developed a set of recommendations to help you navigate this change to identity management and authentication systems while maintaining your data security and minimizing user hassle.

“ We’ve developed a set of recommendations to help you navigate this change to identity management and authentication systems while maintaining your data security and minimizing user hassle. ”

Identity management in the cloud

There are many benefits to using cloud-based SaaS services, including reduced costs for platform management and increased scalability. Of course, there are also challenges. One of the biggest problems to solve is integrating an existing on-premises AD identity management structure with these external tools. How can you leverage that existing structure so that users can access new SaaS tools with the same login credentials they’re accustomed to?

Single Sign-On

For security reasons, exposing your local AD server to the internet is not recommended. You could set up a lightweight AD server in a network DMZ that syncs with the internal AD domain controller and thus provides authentication for external requests. However, many cloud-based SaaS services don’t support querying AD, so this method could limit the services you can integrate into such a setup.

Enter single sign-on (SSO). Essentially, SSO is an authentication system that allows users to access multiple unrelated systems after just one login, because that initial login gives them an authentication token that’s trusted by those systems. For example, your company may use separate SaaS solutions in the cloud for human resources, training, CRM and project management. SSO allows users to log in to each of these systems through one central portal, because they all trust the SSO identity provider.

SSO solutions are widespread and compatible with the vast majority of cloud-based SaaS technologies because of the near-universal adoption of the Security Assertion Markup Language (SAML). SaaS technologies that use SAML 2.0 can seamlessly integrate with most SSO providers, as the majority “speak the language” of SAML 2.0.
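
Under the hood, an SP-initiated SAML 2.0 login typically starts with the service provider redirecting the user’s browser to the identity provider, carrying a deflated, base64-encoded AuthnRequest. Here is a minimal sketch of building that redirect URL; the entity IDs and endpoints are placeholders that would normally come from SP and IdP metadata:

```python
import base64
import urllib.parse
import uuid
import zlib
from datetime import datetime, timezone

# Placeholder values; real deployments exchange these via SAML metadata.
IDP_SSO_URL = "https://sso.example.com/saml2/idp"
SP_ENTITY_ID = "https://app.example.com/saml/metadata"
SP_ACS_URL = "https://app.example.com/saml/acs"

def build_redirect_url() -> str:
    """Build a SAML 2.0 HTTP-Redirect binding URL carrying an AuthnRequest."""
    issue_instant = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    authn_request = (
        f'<samlp:AuthnRequest '
        f'xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol" '
        f'xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion" '
        f'ID="_{uuid.uuid4().hex}" Version="2.0" '
        f'IssueInstant="{issue_instant}" '
        f'AssertionConsumerServiceURL="{SP_ACS_URL}">'
        f'<saml:Issuer>{SP_ENTITY_ID}</saml:Issuer>'
        f'</samlp:AuthnRequest>'
    )
    # HTTP-Redirect binding: raw DEFLATE (no zlib header), base64, URL-encode.
    deflated = zlib.compress(authn_request.encode())[2:-4]
    encoded = base64.b64encode(deflated).decode()
    return IDP_SSO_URL + "?" + urllib.parse.urlencode({"SAMLRequest": encoded})

print(build_redirect_url())
```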

SSO and AD: a bridge to cloud authentication

All of the major SSO identity platforms, such as Okta Identity Cloud, Google Identity Platform, Azure Active Directory and Ping Identity, have a variation on the concept of the “AD Connector” — a tool that synchronizes your AD user data with the SSO identity provider. With such a tool, your employees use their AD username and password to log into a cloud-based SaaS tool via your SSO provider. AD makes a secure connection to your SSO identity provider but is otherwise safely walled off from the outside world. All your SaaS applications are able to leverage authentication via SSO because of the ubiquity of the SAML 2.0 standard.

Provisioning users

By utilizing a SAML 2.0-compliant SSO identity provider, you can easily solve the “login question.” The next step is to address provisioning. How do you make SaaS tools aware of those users in the first place? How can you organize the users so the permissions and organizational structure you’ve carefully set up in AD is mirrored in your SaaS tools? Finally, how can you automatically deactivate users in a SaaS tool when you deactivate them in AD?

This is where the System for Cross-domain Identity Management (SCIM) comes in. SCIM is an open standard for communicating user attributes, such as group membership, org membership and permissions, between distinct services. For example, SCIM shares user attributes between AD and an SSO identity provider, or between an SSO provider and a SaaS tool.

SCIM 2.0 is a much newer standard than SAML 2.0 and isn’t quite as ubiquitous. Some SSO providers, such as Okta and Google, use SCIM integrations to make provisioning users a snap. However, Google does not have an interface for setting up provisioning rules in a custom app (for example, a SAML 2.0 SaaS tool that you configured yourself without an official Google app). Some SAML 2.0 identity providers, such as Microsoft’s Active Directory Federation Services, do not support SCIM 2.0 at all.
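
Concretely, SCIM 2.0 provisioning is a set of REST calls using a standardized JSON schema. A minimal sketch of creating a user follows; the base URL and token are placeholders, while the schema URN, the /Users endpoint and the application/scim+json content type come from the SCIM 2.0 standard:

```python
import json
import urllib.request

SCIM_BASE = "https://app.example.com/scim/v2"  # placeholder service-provider URL
TOKEN = "REPLACE_WITH_TOKEN"                   # placeholder bearer token

def provision_user(user_name: str, given: str, family: str) -> dict:
    """POST a SCIM 2.0 User resource to the service provider's /Users endpoint."""
    payload = {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": user_name,
        "name": {"givenName": given, "familyName": family},
        "active": True,
    }
    req = urllib.request.Request(
        f"{SCIM_BASE}/Users",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/scim+json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Deactivating a user when they leave AD mirrors this: PATCH the same
# resource with {"active": false}.
print(provision_user("jdoe@example.com", "Jane", "Doe"))
```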

To solve the “SCIM 2.0 isn’t always available” problem, some cloud-based SaaS applications have developed synchronization tools. For example, Code42’s User Directory Sync synchronizes AD user information via direct one-way communication from the customer’s AD server to the SaaS provider. In this example, Code42 still leverages SSO for user authentication, but user provisioning is made possible via a secure one-way sync.

Embrace the cloud era

The SSO market is fairly crowded, with behemoths like Microsoft and Google going head to head with startups like Okta that focus exclusively on SSO. The fact that these services all speak the same language and endeavor to solve the same problem — leveraging your existing identity management system for cloud authentication — means that tackling this problem has never been easier. The plethora of secure, robust SSO providers makes it easy to transition from your on-prem past to a future in the cloud. With this problem solved, you’ll have time to focus on the other complexities of digital transformation to the cloud, like gaining visibility into where all of your data is created, stored and shared.