
Securing Data in Cloud Chaos

To succeed, every enterprise depends on data and the insights that can be gleaned from that data. Enterprises today are creating much more data than in prior years—much of it critical to their digital transformation efforts. And how this data is stored within enterprises has changed dramatically, which is having a profound impact on how that data must be secured.

How so? At one time, most enterprise data resided within enterprise databases and applications, and these applications remained (relatively) safely on enterprise endpoints or tucked back in the data center. Not anymore.

“ Gartner estimates that 80 percent of all corporate data today is actually stored unstructured. ”

That was the age of structured data. Today, data is more likely to be stored unstructured and reside in the form of word-processing files, spreadsheets, presentations, PDFs and many other common formats. The research firm Gartner estimates that 80 percent of all corporate data today is actually stored unstructured.

This means today our enterprise data is scattered everywhere. And just because it’s not structured within an application doesn’t mean the data isn’t critical – unstructured data today includes financial information, trade secrets, marketing plans and work with contractors and business partners. Not all of this data is the same nor is it managed in the same way — yet this data must be protected.

How we share unstructured data is also changing. No longer is data sent merely as email attachments. Today, data is shared through social media programs, cloud apps and communication platforms, such as Slack. In many organizations, staff are sharing sensitive data, such as consumer information, intellectual property, prospect lists, financial data and the like. Security teams need to be alerted when sensitive information is shared.

These trends should give pause to anyone concerned about securing their enterprise information.

“ One of the most important steps for any organization that wants to start proactively securing their unstructured data is to determine where that data resides and then find viable ways to protect that data. ”

According to our 2018 Data Exposure Report, 73 percent of security and IT leaders believe there is data in their company that only exists on endpoints and 80 percent of CISOs agree that they can’t protect what they can’t see. Seventy-four percent believe IT and security teams should have full visibility over corporate data.

Unfortunately, without a dedicated and continuous focus on securing unstructured data, such visibility won’t ever exist. Only chaos. 

Yes, most organizations take reasonable steps to protect their applications and databases from costly data breaches. They invest in endpoint technologies that protect their users’ endpoints from malware. They focus on database security, application security and related efforts. And they try to control access to their local enterprise network. But the challenging reality remains: even if an organization executed perfectly on such a security architecture, it would still leave itself open to a vast amount of data theft and exploitation. The reason is that organizations are ignoring the roughly 80 percent of their data that is unstructured.

Legacy security methods haven’t kept pace

It’s critical that enterprises get the security of their unstructured data right. Securing unstructured data is different from securing data stored within applications and databases.

One of the most important, and likely first, steps for any organization that wants to start proactively securing their unstructured data is to determine where that data resides and then find viable ways to protect that data. Other capabilities they’ll need in place include monitoring who has access to that data, indexing file content across storage devices, cloud storage and cloud services, and monitoring that data for potential data loss, misuse and theft.
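To make the "determine where data resides" step concrete, here is a minimal sketch, assuming a mounted file share or endpoint directory, that walks the tree and builds a simple inventory of files (path, size, modification time and hash). The root path, output file and chosen metadata are illustrative assumptions; indexing cloud storage and cloud services would require each service's own APIs.

```python
# Minimal sketch of the "find out where unstructured data lives" step: walk a
# file share or endpoint directory and build a simple inventory. Paths and the
# choice of metadata are illustrative; a real program would cover cloud APIs too.
import csv
import hashlib
import os
from datetime import datetime, timezone

def inventory(root: str, out_csv: str) -> None:
    with open(out_csv, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["path", "size_bytes", "modified_utc", "sha256"])
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    stat = os.stat(path)
                    digest = hashlib.sha256()
                    with open(path, "rb") as f:
                        for chunk in iter(lambda: f.read(1 << 20), b""):
                            digest.update(chunk)
                except OSError:
                    continue  # skip files we can't read
                writer.writerow([
                    path,
                    stat.st_size,
                    datetime.fromtimestamp(stat.st_mtime, tz=timezone.utc).isoformat(),
                    digest.hexdigest(),
                ])

inventory("/mnt/shared", "file_inventory.csv")  # example root path (assumed)
```

An inventory like this is only a starting point, but it gives security teams something concrete to monitor for access, movement and loss.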

Having these capabilities in place will not only help organizations to better secure that data and identify careless handling of data or even malicious insiders, but also improve the ability to conduct in-depth investigations and identify threats, preserve data for regulatory compliance demands and litigation situations, and rapidly recover lost or ransomed files.

The fact is that unstructured data is 80 percent of enterprise data today, and the places it’s being stored are expanding. It’s imperative you give it the appropriate level of focus. While you can’t put unstructured data back in the centralized data center, you can bring a structured approach to data security that will rein in the chaos and adequately protect your enterprise information.

We Are All Surfing with the Phishes

Phishing is in the news again – and for good reason. Last month, the story first came to light of a “megabreach” dump of 773 million email addresses and passwords. At first, this disclosure made a sizable splash. But as researchers dug in further, it turned out the dump of online credentials had been circulating online for some time, as independent security journalist Brian Krebs covered on his blog, KrebsOnSecurity. Maybe the news wasn’t as big of a deal as we first thought?

The news turned out to be bigger, in some ways. More large tranches of credentials continued to be uncovered in the days that followed. These new collections bring the total to 2.2 billion records of personal data made public. Even if the vast majority of these records are old, and by all estimates they probably are, this massive collection of information substantially increases the risk of phishing attacks targeting these accounts now that they have been pushed above ground.

“ According to the State of the Phish Report, credential-based compromises increased 70 percent since 2017 and 280 percent since 2016. ”

Phishing remains one of the most common and, unfortunately, successful attacks that target users – and it’s not just user endpoints that are in the sights of the bad guys. Often, phishers aim first at users as a way to get closer to something else they are seeking, perhaps information on corporate executives, business partners, or anything else they deem valuable. When an employee clicks on a link or opens a maliciously crafted attachment, his or her endpoint can be compromised. That not only puts a user’s data at risk of compromise or destruction, such as through a ransomware attack, but attackers can also use that endpoint as a platform to dig deeper into other networks, accounts and cloud services.

Consider ProofPoint’s most recent annual State of the Phish Report, which found that 83 percent of global information security respondents experienced phishing attacks in 2018. That’s up considerably from 76 percent in 2017. The good news is that about 60 percent saw an increase in employee detection after awareness training. According to the report, credential-based compromises increased 70 percent since 2017 and 280 percent since 2016.

Unfortunately, the report also found that data loss from phishing attacks has tripled since 2018. Tripled.

“ Someone is going to click something bad, and antimalware defenses will miss it. ”

This latest uncovering of credentials is a good reminder as to why organizations always have to have their defenses primed against phishing attacks. These defenses should be layered, such as to include security awareness training, antispam filtering, and endpoint and gateway antimalware, along with comprehensive data protection, backup and recovery capabilities for when needed, such as following a malware infection or successful ransomware attack. 

However, even with all of those controls in place, the reality is that some phishing attacks are going to be successful. Someone is going to click something bad, and antimalware defenses will miss it. The organization needs to be able to investigate successful phishing attacks. This includes investigating and understanding the location of IP addresses, gaining insights into the reputation of internet domains and IP addresses, and establishing workflows to properly manage the case. These investigations can help your organization protect itself by blocking malicious mail and traffic from those addresses, notifying domain owners of bad activity, and even assisting law enforcement.
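To make that concrete, here is a minimal sketch, in Python and using only the standard library, of the kind of enrichment such an investigation might start with: pulling the sender domain and originating IPs out of a suspicious message, checking the domain against a local blocklist and doing reverse DNS on the IPs. The blocklist file name and the fields collected are assumptions for illustration, not a prescribed workflow or any particular product's API.

```python
# Minimal sketch of a phishing-investigation enrichment step (standard library
# only). The blocklist file and the exact fields collected are illustrative
# assumptions, not part of any specific product or service.
import email
import re
import socket
from email import policy
from pathlib import Path

# Assumed local list of known-bad sender domains, one per line.
BLOCKLIST = set(Path("known_bad_domains.txt").read_text().split())

def investigate(raw_message: bytes) -> dict:
    msg = email.message_from_bytes(raw_message, policy=policy.default)
    sender = msg.get("From", "")
    domain = sender.split("@")[-1].strip(">").lower()

    # Originating IPs usually show up in Received headers; grab anything IP-shaped.
    received = " ".join(msg.get_all("Received", []))
    ips = re.findall(r"\b(?:\d{1,3}\.){3}\d{1,3}\b", received)

    findings = {"sender_domain": domain, "originating_ips": ips, "flags": []}
    if domain in BLOCKLIST:
        findings["flags"].append("sender domain on local blocklist")
    for ip in ips:
        try:
            # Reverse DNS gives a quick hint about who operates the sending host.
            findings.setdefault("rdns", {})[ip] = socket.gethostbyaddr(ip)[0]
        except OSError:
            findings["flags"].append(f"no reverse DNS for {ip}")
    return findings
```

From findings like these, an analyst can decide whether to block the sending domain and addresses, notify the domain owner or escalate the case.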

When you find a file that is suspected of being malware, you can then search across the organization for that file. Chances are that, if it was a malicious file used in the phishing attack, it targeted many people in the organization. Nathan Hunstad details how, in his post Tips From the Trenches: Enhancing Phishing Response Investigations, our hunt file capability integrates with security orchestration, automation and response (SOAR) tools to rapidly identify suspicious files across the organization and swiftly reduce risk.
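As a rough illustration of the hunt-file idea, the sketch below hashes a suspect attachment and asks a file-event search service whether that hash has been observed on other endpoints. The search URL, bearer token and response shape are hypothetical placeholders for whatever search capability your SOAR or file-visibility tooling exposes; this is not the actual integration Nathan describes.

```python
# Minimal sketch of "hunt file": hash a suspect file, then search for that hash
# across the organization. The search endpoint and token are hypothetical.
import hashlib
import json
import urllib.request

def sha256_of(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def hunt(path: str, search_url: str, token: str) -> list:
    payload = json.dumps({"sha256": sha256_of(path)}).encode()
    req = urllib.request.Request(
        search_url,  # assumed internal file-event search service
        data=payload,
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Assumed to return a list of endpoints/users where the hash was seen.
        return json.load(resp)
```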

There’s another lesson to be learned here, one that is a good reminder for your organization and your staff: we are all on the dark web, and much of its information is about us. All of the information that has been hacked over the years, such as financial information, Social Security numbers, credit reports, background checks, medical information, employment files and, of course, emails and logon credentials, is likely to be found there.

That’s why, while much of the credential information that has surfaced from the depths of the web turned out to be old, there are still lessons worth remembering. For instance, it is critical to assume that all of the information out there increases your risk, and to understand how it can be used in phishing attacks.


It’s Time to Bring Shadow IT Into the Light

Mention shadow IT to most enterprise IT and security professionals, and you are likely to elicit a frown. It’s understandable. At its worst, shadow IT, such as an unsanctioned server or cloud storage service operated (shall we say, less than ideally) by business managers, can place systems and data at serious risk.

However, there’s another side to shadow IT. Shadow IT allows staff to choose their cloud apps and services, which helps improve productivity and drive innovation. Not to mention increase employee happiness. 

Still, shadow IT can and does pose significant risks to the organization, such as with the poorly managed server we mentioned. When users decide what cloud services they’re going to use themselves or how to collaborate with co-workers, IT loses visibility into these systems and data. Ultimately, what this means is enterprise data is scattered across multiple cloud services, and visibility into vitally important data is lost. Not good.

“ According to Gartner, shadow IT comprises roughly 40 percent of enterprise technology purchases. That is, business leaders decide, manage, and control nearly 40 percent of technology purchases. ”

After all, if IT doesn’t know a technology is in place, then it’s impossible to secure it or the data it holds. And it’s impossible to know who is accessing that data and why. 

Regardless, shadow IT is a permanent part of the enterprise landscape and IT and security teams need to adapt. According to Gartner, shadow IT comprises roughly 40 percent of enterprise technology purchases. That is, business leaders decide, manage, and control nearly 40 percent of technology purchases.

That much technology, and the data it holds, can’t be left to lurk in the shadows.

We know why business users are so quick to embrace shadow IT. It can often take weeks or months for IT departments to deploy new servers or applications. But with only a credit card, business users can access cloud applications and services within minutes. 

The question becomes, how do IT teams harness that innovation from their staff, while also ensuring their data is adequately secured and protected?

They need to bring it out of the shadows. 

The first step is to assess what shadow applications and cloud services are in place so that there is an accurate baseline of the cloud applications and services in use.

There are a number of ways to achieve this, and the best method depends on the nature and size of your organization. You could start with a simple survey of the business groups to collect information on the applications they are using. Or you could begin by monitoring traffic and endpoints to see what applications are in use and where data is traveling. 
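If you take the monitoring route, even a simple log-crunching script can produce a useful first baseline. The sketch below assumes a web proxy log exported as CSV with a host column and a small allow-list of sanctioned services; it tallies the unsanctioned domains employees are actually visiting so you know where to look first. Adapt the field names to whatever your proxy or DNS logs really contain.

```python
# Minimal sketch of building a shadow IT baseline from web proxy logs.
# The CSV format ("host" column) and the sanctioned list are assumptions.
import csv
from collections import Counter

SANCTIONED = {"office365.com", "box.com", "slack.com"}  # example allow-list

def baseline(log_path: str, top_n: int = 25) -> list:
    hits = Counter()
    with open(log_path, newline="") as fh:
        for row in csv.DictReader(fh):
            host = row.get("host", "").lower()
            # Collapse subdomains so drive.google.com and docs.google.com group together.
            domain = ".".join(host.split(".")[-2:]) if host else ""
            if domain and domain not in SANCTIONED:
                hits[domain] += 1
    return hits.most_common(top_n)  # candidate shadow IT services, by request volume

if __name__ == "__main__":
    for domain, count in baseline("proxy_log.csv"):
        print(f"{count:8d}  {domain}")
```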

However you establish your baseline, the important thing is to get started. 

“ Now that you’ve identified shadow IT, whether it be cloud apps, storage or platforms, the goal shouldn’t be to reprimand or shut down these services. It should be to ensure the services that the staff has chosen are correctly managed and secured. ”

Now that you’ve identified shadow IT, whether it be cloud apps, storage or platforms, the goal shouldn’t be to reprimand or shut down these services. It should be to ensure the services that the staff has chosen are correctly managed and secured so that IT and security teams have adequate data visibility. That is, they can see what data is flowing to these services and ensure access to that data is controlled, and that the data is protected and recoverable. 

This way, when that poorly managed server is uncovered, it can be an opportunity for an educational moment. Staff can be made aware (or reminded) of how vital patching and systems updates and properly monitoring systems and data are to the security of the organization. And rather than taking the server down, IT can then monitor and properly manage it. The same is true for all cloud services and applications. Rather than trying to ban them all, manage them. 

One way to manage them is to use a solution like Code42 Next-Gen Data Loss Protection. It was built to collect information about every version of every file, giving businesses full visibility to where data lives and moves — from endpoints to the cloud. With that kind of oversight, security teams can monitor, investigate, preserve and ultimately recover their valuable IP without having to block data use or rely on the restrictive policies that are part of traditional data loss prevention (DLP). Instead of security teams working with limited visibility to a subset of files (when they need to gauge the risk of all their data) or hindering employee productivity, next-gen DLP helps them foster open, collaborative work environments.  

When shadow IT is managed in this way, the organization derives some distinct advantages. IT and security teams become better business enablers and support the needs of staff and business users. They become a trusted advisor and facilitator that helps the organization go forward securely.


Don’t Let Your Security Be Blinded by Cloud Complexity

It’s incredible how complex today’s IT environments have become. Among the central promises of cloud computing were simplified management and security. However, almost paradoxically, it is the ease of cloud deployment and use that led to an explosion of adoption that has presented a significant challenge for security teams.

The challenge isn’t necessarily just the number of cloud services in use but how scattered an organization’s data becomes across these services. It doesn’t seem too long ago that nearly all enterprise data was stored on local drives or shared storage in a data center. No more. With the rise in popularity of cloud services, files are likely to be stored on user endpoints as well as across a number of cloud services, including Box, Google Drive and OneDrive, and collaboration platforms like Slack.

“ Unfortunately, the rise in IT management complexity will continue to make for rising security challenges. ”

To add to the complexity, the research firm Gartner estimates that more than 80 percent of enterprise data is unstructured data, and most of that data is expected to be stored in cloud systems.

And, while this may be surprising — because it feels like cloud adoption has been ongoing for some time now — the reality is that the move to the cloud is still in its early stages. According to the market research firm Stratistics MRC, the global cloud storage market is expected to grow from its $19 billion market size in 2015 to more than $113 billion by 2022. That’s an annual growth rate of roughly 29 percent.

All of this compromises the ability of security teams to peer into the movement and location of the organization’s sensitive data. Security teams simply cannot monitor organizational data for changes or see where it travels. Security investigations become harrowing and require complex workflows with multiple tools to attempt to analyze potential security events — and forget about knowing for certain whether specific data files are backed up and recoverable.

These are questions security teams need to be able to answer — not only for security and regulatory compliance demands but also to ensure data availability for the business.

Unfortunately, the rise in IT management complexity will continue to make for rising security challenges. And, let’s be honest, security technologies have not always made the jobs of security professionals easier.

Consider how difficult most security tools are to set up and manage. This is unfortunately the case when it comes to most prevailing security technologies: web application firewalls, intrusion detection and prevention systems, encryption and so on. The same is true for traditional enterprise DLP.

The more complex the environment, the more challenging security becomes, and the more seamlessly security must fit into enterprise workflows.

This is why we made Code42 Next-Gen DLP straightforward to connect to cloud services and easy to use. Rather than being blinded by complexity, security teams can see where files are moving to and quickly scrutinize if something needs to be investigated. It provides a comprehensive view of file activity across both endpoints and cloud services.

Code42 Next-Gen DLP is designed to simplify investigatory workflows, shorten incident response time and help to reduce security and compliance risks.

In order to effectively manage cloud complexity, security teams need to be able to simplify their workflows — and do so regardless of the cloud services employees choose to use. After all, our IT environments aren’t going to get any easier to manage any time soon. We are creating more files, which are being stored in more cloud services, than ever before — and security threats and regulatory demands aren’t going to go away either. Your best defense is to ensure you have the necessary visibility to manage and secure your user data no matter where that data is being used and stored.


‘Tis the Season the Greedy Go Phishing

It’s the time of year when we (hopefully) spend a little more time away from work and more time with friends and family to relax and celebrate. It’s to be expected that many of us are a bit more relaxed during the holiday season. Perhaps off guard. This is exactly where the bad guys want us. They’re counting on it. It’s why they are more active this time of year.

The holidays have always been a time for the greedy to strike. Years ago, their primary vectors of attack were telemarketing scams used to promote fake charities. Of course, criminals still run these types of scams, but they have also kept up with the technological trends of the times. Today you are just as likely, if not more likely, to be hit with a phishing email, instant message or scam on social media.

“ As staff use corporate devices for both work and shopping — and accessing data files as well as connecting to the network — there is an increased risk that clicking on the wrong file or link could expose your organization to malware, data theft, ransomware attacks and more. ”

But Rob, this is a corporate security blog — why are you writing about consumer security? Well, here’s the thing: the scam and phishing-related activity doesn’t just place consumers at risk. After all, your corporate employees are consumers — and think about how the separation between people as consumers and workers has been erased. The days of employees having personal devices and work devices are long gone. Many organizations are BYOD now, either by policy or the reality on the ground.

The reality is your employees are using work devices to click on emails, shop and research the holiday gifts they hope to share. As staff use these devices for both work and shopping — and accessing data files as well as connecting to the network — there is an increased risk that clicking on the wrong file or link could expose your organization to malware, data theft, ransomware attacks and more.

Here are just some of the techniques attackers use to trick employees:

  • Emails that look like they come from insiders of the organization or trusted partners
  • Bogus websites that promise deep discounts, but are really designed to siphon personal data and credit card numbers
  • Mass phishing scams that impersonate popular retail brands (that steal usernames and passwords that thieves will try to use elsewhere)
  • Spurious order or shipment update emails
  • Phony charities
  • Social media updates and tweets crafted to trick people to scam websites
  • Holiday ecards (isn’t anything sacred?)

The good news is that, because attackers are using the holidays as a moment of opportunity, you can do the same by taking constructive steps to build employee awareness about phishing and online scammers. To protect their safety and yours, now is a perfect time to help them understand that they are being targeted during the holiday season.

Here are some things to remind employees to do to protect themselves and your organization:

  • Avoid public Wi-Fi and always connect to a secure internet connection.
  • Always use best practices when it comes to password management.
  • Use unique passwords for each service and never reuse work passwords for home.
  • Use a separate email for online shopping.
  • Dedicate one credit card or prepaid card for online shopping, and don’t use debit cards (the rules for fraud protection are often different).
  • Be vigilant for phishing emails, social media posts and direct messages. Don’t ever click on unfamiliar links; when an offer seems too good to be true, it probably is.
  • Look closely at all email communications — watch for minor changes in email address name or domain, the validity of the domain the links refer to, typos in the text of the message and odd grammar (see the sketch after this list for one way security teams can flag lookalike domains automatically).
  • Back up devices and data; this is the best way to recover from such things as ransomware attacks.
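For security teams that want to automate the lookalike-domain tip above, here is a minimal sketch that compares a sender’s domain against a short list of trusted domains using simple string similarity. The trusted list and the threshold are assumptions for illustration; production mail filtering relies on far richer signals than this.

```python
# Minimal sketch of flagging lookalike sender domains (e.g. "c0de42.com" vs
# "code42.com"). The trusted-domain list and threshold are illustrative only.
from difflib import SequenceMatcher

TRUSTED = {"code42.com", "microsoft.com", "paypal.com"}  # example list

def looks_suspicious(sender_domain: str, threshold: float = 0.85) -> bool:
    domain = sender_domain.lower()
    if domain in TRUSTED:
        return False
    # A near-miss similarity to a trusted domain is a classic typosquat signal.
    return any(
        SequenceMatcher(None, domain, trusted).ratio() >= threshold
        for trusted in TRUSTED
    )

print(looks_suspicious("c0de42.com"))   # True: one character swapped
print(looks_suspicious("example.org"))  # False: not close to any trusted domain
```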

Of course, much of the same advice holds all year round, but it’s worth being extra diligent this time of year. The less time spent cleaning up malware and recovering from attacks, the more time we all have to enjoy the season.

Cybersecurity That Users Are Thankful For

When do you most value your applications or ability to access your data? That would be the very second after something goes awry and your access is lost. It’s true, and it’s like the cliché: you don’t know what you have until it’s gone.

In this way, computing is a lot like a utility service: we just expect to flip a switch and have the lights go on. We expect to dial a number and have the phone system work. Moreover, we don’t tend to think about how much we appreciate these technologies until the moment they don’t work as we expect. If you don’t believe me, talk to the people diligently working on your IT support team right now. Ask them how often staff call, when everything is working right, to thank them for keeping access to their business-technology systems available and smooth.

Then ask them how often the phone rings when something goes down.

Exactly.

Cybersecurity is very similar. No one thinks about the technologies protecting them until they fail, and there’s a breach or systems become inaccessible. How security professionals help others manage risk can also create challenges.

“ While some rules are necessary, security technology that is focused on prevention only can position security teams as blockers and deniers. ”

What I mean by this is that often, when staff hears from their security teams, it’s because something went wrong. The user did something wrong, or the security team is going to inform staff that they can’t continue doing things a certain way: Don’t access public Wi-Fi without a VPN. Stop using this password. Hurry up and patch and reboot all of these systems. No, you can’t use that cloud service; you have to use this cloud service instead.

While some rules are necessary, security technology that is focused on prevention only can position security teams as blockers and deniers. There are, however, other ways security teams can serve as business partners and architect solutions that not only secure data but also make it easier for users to get their work done. At Code42, we are always looking for ways to provide added value directly to the user.

Here’s an example. As part of the Code42 Next-Gen Data Loss Protection solution, we also provide users the ability to back up and secure their data. Data loss protection with that extra level of recoverability gives the user additional peace of mind. They know that if their notebook dies or someone clicks on a malicious link, they don’t have to panic. There’d be no reason to. They’ll see something went wrong, but they’ll know their data is backed up and safe and can be recovered.

Recently, I had the opportunity to watch this play out with a customer. They wanted to make a security purchase, but they were low on budget at the time. They thought they had to postpone their purchase. However, when the IT team found out that they would get data loss protection and the ability to consolidate their endpoint backup solution, they decided to move forward.

They ended up going forward with the investment because they realized that this was a win for the IT team, the security team and the end user.

My takeaway from this experience is also a good lesson for security professionals: don’t over-focus on prevention technology that is narrowly focused on denying and blocking. Look for solutions that enable end users and IT to be not only more secure but also more collaborative and productive. And that’s something everyone would be thankful for.


Security Must Enable People, Not Restrain Them

Do you ever think about why we secure things? Sure, we secure our software and data so that attackers can’t steal what’s valuable to us — but we also secure our environments so that we have the safety to do what we need to do in our lives without interference. For example, law enforcement tries to keep the streets safe so that civilians are free to travel and conduct their daily business relatively free of worry.

Now consider how everyday police work keeps streets safe. It starts with the assumption that most drivers aren’t criminals. Officers don’t stop and interrogate every pedestrian or driver about why they are out in public. That type of policing — with so much effort spent questioning law-abiding citizens — would not only miss spotting a lot of actual criminal behavior, it would certainly damage the culture of such a society.

There’s a lot we can learn about how to approach data security from that analogy. Much of cybersecurity today focuses on trying to control the end user in the name of protecting the end user. There are painful restrictions placed on how employees can use technology, what files they are able to access and how they can access them. Fundamentally, we’ve built environments that are very restrictive for staff and other users, and sometimes outright stifling to their work and creativity.

This is why we need to think about security in terms of enablement, and not just restraint.

“ Security should be about enabling people to get their work done with a reasonable amount of protection — not forcing them to act in ways preordained by security technologies. ”

Prevention by itself doesn’t work

What does that mean in practicality? Consider legacy data loss prevention (DLP) software as an example. With traditional DLP, organizations are forced to create policies to restrict how their staff and other users can use available technology and how they can share information and collaborate. When users step slightly “out of line,” they are interrogated or blocked. This happens often and is mostly unnecessary.

This prevention bias is, unfortunately, a situation largely created by the nature of traditional DLP products. These tools ship with little more than a scripting language for administrators to craft policies — lots and lots of policies, related to data access and how data is permitted to flow through the environment. And if organizations don’t have a crystal-clear understanding of how everyone in the organization uses applications and data (which they very rarely do), big problems arise. People are prevented from doing what they need to do to succeed at their jobs. Security should be about enabling people to get their work done with a reasonable amount of protection — not forcing them to act in ways preordained by security technologies.

This is especially not acceptable today, with so much data being stored, accessed and shared in cloud environments. Cloud services pose serious challenges for traditional DLP solutions because of their focus on prevention. Since so many legacy DLP products are not cloud native, they lose visibility into what is happening on cloud systems. Too often, the result is that people are blocked from accessing the cloud services they need. Once again, users are treated like potential criminals — and culture and productivity both suffer.

This is also a poor approach to security, in general. As security professionals who have been around a while know, end-user behavior should never be overridden by technology, because users will find ways to work around overbearing policies. It’s just the law of governing dynamics and it will rear its head when the needs of security technologies are placed above the needs of users.

Where’s the value for users?

There is one last area I’d like to go over where traditional DLP falls short when it comes to providing user enablement, and it’s an important one. Traditional DLP doesn’t provide any tangible value back to staff and others when they are working in an environment protected with legacy DLP. All they typically get are warning boxes and delays in getting their work done.

In sum, traditional DLP — and security technology in general — doesn’t just prevent bad things from happening, it also too often prevents users from doing what they need to do. They feel restrained like criminals for simply trying to do their jobs. In actuality, a very small percentage of users will ever turn malicious. So why should we make everyone else feel like they are doing something wrong? We shouldn’t.

Code42 Next-Gen DLP

At Code42 we believe it’s essential to assume the best intentions of staff and other users. That’s why Code42 Next-Gen Data Loss Protection focuses on identifying malicious activity, rather than assuming malicious intent from everyone. It’s why the product is built cloud-native: organizations aren’t blind when it comes to protecting popular cloud services, and users aren’t blocked from working the way they want to work. Nor does it require creating and forever managing policies that pigeonhole users into working certain ways.

Finally, we believe in providing value to the end user. It’s why we provide backup and restore capability in Code42 Next-Gen DLP. This fundamentally gives users the freedom to make mistakes and recover from them, and it gives them the knowledge that their data is also protected and safe.

Because it doesn’t block or interrogate users every step of the way, we believe Code42 Next-Gen DLP helps users to be more secure and productive, and enhances organization culture. It also provides the security team the opportunity to be an enabler for their end users, not an obstacle.

In this sense, Code42 Next-Gen DLP is a lot like good police work. It gives its users the freedom they need to move about the world without every motion being questioned for potential malicious intent. This is a very powerful shift in the workplace paradigm; users should be empowered to behave and collaborate as they want without fear or worry regarding the security technology in place.

Gene Kim on DevOps, Part 3: DevSecOps and Why It’s More Important Than Ever (Video)

We at Code42 were fortunate to have our good friend Gene Kim, author of The Phoenix Project and the leading expert on DevOps, stop by our office for a conversation about DevOps and its implications for security. One of the best parts of the visit for me was talking with Gene about our DevSecOps experiences here at Code42 and how we have brought security into the DevOps model.

Here at Code42, we are on a mission to secure our customers’ ideas. That’s why our DevOps journey included bringing security into the DevOps model. I’m proud to say that we’ve been profoundly successful bringing those security risk controls into our process and making it part of our engineering process.

Security is often viewed — especially by engineering — as the department of “No.” Yet, in the DevOps model, you’re trying to embody self-service and autonomy, which can be difficult to square with accountability.

As our DevSecOps model has come together, our security team has been taking the time to establish the expectations for the engineering side of the house, and we’ve been able to implement those controls. One of the most gratifying outcomes for me is, instead of an after-the-fact security scan, we’re now proactively dealing with security as we design and build our software.

Now, engineering has the freedom to do what they need to do, because they’re able to talk more openly and collegially with the security team. A lot of the answers that were “No” before, when explained in the right context, become “Yes,” because the security team can enable the engineers to move forward.

During our interview, Gene echoed the advantages of bringing security to the DevOps table. “It’s been really gratifying to see organizations … call it not DevOps but DevSecOps,” said Gene. “Truly integrating all the information security objectives into everyone’s daily work.”

Hear what else Gene had to say about DevOps and its implications for security.

If you haven’t already, be sure to check out the previous two installments in our three-part blog and video series with Gene where he talks about what it takes to become a DevOps organization and the role of culture.

Gene Kim on DevOps, Part 1: How Do You Become a DevOps Organization?

Gene Kim on DevOps, Part 2: The Cultural Impact of becoming a DevOps Org

Gene Kim on DevOps, Part 2: The Cultural Impact of becoming a DevOps Org (Video)

Gene Kim, author of The Phoenix Project and one of the most vocal thought leaders for DevOps, spent a day at Code42 headquarters in Minneapolis. During his visit, Gene talked about the optimal cultural conditions that must be in place for companies that embark on a DevOps journey and the advantages of bringing security to the table. This is the second installment in our three-part blog and video series, capturing our conversations with Gene.

As we’ve embarked on our own DevOps journey at Code42, we’ve experienced firsthand that the transformation must be embraced from a cultural perspective in order to make it happen. The core principles of DevOps require systems thinking, coming together, gaining feedback and, at the same time, constant experimentation. For DevOps to work, it’s critical to have cultural norms that allow people to provide honest feedback without repercussions.

DevOps is not just for the engineering team. There’s something in DevOps that affects everybody, from the systems architects to the operations teams to the very way in which QA is administered. In fact, the focus right now on privacy and security makes the cultural perspective of DevOps more important than ever, because it brings the security and engineering teams together in a very real way. That’s one of the things we at Code42 really appreciate about DevOps: the cultural norms start to propagate around the organization, so you find groups collaborating across the company.

During my conversation with Gene, he reinforced the importance of teamwork. He said, “Without a doubt, there has to be a sense of collegiality between information security and the engineering teams — that we are fellow team members working toward a common objective. It’s so counter-intuitive how much more effective this is than the traditional high-ceremony and adversarial nature between infosec and everyone else!”

Listen to part two of my interview with Gene to hear what else he had to say about cultural norms, the absence of fear and empowering security.

“ Without a doubt, there has to be a sense of collegiality between information security and the engineering teams — that we are fellow team members working toward a common objective. ”

Check out the first part of our blog and video series with Gene for insights on how to become a DevOps org and part three — why DevSecOps is more important than ever.





Gene Kim on DevOps, Part 1: How Do You Become a DevOps Organization? (Video)

Gene Kim, author of The Phoenix Project, stopped by our offices. Gene, who is regarded in the industry as one of — if not the — most vocal enthusiasts of DevOps, is a friend of Code42 and a personal mentor of mine. I was thrilled to sit down and interview him. As a result of our visit, we created a three-part blog and video series, where we explore his views on DevOps — particularly security’s growing role. Our first installment opens with his thoughts on what goes into becoming a DevOps company.

The books Gene has written and his perspective on DevOps have changed the way we at Code42 think about our process. After going through our own DevOps journey, we’ve been optimizing our teams to improve our speed of delivery, ensuring we get our customers the features they need faster.

We are not the only ones to find ourselves on this transformational path. Many of our customers are on DevOps journeys of their own — or thinking about starting one — so we wanted to share our experiences and Gene’s best practices on becoming a DevOps organization.

When I was talking to Gene, I asked him about what it means to be a DevOps company, particularly in this day and age when security is such a top concern for businesses and consumers. We hope this video will help companies understand some of the implications and real advantages of adopting a DevOps model.

“ One of the biggest surprises coming off The Phoenix Project is just to see how much DevOps can dramatically improve the lives of not only developers, but also QA, operations and security. ”

During our conversation, Gene said, “One of the biggest surprises coming off The Phoenix Project is just to see how much DevOps can dramatically improve the lives of not only developers, but also QA, operations and security.”

Be sure to check out the next two installments in our three-part blog and video series with Gene, where he talks about the role of culture in becoming a DevOps org and why DevSecOps is more important than ever.

Gene Kim on DevOps, Part 2: The Cultural Impact of becoming a DevOps Org

Gene Kim on DevOps, Part 3: DevSecOps and Why It’s More Important Than Ever (Video)