It’s been over a month since I attended the Gartner IT Symposium/Xpo in Orlando, and I’ve spent that time chewing on some of the great sessions and thought leadership presented at the show. Modern IT practices remain a moving target, so plugging into the analyst machine every once in a while helps me get a bigger picture beyond our day-to-day at Green House Data (which can be pretty diverse itself, with big pushes on DevOps and digital transformation while we balance our existing data center, cloud, and managed services pillars).
It was interesting hearing Gartner start to shift their message from “cloud is the only option” to “cloud is an option.” As cloud adoption strategies have matured, we have seen this attitude shift as well, with more organizations going multi-cloud while maintaining some on-prem systems. One presentation comparing public cloud costs to on-prem data centers really drove this home. The bottom line is that the cloud is not automatically cheaper, or even necessarily more efficient, depending on the application or purpose of the deployment.
Other major topics included how to find digital talent, as the management of human capital and IT teams continues to evolve alongside the industry. One of my favorite presentations, “Are You Maximizing Your Security Operations Center,” was packed with practical security guidance.
With the symposium still fresh in mind, here is my list of where enterprise IT operations are heading in 2020 and beyond.
If your organization is large enough to have an information security manager or an entire security team, then it’s likely that any security issue or task will be pushed in their direction. That’s why you hired them, isn’t it?
Security is a specialized area of IT and it requires specific skills for a holistic approach. It is also a moving target with many components and attack vectors across your technology stack. A dedicated security team or individual, whether in-house or contracted, can therefore be valuable. But security must be a shared responsibility among every user, no matter their role.
There’s an inherent problem here, and its name is Diffusion of Responsibility. When everyone has a stake in security and there are dedicated managers to boot, individual users may be more likely to engage in risky behavior. After all, it’s taken care of! That’s why we hired that security guy.
There are two main categories of application security testing: dynamic and static. They can be thought of as testing from the outside-in and from the inside-out, respectively.
Dynamic testing is performed as an application is running and focuses on simulating how an outside attacker might access that application and associated systems. Static testing, on the other hand, examines the code itself and related documentation, often throughout the actual development process, to try and discover potential vulnerabilities before the application reaches production.
Should you use DAST or SAST for your applications? In truth, it is not an either/or situation, as DAST and SAST are complementary and evolved independently. First, let’s take a look at the key differences between them.
Ransomware is a digital attack in which an executable or malicious link opened by an unsuspecting (and likely untrained) user installs a program that blocks access to applications, phone systems, and/or data until a ransom is paid. It’s been making the rounds for many years now. But only lately have hackers begun zeroing in on a specific vertical: state and local governments.
In 2019, over 22 governments were affected by ransomware – and that count predates the news that an additional 22 small towns in Texas were hit in a single coordinated attack.
Over 200 state, county, or city government IT systems have been targeted in recent years. With thousands and thousands of cities and towns across America, that may seem like a drop in the bucket. But ransomware is becoming easier and easier to distribute, and users continue to fall victim, usually via phishing emails or web exploits that deliver malware without any user action beyond visiting an apparently innocuous site.
Why are governments becoming a preferred target for ransomware? And how can you improve your chances of avoiding or mitigating ransomware?
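One baseline mitigation worth stating up front: keep versioned backups on storage that ransomware cannot reach from an infected machine. A minimal sketch, with illustrative paths (a real setup would then copy each archive to offline or immutable storage):

```shell
#!/bin/sh
# Date-stamped backup archive. Paths are illustrative assumptions.
SRC="./records"    # data to protect (hypothetical directory)
DEST="./backups"   # staging area; copy archives offline afterwards

mkdir -p "$SRC" "$DEST"
printf 'example record\n' > "$SRC/data.txt"   # stand-in for real data

STAMP=$(date +%Y%m%d-%H%M%S)
tar -czf "$DEST/backup-$STAMP.tar.gz" "$SRC"

ls -1 "$DEST"   # each run leaves a separate, date-stamped archive
```

A backup the malware can see is a backup it can encrypt, so the critical step is the one the script only comments on: moving the archive somewhere a compromised host cannot write.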
In InfoSec we continually encounter the unknown, the unfamiliar. Technology marches ever forward, application design matures, bells and whistles chime and toot. This commonly results in the InfoSec professional needing to responsibly secure technology that they don’t holistically understand. Attackers know this, for it is within those gaps in understanding that malicious activity may most readily occur and may do so without notice.
A common InfoSec response to the unfamiliar is to attempt to cover all potential angles of attack, regardless of whether they are pertinent to the technology. This is done in order to ensure that we meet both risk management and governance goals. The result of this approach is rarely better security. Rather, it typically results in unnecessarily complicated security control implementations that are neither functional (e.g., they don’t do what we want/expect them to do) nor operational (e.g., our personnel can’t adequately manage them).
How do we avoid over-complication in our security controls? We focus on the fundamentals: Preparation, Awareness, Response.
Migrating e-mail and productivity apps to the cloud is a no-brainer. Continuous updates, access from anywhere, no need to manage the supporting servers and associated hardware…the benefits are clear. As with any IT outsourcing, however, careful planning around security measures is essential. And with your O365 environment exposed to the public internet, security best practices are even more important.
While securing Office 365 is an ongoing effort, there are several top priorities that should be addressed first after your migration.
Digital transformation may be a bit of a catch-all for adopting modern IT principles and technologies, from cloud platforms and services to mobility and big data to DevOps practices, but it is a real movement throughout the business realm.
The gist is not only to introduce new tech, but also to take a close look at the business processes and organizational units behind it, to ensure that innovation can occur and the bottom line improves. In other words, technology for the sake of technology won’t solve any business problems. You must transform your entire organization with a combination of technology and process.
True digital transformation involves your entire organization and results in the integration of various systems and operations across the business. If that sounds like a major undertaking, it is.
It also comes with a slew of information security concerns that should not be overlooked in the rush to the cloud.
If you’ve newly set foot on the path of an InfoSec student, you will benefit from understanding this topic. If you’ve been around awhile, you’ve lived it.
There are two basic types of Information Security engagements in terms of how they are scoped. This is most applicable to managed services providers (MSPs), though it remains relevant to a practitioner supporting an internal corporate or public sector security team. For the sake of simplicity, I’m going to call them FFP and T&M. The purpose of this blog isn’t to dig deep into financial models, but rather to discuss, in a simplified manner, how they drive the delivery of work. And then, to discuss an alternative model.
With both Firm Fixed Price and Time & Materials engagements – and really any other model of InfoSec contract scope – there are some overlapping goals and realities.
Microsoft recently announced a service called Azure Bastion that gives customers a more secure way to connect to and access virtual machines (VMs). It uses the Remote Desktop Protocol (RDP) and Secure Shell (SSH) protocols alongside Secure Sockets Layer (SSL) encryption.
Bastion connects VMs, your local computers, and cloud resources without exposing them to public network connections. As a Platform as a Service, it simplifies the process of setting up and administering bastion hosts or jumpboxes in your cloud environment.
But what are bastion hosts or jumpboxes? And why would you use them, or a service like Azure Bastion?
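For context, the do-it-yourself ancestor of a service like this is an SSH jump host: one hardened machine with a public address that forwards connections to VMs on the private network. A minimal OpenSSH client configuration sketch, with hypothetical host names and addresses:

```
# ~/.ssh/config -- host names and addresses are illustrative

# The only host exposed publicly.
Host jumpbox
    HostName bastion.example.com
    User admin

# Private address, unreachable directly; tunnel through the jump host.
Host internal-vm
    HostName 10.0.1.20
    User admin
    ProxyJump jumpbox

# Equivalent one-liner, no config file needed:
#   ssh -J admin@bastion.example.com admin@10.0.1.20
```

Azure Bastion removes the need to run and patch that jump host yourself, but the traffic pattern is the same: one controlled entry point, private addresses everywhere else.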
You would be woefully uninformed and unprepared as an IT admin if you didn’t know that two major Microsoft products, the 2008 versions of SQL Server and Windows Server, are each about to reach their end of support. That means it’s time to upgrade or migrate lest you fall victim to inevitable security vulnerabilities.
One big question when facing a major software upgrade like this is whether to stay put, so to speak, and update to the latest version in your current deployment scenario, whether on-premises or in a hosted environment, or to move to a cloud-based server, namely Azure, which offers tight integration and lower costs for Microsoft products such as these.
SQL Server 2008 end of support is imminent, arriving on July 9, 2019. Windows Server 2008 has a few months to go, with support ending on January 14, 2020.