We've posted quite a bit about best user practices to maintain the integrity of your IT infrastructure, especially strong password hygiene, the use of antivirus/antimalware, and the importance of backups in case something goes awry. With user negligence causing up to 68% of breaches, according to a Ponemon Institute study, these practices are essential. But how can you make sure your employees adhere to them?
A recent article covering the Clinton presidential campaign staff's methods for encouraging information security reveals one secret to IT security: being kind of annoying.
Placing data in the cloud comes with a set of concerns — accessibility (will my information always be available if the cloud has technical problems?) and security (how safe is my data when I can’t control the security measures?) chief among them. Of these, security has long been the primary concern for technology decision makers considering the cloud.
Recent surveys reveal that while security remains top of mind, the location of data is rising in prominence as a barrier or concern for cloud adoption. These concerns stem in part from the difficulty of visibility into data transit and storage. Customers might want to know where exactly their data is residing so they can retrieve it quickly — and also for legal implications.
Two recent court cases involving Google, Microsoft, and the federal government highlight the legal entanglements that can come with storing information in the cloud. Read on to learn why the location of your cloud data is vital.
You’re probably familiar with the kind of performance issues inherent in antivirus/antimalware tools. Anyone who has used a PC while a scheduled antivirus scan kicks in can attest to sluggish performance. The same issues rear their head when using antivirus in a virtual environment – but virtual machines come with their own set of wrinkles.
Antivirus software can be installed either on the VM itself or on the host. Depending on your approach, you’ll want to consider these key factors to maximize performance.
Many cloud discussions center around data security. When infrastructure is out of corporate control, it’s natural to be concerned about the precautions taken to protect vital information assets. Ultimately, cloud security is not any weaker than on-premise data centers, but it turns out that corporate IT departments aren’t really concerned about losing data, anyway.
They’re worried about what everyone else will think if they lose that data.
Even though only 25% of companies are equipped to handle data breaches, corporations still cite damage to reputation as the biggest risk of being hacked. A recent study from the International Association of Privacy Professionals found that 83% of public companies in the United States cite the impact to corporate reputation as the number one risk of a data breach.
Passwords – we love to hate them. Despite scribbled pages of notes and password keepers, we always forget them at the most inconvenient time. (By the way, written notes are a very insecure way to remember your password.) Thanks to the password change rules the IT department requires, they expire before we remember to reset them. These days it feels like they have to be one hundred letters long, including hieroglyphics, roman numerals, and emojis.
And despite all that, they still aren’t very secure. Every few months we hear about another massive breach. One of the biggest, and most recent, was Yahoo. The company only just reported a 2014 breach that compromised 500 million users’ names, e-mail addresses, and other personally identifying information. If the password information could be decrypted and used along with this other PII, user accounts across other services – even bank logins – could be accessed. According to the 2016 Verizon Data Breach Investigations Report, compromised passwords were used as a means of access for many attacks as well.
Is it time to ditch passwords altogether? What might replace them? The technology, it turns out, is just around the corner.
Private vs. public cloud is a battle many thought was over years ago, and some recent think pieces seem to confirm that notion, claiming no one can match the economies of scale delivered by hyperscale cloud providers.
But private cloud, or on-premise virtualization, can still be a less expensive option — if you have the staff and capabilities to support it. A recent study from 451 Research describes where the tipping point falls in favor of private cloud and where public cloud has a lower total cost of ownership (TCO), based on how well you utilize your hardware and how efficient your staff is.
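The tipping point comes down to arithmetic: idle hardware and staff time get spread over however many VMs you actually run. The sketch below illustrates that logic with entirely hypothetical dollar figures and ratios — these are not numbers from the 451 Research study.

```python
# Illustrative private vs. public cloud TCO comparison.
# All figures below are hypothetical placeholders, not data from 451 Research.

def private_cloud_cost_per_vm_hour(
    hardware_cost_per_hour: float,  # amortized capex + power/space per host-hour
    vms_per_host: int,              # consolidation ratio at full capacity
    utilization: float,             # fraction of capacity actually used (0-1)
    admin_hourly_cost: float,       # fully loaded staff cost per hour
    vms_per_admin: int,             # staff efficiency: VMs one admin manages
) -> float:
    """TCO per *used* VM-hour: idle capacity and labor are spread over used VMs."""
    used_vms = vms_per_host * utilization
    infrastructure = hardware_cost_per_hour / used_vms
    labor = admin_hourly_cost / vms_per_admin
    return infrastructure + labor

public_rate = 0.10  # hypothetical public cloud list price, $/VM-hour

# High utilization + efficient staff vs. low utilization + inefficient staff.
efficient = private_cloud_cost_per_vm_hour(0.90, 30, 0.80, 50.0, 1500)
inefficient = private_cloud_cost_per_vm_hour(0.90, 30, 0.30, 50.0, 100)

print(f"efficient shop:   ${efficient:.3f}/VM-hour")
print(f"inefficient shop: ${inefficient:.3f}/VM-hour")
print(f"public cloud:     ${public_rate:.3f}/VM-hour")
```

With these made-up inputs, the well-run private cloud lands under the public rate while the underutilized one costs several times more — which is the shape of the tipping point the study describes.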
Your business probably has faster internet than your home. If you’re with an enterprise, you almost certainly have some quality broadband. Plugging into the cloud can be a relatively painless process, albeit one that requires careful planning. Without considering your network design and connection speeds, however, even a simple cloud migration can become time-consuming, expensive, and difficult to manage.
If you work in IT, the idea of a data breach is probably a lot spookier than some ghost invading the data center. October is National Cyber Security Awareness Month in the United States, and organizations like the FBI, the National Cyber Security Alliance, Sophos, and others are promoting secure digital practices for home users and businesses. It’s the perfect time to reevaluate your approach to cybersecurity and make sure you’re cultivating a culture of security awareness.
With ransomware continuing to spread at an ever more rapid clip and the cost of IT system downtime hitting over $1 million for the average enterprise, you can’t afford to lose productivity to viruses, malware, or stolen intellectual property. Here are some quick tips to help foster secure digital practices in your workplace.
How secure is your data center? In order to guarantee security, maintain uptime, stay HIPAA compliant, and pass SSAE 16 Type II audits, Green House Data has over sixty auditable security, environmental, and compliance control measures. Each compliant data center is audited once per year.
Some of these control points are standard practice, while others had to be added to daily routines in some facilities in order to gain compliance and bring them up to our strict standards. This list can help you get your data center up to speed – or see just how much effort goes into keeping server rooms monitored, secured, and fully auditable.
See all 61 points we check for security and auditability after the jump.
According to a recent study by Emerson, cybercrime is the fastest growing cause of data center outages. To stay ahead of increasingly sophisticated attacks, infrastructure managers must combine software and hardware tools to constantly monitor, recognize, block, and remediate. Keeping an eye on network traffic is essential to accomplish this, and one developing method of network security control uses microsegmentation to do so.
Network microsegmentation is enabled by software-defined data center technology like VMware NSX. It gives network administrators new abilities to shape network traffic based on global policy, increasing security by applying policies to specific network segments or even individual virtual machines.
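The core idea — default-deny rules scoped to workloads rather than network location — can be sketched in a few lines. This is a toy model with made-up tags and ports, not the actual policy language of VMware NSX or any other product, which express the same concept through their own policy engines.

```python
# Toy model of microsegmented, default-deny traffic policy.
# Rules are keyed to workload tags, not subnets -- hypothetical sketch only.

from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    src_tag: str  # tag on the sending VM (e.g. "web")
    dst_tag: str  # tag on the receiving VM (e.g. "db")
    port: int     # allowed destination port

# Default deny: only traffic matching an explicit rule passes,
# even between VMs sitting on the same subnet.
POLICY = [
    Rule("web", "app", 8443),
    Rule("app", "db", 5432),
]

def allowed(src_tag: str, dst_tag: str, port: int) -> bool:
    """Return True only if an explicit rule permits this flow."""
    return any(
        r.src_tag == src_tag and r.dst_tag == dst_tag and r.port == port
        for r in POLICY
    )

print(allowed("web", "app", 8443))  # explicitly permitted
print(allowed("web", "db", 5432))   # web tier cannot skip straight to the database
print(allowed("app", "app", 22))    # lateral movement blocked by default
```

The payoff over traditional perimeter firewalls is the last check: two VMs in the same segment still can't talk unless a rule says so, which is exactly what limits an attacker's lateral movement after an initial compromise.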