By now, even your non-techy mom has probably heard of Big Data, with IBM and others advertising it on TV and every other IT vendor pushing their platform. If you don't know about big data, here it is in a nutshell: as more and more devices are connected to the internet and storage capabilities continue to advance, we’re able to collect, store, and run analytics on massive sets of information in order to discover insights and make more informed decisions.
Some industries, like research, oil and gas, manufacturing, and logistics, have been doing this for years, often on dedicated hardware. The advantages of virtualization can be leveraged for big data use, too, even though at its core, big data is focused on distributing jobs over a wide array of resources, while virtualization as a concept is the exact opposite.
If you’re gearing up for a big data deployment, you can use VMware tools to stack it on top of virtual machines, allowing you to add resources easily when you need to run large analytics jobs and scale back when you don’t need as much processing power or want to delete old unused datasets from storage. This elasticity helps maximize your available compute resources and can be used in a mixed-workload environment. Plus, you can manage and automate your big data VMs from the same tools as your other infrastructure.
Here is a quick primer on what to keep in mind with VMware big data platforms.
Early this month, VMware announced vSphere 6.0, the latest version of the most popular and powerful enterprise virtualization platform. As a longtime VMware shop with a number of VCP certified professionals, we’re excited for a number of the new features included in the latest release.
According to VMware’s CEO, vSphere 6.0 has over 650 new features. Green House Data doesn’t have a timeline for upgrading from 5.5 just yet (6.0 hasn’t reached General Availability), but here’s what we’re most excited about.
Security is already high on the totem pole of IT priorities, but with 2015 kicking off with the massive Anthem health insurance breach, encryption is a hotter topic than ever.
Many compliance mandates require or encourage some form of encryption, including the commonly encountered PCI and HIPAA standards. The HIPAA Security Rule doesn’t strictly require encryption, but it does require you to document, in writing, why you believed encryption wasn’t necessary in your particular case. Let’s be honest: if you’re disclosing a large breach to the public as required, encryption was probably necessary.
There are many encryption methods and vendors on the market, but all of them require access to an encryption key in order to unscramble encoded data. If a malicious agent gets their hands on this key, it’s game over for your encrypted information.
This means that every enterprise needs a secure, organized system to manage all of their encryption keys. As data sets are updated with new keys, new data is added, different encryption systems are introduced, and user access is modified, encryption key management becomes even more essential.
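To make the idea of versioned key management concrete, here is a minimal sketch in Python. All names are hypothetical, and the XOR "cipher" is a deliberately toy stand-in so the example runs with only the standard library; a real system would use an authenticated cipher such as AES-GCM and store keys in an HSM or dedicated key management service.

```python
import secrets
from dataclasses import dataclass, field

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Repeats the key across the data. NOT secure -- a placeholder
    # for a real authenticated cipher, used only for illustration.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

@dataclass
class KeyManager:
    # Maps key version numbers to raw key material.
    keys: dict = field(default_factory=dict)
    current: int = 0

    def rotate(self) -> int:
        """Generate a new key version; older versions stay readable."""
        self.current += 1
        self.keys[self.current] = secrets.token_bytes(32)
        return self.current

    def encrypt(self, plaintext: bytes) -> tuple:
        # Tag the ciphertext with the key version that produced it,
        # so rotation doesn't orphan previously encrypted data.
        return self.current, xor_stream(plaintext, self.keys[self.current])

    def decrypt(self, version: int, ciphertext: bytes) -> bytes:
        # Old data remains decryptable until it is re-encrypted
        # under the current key.
        return xor_stream(ciphertext, self.keys[version])

km = KeyManager()
km.rotate()
v1, ct1 = km.encrypt(b"old record")
km.rotate()                       # new writes now use key version 2
v2, ct2 = km.encrypt(b"new record")
assert km.decrypt(v1, ct1) == b"old record"
assert km.decrypt(v2, ct2) == b"new record"
```

The point of the sketch is the bookkeeping, not the cryptography: every ciphertext carries the version of the key that encrypted it, so rotating keys for new data never locks you out of old data, and the manager gives you one place to audit, revoke, or re-encrypt as policies change.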
Amazon is building a wind farm. Apple has tens of megawatts of solar power. Google builds in Nordic countries and sources power from hydroelectric facilities. Other providers started giving out Renewable Energy Credits to cover customers’ electric costs (and they aren’t the only ones buying RECs and PPAs—but this blog post isn’t just about Green House Data).
Despite the recent trend of data center operators going green, does the IT industry really care about the environmental footprint of their data center? According to our recent survey results, the answer is a resounding…maybe.
We polled 166 IT professionals, from system administrators to the CTO. All but two had input into IT and infrastructure decision-making at their company. What they had to say was surprising. In a nutshell, it makes smart business sense to have a green data center, because it saves on operating costs. Whether IT departments consider energy efficiency or sustainability when evaluating service providers is up for debate.
E-mail, as we noted in last week’s blog, remains critical to business functions, and Microsoft Exchange is the most widely used business e-mail platform in the world. Virtualizing Exchange servers on VMware can improve performance, allow you to consolidate various Exchange server roles, combine mailboxes, and increase the flexibility of your Exchange infrastructure, so you can scale up or down as your e-mail loads demand.
You’ll end up with a 5-10x reduction in physical hardware and a more responsive Exchange environment, plus you can design your environment for your current workload. No need to guess at your resource utilization 3-5 years down the road—just provision a few more VMs when the time comes.
While virtualization can increase performance (VMware claims a 16-core server with vSphere produced double the throughput of physical hardware), Exchange has its own set of requirements and demands, so take a look at these best practices before you start up the installer in your virtual environment.