As you grow more comfortable in the cloud, you’ll likely find more and more of your applications making the transition to your VMware environment even as your existing virtual apps continue to grow. Add backups or DR to the list and your storage use might start to get a little out of control, especially with linked clone virtual desktops or snapshot trees that save multiple VM states.
Storage costs can add up, so you’ll want to stay proactive in maximizing disk usage and eliminating inefficiencies. There are a few ways to maximize storage and reduce disk sizes depending on the VMware product and deployment:
Update: vCloud 5.5 (which the gBlock Cloud was recently updated to) will work with Firefox.
As of June 2014, the newest release of Mozilla Firefox (30) is not compatible with the console viewer of the vCloud Environment. Firefox v30 can be used to navigate the environment itself, but when attempting to console into VMs, the browser experiences hang-ups and is ultimately unable to connect.
VMware is aware of this issue, which is caused by a code change in Firefox 30 that vCloud Director 5.1 does not support. Since the latest VMware vCloud Director release is v5.5, we do not expect support patches for minor issues in v5.1. Currently, Internet Explorer 11 (with Compatibility View on and Enhanced Security Configuration off) works when attempting to use the console from the vCloud Environment.
The current workaround for the Firefox issue is rolling back to a release before v30 (preferably v29, which shares most of the features and look of the newer version). To roll back Firefox to a previously compatible version, follow these steps:
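Older Firefox builds are available from Mozilla's public release archive at ftp.mozilla.org. As a rough sketch (the version, platform, and locale values below are illustrative; adjust them for your environment), the download URL for a given release can be constructed like this:

```shell
#!/bin/sh
# Sketch: build the Mozilla release-archive URL for an older Firefox installer.
# Assumes a Windows 32-bit, en-US install of v29.0 -- change these for your setup.
VERSION="29.0"
PLATFORM="win32"
LOCALE="en-US"

URL="https://ftp.mozilla.org/pub/firefox/releases/${VERSION}/${PLATFORM}/${LOCALE}/Firefox%20Setup%20${VERSION}.exe"
echo "$URL"

# To actually fetch the installer, uncomment:
# wget "$URL"
```

After installing the older version, remember to turn off Firefox's automatic updates (Options > Advanced > Update), or the browser will simply upgrade itself back to v30.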
With more and more companies taking advantage of cloud computing for on-demand infrastructure and additional resources, penetration testers are being called upon to perform more security testing on virtualized environments. Clients may require testing for compliance standards like PCI DSS, or they may be evaluating multiple cloud providers for the most secure option. The cloud brings with it a new set of considerations for testers, as a virtual environment could house multiple tenants on the same architecture.
The first thing to decide is whether you are outsourcing pen testing to a third party or keeping it in-house with your security team. With a third party, you will only need to work out any contract and SLA issues. Be sure to vet a third party thoroughly, asking exactly what they will test, which tools they will use, what their scan policies are, and whether they use white-box or black-box testing (in black-box testing, the tester infiltrates without any prior knowledge of the environment, while white-box is the opposite).
Either way, you’ll need to know exactly what will be tested, including which applications, database servers, storage, and network devices are in scope.
A recent post on Government Technology recounts the stressful tale of a data center fire in an Iowa government facility and the subsequent scramble to restore operations. Despite having a second data center available, the team chose to bring the original facility back online. Thanks to the fire department and the facility’s own fire suppression systems, the equipment was salvaged and back online in just twelve hours.
With the amount of electrical equipment and heat generated in a data center, fires are a real and constant threat. Whether your company has in-house infrastructure or hosts with a data center service provider, knowing which suppression systems to use and how to respond if they fail is essential to avoid downtime should disaster strike.
Although they started to gain real momentum circa 2011, modular and containerized data centers are still spreading across the industry. The two models share many similarities: ease of deployment, the ability to add more computing power more or less on demand, highly energy-efficient operation, and some degree of prefabrication. Depending on the enterprise and its IT needs, each has distinct advantages and disadvantages for data center design and infrastructure procurement.
Why go modular or containerized? Both models provide a standardized kit to scale out a data center piece by piece. A facility can be designed with an initial baseload for power and then built out with racks, cooling, and support equipment as needed. As more customers come on board or the company grows, new servers and networking equipment are added to meet demand.