Update: vCloud 5.5 (which the gBlock Cloud was recently updated to) will work with Firefox.
As of June 2014, the newest release of Mozilla Firefox (30) is not compatible with the console viewer of the vCloud Environment. Firefox v30 can be used to navigate the environment itself, but when attempting to console into VMs, the browser experiences hang-ups and is ultimately unable to connect.
VMware is aware of this issue, which is caused by a code change in Firefox 30 that vCloud Director 5.1 does not support. Unfortunately, the latest VMware vCloud Director version is v5.5, so we do not expect support patches for minor issues in v5.1. Currently, Internet Explorer 11 (with compatibility mode ON and Enhanced Security Configuration OFF) works when using the console from the vCloud Environment.
The current workaround for the Firefox issue is rolling back to a release before v30 (preferably v29, which offers much the same feature set and look as the newer version). To roll back Firefox to a previously compatible version, follow these steps:
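One way to perform the rollback on Linux is to pull an older build from Mozilla's public release archive. This is a hedged sketch only; the version, platform, and locale values below are assumptions, so substitute whatever matches your install:

```shell
# Hypothetical rollback helper: builds the Mozilla release-archive URL for a
# chosen pre-30 Firefox version and prints it. VERSION, PLATFORM, and LOCALE
# are assumptions -- adjust them for your system.
VERSION=29.0.1
PLATFORM=linux-x86_64
LOCALE=en-US
URL="https://ftp.mozilla.org/pub/firefox/releases/${VERSION}/${PLATFORM}/${LOCALE}/firefox-${VERSION}.tar.bz2"
echo "$URL"
# Download and unpack (uncomment to run):
# wget "$URL" && tar -xjf "firefox-${VERSION}.tar.bz2"
# After launching the older build, disable automatic updates so Firefox does
# not upgrade itself back to v30:
# Options > Advanced > Update > "Never check for updates"
```

Remember that turning off automatic updates is essential here, or Firefox will quietly move you back to v30 at the next update check.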
With more and more companies taking advantage of cloud computing for on-demand infrastructure and additional resources, penetration testers are being called upon to perform more security testing on virtualized environments. Clients may require testing for compliance standards like PCI DSS, or they may be evaluating multiple cloud providers for the most secure option. The cloud brings with it a new set of considerations for testers, as a virtual environment could house multiple tenants on the same architecture.
The first thing to decide is whether you are outsourcing the pen test to a third party or keeping it in-house with your security team. With a third party, your main additional concern is working out contract and SLA terms. Be sure to vet a third party thoroughly: ask exactly what they will test, which tools they will use, what their scan policies are, and whether they perform white-box or black-box testing (in black-box testing, the tester works without any prior knowledge of the environment, while white-box testing is the opposite).
Either way, you’ll need to know exactly what will be tested, including which applications, database servers, storage systems, and network devices are in scope.
Whether you’ve been handed a mandate to consolidate your data centers, like many federal government data center managers, or you’re evaluating consolidation as an option for aging or expensive in-house data centers, the process can deliver cost savings and higher efficiency without losing the uptime and power provided by your existing infrastructure.
What most data center managers worry about as they face a consolidation mandate, and rightly so, is the uptime and cost of cloud or colocation infrastructure. The employee reaction to consolidation news is also worrisome, as some jobs will inevitably be cut.
What they may not realize is that if the data center shutdown isn’t smooth and the replacement services aren’t carefully evaluated and set up, the transfer process might eliminate any consolidation ROI. Here are some best practices that will maximize the benefit of data center consolidation. But first…
A recent post on Government Technology recounts the stressful tale of a data center fire in an Iowa government facility and the subsequent scramble to restore operations. Although a second data center was available, the team chose to restore the original facility rather than fail over. Thanks to the fire department and their own fire suppression systems, the equipment was salvaged and back online in just twelve hours.
With the amount of electrical equipment and heat generated in a data center, fires are a real and constant threat. Whether your company has in-house infrastructure or hosts with a data center service provider, knowing which suppression systems to use and how to respond if they fail is essential to avoid downtime should disaster strike.
Although they started to gain real momentum circa 2011, modular and containerized data centers are still spreading across the industry. The two models share many similarities: ease of deployment, the ability to add computing power more or less on demand, highly energy-efficient operation, and some degree of prefabrication. Depending on the enterprise and its IT needs, each has distinct advantages and disadvantages for data center design and infrastructure procurement.
Why go modular or containerized? Both models provide a standardized kit to scale out a data center piece by piece. A facility can be designed with an initial baseload for power and then built out with racks, cooling, and support equipment as needed. As more customers come on or the company grows larger, new servers and networking equipment are added to meet demand.
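As a rough illustration of that build-out model, the sizing arithmetic can be sketched as follows. All figures here (per-module capacity, baseload, PUE) are invented for illustration, not vendor specifications:

```python
import math

# Illustrative build-out math for a modular facility. The capacity figures
# below are assumptions, not real vendor specs.
MODULE_IT_LOAD_KW = 200      # IT capacity added per module (assumed)
BASELOAD_KW = 400            # facility power provisioned up front (assumed)

def modules_needed(target_it_kw: float) -> int:
    """Number of modules required to serve a target IT load."""
    return max(0, math.ceil(target_it_kw / MODULE_IT_LOAD_KW))

def provisioned_power(n_modules: int, pue: float = 1.3) -> float:
    """Total facility power (kW) for n modules at an assumed PUE."""
    return BASELOAD_KW + n_modules * MODULE_IT_LOAD_KW * pue

print(modules_needed(900))    # 900 kW of IT load -> 5 modules
print(provisioned_power(5))   # 400 + 5 * 200 * 1.3 = 1700.0 kW
```

The point of the sketch is the incremental shape of the cost curve: capacity arrives in module-sized steps against a fixed baseload, rather than being built out all at once.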