We like to tout our Cheyenne facilities as some of the safest data centers in the nation. Southeastern Wyoming experiences very little flooding, few tornadoes or earthquakes, and zero hurricanes.
But there is that pesky supervolcano underneath Yellowstone. You know, the one that could obliterate 2/3 of the United States.
That’s why this April 1st, Green House Data is announcing our facility is Supervolcano Resistant®.
Virtualization is a standard practice for IT shops around the world. However, as more data center operators look to consolidate and migrate to new virtualized environments, some legacy applications remain stumbling blocks on the way to a 100% virtualized infrastructure.
Legacy apps are tough nuts to crack: your users are accustomed to them, so they are highly efficient in business use, but they might clash with your more modern IT tools, they may no longer be supported by the vendor, or the hardware underneath might be ready to kick the bucket.
“No worries,” I hear you say. “I can just virtualize the platform.”
That might work in most cases, but some legacy apps either just won’t make the leap to virtualization or are more trouble to virtualize than it’s worth. Here are the most common examples our techs run into:
With demand for pre-built data center space continuing to grow, you’d expect to find facilities being built all across the country, with a concentration in major markets, some outliers, and a general distribution around other areas. To some extent, that’s true. But the distribution is hardly uniform, with competing providers and in-house facilities alike suddenly cropping up next to each other.
So what makes these data center clusters happen? Wouldn’t builders like to place facilities in more diverse areas in order to avoid cascading or single-point failures from the same power outage or natural disaster? The decision to build in a cluster goes beyond offering competition in a popular area.
Green House Data’s own data center in Orangeburg, NY is part of one of these clusters of development, and there are a number of reasons why we joined Bloomberg’s giant facility just down Ramland Rd.
As much as we like to compare ourselves to Liam Neeson, our particular set of skills won’t be able to help you avoid a hefty ransom if you fall victim to Cryptowall.
Cryptowall ransomware programs are usually hidden in e-mail attachments or in web links in shiftier corners of the web. They download and install a malicious program that completely locks down your computer system and all access to data. Without a backup stored outside your computer, you will have to pay a ransom (usually in Bitcoin, which can be a hassle to obtain) in order for the hacker behind your plight to grant access to your files and apps.
Unfortunately, we can’t go all Taken to track down and take out the perpetrators. Bitcoin transactions are very difficult to trace back to a real-world identity, and the ransom e-mails are often relayed through masking methods. The best we can do is tell you a country of origin (Russia, mostly).
We’ve had a number of our managed services customers affected by Cryptowall attacks recently. While our data centers remain secure, your offsite systems can easily fall victim to encryption attacks without some common sense precautions. Luckily, you can prevent the majority of these attacks with a little staff training. Here’s how to avoid Cryptowall and other encryption hacks.
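Much of that staff training boils down to spotting risky attachments before anyone opens them. As a purely illustrative sketch (the `is_risky_attachment` helper and the extension list are our own assumptions, not part of any mail product), a simple filter might flag executable or double-extension attachments like "invoice.pdf.exe":

```python
# Hypothetical illustration: flag e-mail attachments whose filenames
# suggest an executable payload (a common Cryptowall delivery trick).
RISKY_EXTENSIONS = {".exe", ".scr", ".js", ".vbs", ".bat", ".cmd", ".jar"}


def is_risky_attachment(filename: str) -> bool:
    """Return True if the attachment looks like a disguised executable."""
    name = filename.lower()
    parts = name.rsplit(".", 1)
    if len(parts) < 2:
        return False  # no extension at all
    if "." + parts[1] in RISKY_EXTENSIONS:
        return True  # outright executable, or "invoice.pdf.exe" style
    # "invoice.exe.pdf" style tricks bury the real type earlier in the name.
    middle = name.split(".")[1:-1]
    return any("." + m in RISKY_EXTENSIONS for m in middle)


if __name__ == "__main__":
    for f in ["report.pdf", "invoice.pdf.exe", "photo.jpg", "update.js"]:
        print(f, "->", "BLOCK" if is_risky_attachment(f) else "ok")
```

A check like this is no substitute for offsite backups or user awareness, but it shows how cheap the first line of defense can be.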
By now, even your non-techy mom has probably heard of Big Data, with IBM and others advertising it on TV and every other IT vendor pushing their platform. If you don't know about big data, here it is in a nutshell: as more and more devices are connected to the internet and storage capabilities continue to advance, we’re able to collect, store, and run analytics on massive sets of information in order to discover insights and make more informed decisions.
Some industries, like research, oil and gas, manufacturing, and logistics, have been doing this for years, often on dedicated hardware. The advantages of virtualization can be leveraged for big data use, too, even though at its core, big data is focused on distributing jobs across a wide array of physical resources, while virtualization consolidates many workloads onto shared hardware, the exact opposite approach.
If you’re gearing up for a big data deployment, you can use VMware tools to stack it on top of virtual machines, allowing you to add resources easily when you need to run large analytics jobs and scale back when you don’t need as much processing power or want to delete old unused datasets from storage. This elasticity helps maximize your available compute resources and can be used in a mixed-workload environment. Plus, you can manage and automate your big data VMs from the same tools as your other infrastructure.
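That scale-up/scale-back pattern can be sketched as simple sizing logic: given the cores requested by queued analytics jobs, pick a vCPU allocation for the big data VM pool, bounded by a floor and the host's ceiling. This is a toy model for illustration only; the function names, thresholds, and defaults are our own assumptions, not VMware's API.

```python
# Toy sketch of elastic sizing for a virtualized big data cluster.
# All names and thresholds are illustrative assumptions, not VMware APIs.

def desired_vcpus(queued_job_cores: int, min_vcpus: int = 4,
                  max_vcpus: int = 64) -> int:
    """Pick a vCPU allocation for the analytics VM pool.

    Scale up toward the queued demand, but stay within the pool's
    floor (keep the cluster responsive) and ceiling (host capacity).
    """
    return max(min_vcpus, min(queued_job_cores, max_vcpus))


def scale_plan(current: int, target: int) -> str:
    """Describe the action a scheduler would take."""
    if target > current:
        return f"scale up: {current} -> {target} vCPUs"
    if target < current:
        return f"scale back: {current} -> {target} vCPUs"
    return "no change"
```

For example, a nightly analytics run requesting 48 cores against a pool idling at 8 vCPUs produces a scale-up to 48; once the queue drains, the pool falls back to its 4-vCPU floor, freeing capacity for other workloads in the mixed environment.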
Here is a quick primer on what to keep in mind with VMware big data platforms.