It feels like we’ve been talking “cloud-first” or “cloud-only” when it comes to IT transformation and new procurement strategies for years now. But a little over a year ago, we already saw some signs of what analysts are now calling cloud repatriation. At the time, we asked: are enterprises moving back to on-premises data centers?
The answer isn’t that simple, but there is certainly a time and place for cloud repatriation. Here’s why it’s trendy to move some workloads back on premises and how to decide whether it’s time for you to follow suit.
We’ve gone back and forth on this for many years now. Are enterprise data centers dying? Gartner seems to think so, recently predicting that by 2025, 80% of enterprises will have shut down their traditional data centers, compared to 10% today.
That’s less than ten years out. Do you foresee your data center being put out to pasture within a decade? Or largely decommissioned and consolidated? It doesn’t seem too far-fetched considering an average hardware lifespan of three years. You could cycle through your servers three times over before then — and most of those compute workloads will likely end up in the cloud or hosted elsewhere.
Here's how that change will affect how you procure and manage IT services.
The past five or ten years have been jam-packed with cloud computing hype, and the cloud is undoubtedly here to stay. But recent reports show analysts expect hardware sales for on-premises enterprise IT to tick up significantly.
High profile examples like Dropbox show that moving back to a more traditional data center can create efficiencies and free up cash flow. Is the enterprise data center – and by extension, colocation – about to put up a fight against the cloud?
Data centers are invariably focused on 100% availability, which comes down to reliability of power and various mechanical and electrical components throughout the facility. But energy efficiency is a major priority as well, even for data centers that don’t call themselves “green” or “sustainable”.
With electricity making up the bulk of operating expenses, any gains in efficiency can go a long way towards minimizing OpEx. Many data center efficiency efforts focus on containment, cooling, and other measures within the white space, but critical power infrastructure can be a good target for efficiency gains as well.
Major UPS manufacturers often include an “eco-mode,” or in the case of our Cheyenne data center, Eaton’s Energy Saver System (ESS). These modes can lead to efficiency gains of several percentage points, which sounds low, but in practice can lead to thousands of dollars of savings and carbon emission reductions in the hundreds or thousands of pounds.
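To see why a few percentage points matter, here is a back-of-the-envelope sketch of the annual savings from running a UPS in eco-mode. All of the figures (IT load, efficiency levels, electricity rate, emissions factor) are illustrative assumptions, not measured data from any specific facility:

```python
def ups_ecomode_savings(it_load_kw, baseline_eff, ecomode_eff,
                        rate_per_kwh=0.08, co2_lb_per_kwh=0.9):
    """Estimate annual dollar and CO2 savings from a UPS eco-mode.

    Efficiencies are fractions (e.g. 0.94 for 94%). The UPS must draw
    it_load_kw / efficiency from the utility to deliver it_load_kw of
    conditioned power, so higher efficiency means less input energy.
    """
    hours_per_year = 24 * 365
    baseline_kwh = it_load_kw / baseline_eff * hours_per_year
    ecomode_kwh = it_load_kw / ecomode_eff * hours_per_year
    saved_kwh = baseline_kwh - ecomode_kwh
    return saved_kwh * rate_per_kwh, saved_kwh * co2_lb_per_kwh

# Hypothetical 500 kW IT load, 94% double-conversion vs. 99% eco-mode
dollars, co2_lb = ups_ecomode_savings(500, 0.94, 0.99)
print(f"~${dollars:,.0f}/yr saved, ~{co2_lb:,.0f} lb CO2/yr avoided")
```

Even a five-point efficiency jump on a mid-sized load works out to tens of thousands of dollars per year at typical industrial electricity rates, which is why these modes are worth evaluating despite the modest-sounding percentages.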
We have arrived again at that time when year-end lists proliferate for perusal by a workforce distracted by the holidays. The data center industry continued to chug forward in 2017, with M&A activity heating up in particular. Here are the top stories that broke throughout the data center world, plus a list of the most visited posts from our own humble blog.
Data center containment is the practice of splitting the aisles of a data center into segregated hot and cold sections, depending on how each aisle is set up. For example, some data centers might have the front of their servers on the inside of the aisle, with fans blowing the exhaust outside the aisle. Others might have the front of their servers on the outside of the aisle, and vent heat inside the aisle.
Containment keeps the hot air exiting servers from mixing with the cold air coming in from the Computer Room Air Conditioning (CRAC), dramatically improving energy efficiency and also maintaining a more consistent temperature, which reduces the overall load on both air conditioning units and the servers themselves.
Green House Data uses full containment in our Cheyenne and East Coast data centers, but only recently implemented it in our Seattle, WA facility. This case study demonstrates how even a simple containment system can lead to significant energy efficiency improvements. We expect the system to pay for itself within the year, in part thanks to generous rebates from Seattle Public Utilities.
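One common way to quantify a containment retrofit like this is through PUE (Power Usage Effectiveness, total facility energy divided by IT energy) and a simple payback calculation. The sketch below uses entirely hypothetical numbers for load, PUE improvement, install cost, and rebate; it is not drawn from the Seattle case study itself:

```python
def annual_facility_kwh(it_load_kw, pue):
    """Total facility energy over a year at a given PUE."""
    return it_load_kw * pue * 24 * 365

def containment_payback(it_load_kw, pue_before, pue_after,
                        install_cost, rebate=0.0, rate_per_kwh=0.08):
    """Simple payback period in years for a containment retrofit.

    Containment lowers cooling energy, which shows up as a lower PUE;
    the savings are the difference in total facility energy, priced
    at the utility rate. The rebate reduces the up-front cost.
    """
    saved_kwh = (annual_facility_kwh(it_load_kw, pue_before)
                 - annual_facility_kwh(it_load_kw, pue_after))
    annual_savings = saved_kwh * rate_per_kwh
    return (install_cost - rebate) / annual_savings

# Hypothetical 300 kW room, PUE 1.8 -> 1.5, $60k install, $20k rebate
years = containment_payback(300, 1.8, 1.5,
                            install_cost=60_000, rebate=20_000)
print(f"Payback in about {years:.1f} years")
```

Under assumptions like these, the payback lands well under a year, which is consistent with the kind of result a utility rebate can make possible.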
In the past decade, alongside the increased importance of digital tools for business, a new category of insurance has sprung up to cover digital data breaches and liability. With the average total cost of a data breach reaching $4 million and the average cost of each lost or stolen digital record increasing to $158, it is clear that experiencing a data breach is an expensive affair.
While dedicated security response teams and encryption do decrease these costs, and IPS/IDS systems and other security measures can help reduce the risk, many organizations will still experience a data breach at some point.
Cyberinsurance can help mitigate the cost of a data breach by reimbursing your company for legal fees, helping with the cost of crisis management and investigation, notification costs, extortion liability fees, and third party damages relating to network or system outages. But does every organization need cyberinsurance?
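One simplified way to frame that question is as an expected-cost comparison: multiply your exposure by a breach probability and see whether the premium is less than the loss the policy would cover. The model below is a hypothetical sketch; the breach probability, coverage fraction, and premium are assumed inputs you would replace with figures from your own risk assessment and quotes:

```python
def expected_breach_cost(records_at_risk, cost_per_record=158.0,
                         annual_breach_prob=0.1):
    """Expected annual breach loss: records x per-record cost x probability.

    The $158 per-record default comes from the industry average cited
    above; the 10% annual breach probability is a placeholder assumption.
    """
    return records_at_risk * cost_per_record * annual_breach_prob

def insurance_worthwhile(records_at_risk, premium, coverage_fraction=0.8):
    """True if the annual premium is below the expected covered loss."""
    expected_loss = expected_breach_cost(records_at_risk)
    return premium < expected_loss * coverage_fraction

# Hypothetical: 25,000 customer records, $50k/yr premium
print(insurance_worthwhile(25_000, premium=50_000))
```

A model this simple ignores tail risk, legal exposure, and reputational damage, so in practice it is a floor on the value of coverage rather than a full answer; organizations with small exposures may still reasonably decline a policy.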
Edge data centers have a lot of buzz these days as a way to deliver services outside of core markets. But do actual data center operators have any interest in edge facilities? And what exactly is an edge data center, anyway?
Green House Data surveyed 492 IT professionals, 38% of whom were executive level. The results indicate a mild interest in edge data centers, but mostly for future deployments: 18% currently use an edge data center, 46% plan to add an edge facility within the next 12 months, and 54% do not plan to add an edge data center.
Read on to see the full survey results.
Can you believe we’re already over a quarter of the way through 2016? Feels like we were just posting our 2015 blog wrap up yesterday. But here we are—the data center world keeps spinning. In case you missed something in the past three and a half months, we’ve collected our top blog posts and some of the most popular data center news headlines from around the blogosphere in today’s post.
As data center design continues to evolve, one stalwart piece hasn’t changed too much: cabinet or rack security and monitoring. After all, how complicated can a door lock get? While almost every data center will have some form of lock on its racks and/or cabinets (especially colocation facilities, which have multiple clients accessing shared floor space), not all locks are created equal. Newer technologies allow automated access logs, biometric security, wireless unlocking, and more.
With different compliance standards and security requirements for various applications, some colocation providers will install custom locks for your cabinet if necessary. Physical security measures remain vitally important, as social engineering and theft can extend to hardware and not just data. How then do data center providers go about securing cabinets and racks?