When your monitoring systems start sending a deluge of alerts or your servers suddenly stop responding, it’s easy to go into crisis mode. That’s why Step One of this guide to troubleshooting is to remain calm. Let common sense prevail, be sure to maintain your documentation, and get down to the art of troubleshooting your IT systems. Just follow these eight general guidelines to pinpoint the issue and take steps towards remediation.
The holidays are looming, meaning many DevOps teams are about to have their apps take a beating as hundreds of holiday orders and new device users slam them all at the same time. Whether or not your systems are consumer-focused, there will eventually come a time when the overall load on your servers is pushed to the limit.
Load testing applications in the cloud allows development and testing staff to perform scale testing to see at what point virtual machines need to scale, when to add additional resources like storage or bandwidth, and when a failover solution might be necessary.
By performing load tests thoroughly throughout the DevOps process, your organization can lower costs over time, and your team won't have to scramble during a major event. Here are some best practices for cloud-based load testing.
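To make the idea concrete, here is a minimal sketch of the kind of concurrency probe a load test performs: fire a batch of requests at a fixed concurrency level and record per-request latency, so you can watch for the point where response times start to climb and scaling becomes necessary. The function names and the staging URL below are illustrative, not part of any specific tool.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_load_test(send_request, concurrency, total_requests):
    """Fire total_requests calls at the given concurrency level and
    collect per-request latency and error counts."""
    latencies = []
    errors = 0

    def timed_call(_):
        start = time.perf_counter()
        try:
            send_request()  # caller supplies the actual request logic
            return time.perf_counter() - start, None
        except Exception as exc:
            return time.perf_counter() - start, exc

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        for elapsed, error in pool.map(timed_call, range(total_requests)):
            if error is None:
                latencies.append(elapsed)
            else:
                errors += 1

    latencies.sort()
    return {
        "requests": total_requests,
        "errors": errors,
        "p50": latencies[len(latencies) // 2] if latencies else None,
        "max": latencies[-1] if latencies else None,
    }

# Illustrative usage against a hypothetical staging endpoint:
#   from urllib.request import urlopen
#   probe = lambda: urlopen("https://staging.example.com/health", timeout=5).read()
#   for level in (10, 50, 100):
#       print(level, run_load_test(probe, concurrency=level, total_requests=200))
```

Running the probe at progressively higher concurrency levels, as the commented usage suggests, is how you find the "knee" in the latency curve: the point where virtual machines need to scale out or additional resources are required.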
As you may have heard, Green House Data has completed our second acquisition of 2017 with the purchase of Ajubeo, a Denver-based cloud hosting provider. While more cloud resources are always beneficial to a nationwide cloud platform, some people might be left scratching their heads — after all, we have our headquarters just 100 miles north of Denver in Cheyenne, Wyoming. So what advantages are to be had from adding a Denver cloud node?
We thought everyone finally had cloud terminology all cleared up. You’ve certainly seen the countless blogs about IaaS, PaaS, and SaaS; not to mention the ever-proliferating surveys and reports on hybrid cloud being the deployment flavor du jour.
But things aren’t as clear as we might want them to be. For example, tell me what you think of when you hear the term “public cloud.”
Is it a hyperscale provider like AWS, Azure, or Google? It is, isn’t it? If not, you probably work with or for an organization similar to Green House Data, which has a public cloud offering with some major differences from the hyperscale players.
So how can we clear up the cloud? Has public become synonymous with hyperscale and self-provisioning? Has private cloud fallen by the wayside? And what should your business focus on, anyway?
Gartner anticipates that 90% of large organizations will have a Chief Data Officer by 2019.
This isn’t too surprising when you consider that the total amount of data worldwide is expected to keep growing exponentially through 2020, with industry forecasts calling for roughly 50 times more data over the course of a decade.
Big data holds plenty of insights for businesses large and small, and data-based initiatives are underway across the globe as organizations seek to quickly understand and analyze mountains of information to glean a competitive advantage.
A Chief Data Officer makes key decisions around the storage, handling, and use of a business’s information, including the types of platforms used, connections to and from production applications, analytics processes, and the efficient flow of data.
Let’s dig into what that means in practice and how a CDO can help reduce the significant costs around data storage, platforms, and access, while also improving business functionality and agility.