Sometimes you want to trigger a specific action when one of your Azure alert rules detects something. Normally, to remediate the issue you would have to log in to the machine once you receive the alert. With an Azure Automation account, you don’t have to take any additional steps to fix whatever threw the alert — just create your script once and leave it to run whenever the alert is triggered. As simple as that.
This works perfectly when you need to resolve a common issue with a trusty PowerShell script you use often. This method saves you time and effort, and you can rest assured that the issue is being taken care of with the help of a Custom Script Extension.
Running a custom script on a specific machine when an alert is triggered in Log Analytics is quite easy. Here are the steps you need to follow to achieve this.
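As a rough illustration of the end state, a runbook started by an alert webhook typically receives the alert payload through a `$WebhookData` parameter. The sketch below assumes a hypothetical remediation (restarting the print spooler service); the payload structure and service name are illustrative, not taken from the article.

```powershell
# Sketch of a runbook started by an Azure Monitor alert via webhook.
# The remediation step and service name are hypothetical examples.
param (
    [Parameter(Mandatory = $false)]
    [object] $WebhookData
)

if ($WebhookData) {
    # The alert payload arrives as JSON in the webhook request body.
    $alert = $WebhookData.RequestBody | ConvertFrom-Json
    Write-Output "Alert received: $($alert | ConvertTo-Json -Depth 5)"

    # Example remediation: restart the service the alert rule watches.
    Restart-Service -Name 'Spooler'   # hypothetical target service
}
else {
    Write-Output 'This runbook expects to be started by a webhook.'
}
```

Running the runbook on a Hybrid Runbook Worker lets the remediation execute directly on the affected machine rather than in the Azure sandbox.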
You may be familiar with Microsoft Operations Management Suite (OMS) — a management system that worked to simplify your IT processes. Troubleshooting, change tracking, and updates are just a few common IT tasks that OMS could handle. OMS brought together backup services, site recovery, log analytics, and automation, and those components are available for hybrid, multi-cloud, and on-premises environments.
Microsoft deprecated OMS as of January 2019, moving all functionality into the Azure portal. Learn more about why OMS and the new Azure portal are useful for your IT workflow and what has changed with the migration to Azure.
Cloud-native automation and orchestration tools make IT administration easier — at least once you know what you’re doing. And while there is some concern among cloud technicians that automation could lead to job losses, mastering the available tools makes you more valuable while letting you find and deliver efficiencies. Cloud automation is a win-win.
But where should you begin when it comes to automating your cloud environment? There are many moving parts in an enterprise cloud deployment, even within specific application clusters.
These are the three easiest targets for automation and orchestration.
When you decide to move your Exchange environment to the cloud, you might be confused to discover that you still need to maintain an on-premises Exchange server. There are several reasons for this, ranging from the migration process to identity management.
If you’re moving from an active on-premises Exchange deployment, you’ll first configure an interim “Exchange Hybrid” environment that hosts mailboxes both in Exchange Online and on your local Exchange server. The two locations share the namespace, address books, free/busy information, and calendars — essentially every Exchange function is synced between them. Mail flow and other functions appear to be internal, but might actually be processed and stored in the cloud environment.
Azure Stack enables you to run Azure workloads on-premises or even within a colocation facility. It gives you stronger security and control over your data and applications, with a single management platform spanning your public Azure cloud infrastructure and your Azure Stack deployment.
You can use many of the best Azure tools, processes, and features — including add-ons and open source solutions from the Azure Marketplace — in the cloud of your choice, helping to meet regulatory or technical challenges.
Before you get started with this intriguing hybrid and private cloud technology from Microsoft, however, there are a few things you’ll need to keep in mind. Here are some of the most important.
When you work with Azure Automation — and especially if you use Hybrid Worker machines — sometimes you need to use the certificates that are part of the connections created by the automation account on a local VM or server.
Runbooks that use these kinds of certificates work fine in the Azure environment, but if you need to run them in your local environment on Hybrid Worker machines, this presents a challenge. Here's how to get those connection certificates onto your Hybrid Worker.
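One common pattern — sketched here under the assumption that the certificate is stored as an Automation certificate asset (the asset name `MyRunAsCertificate` is hypothetical) — is to retrieve it inside a runbook executing on the Hybrid Worker and install it into the local machine's certificate store:

```powershell
# Sketch: inside a runbook running on a Hybrid Runbook Worker, pull a
# certificate asset from the Automation account and install it locally.
# 'MyRunAsCertificate' is a hypothetical asset name.
$cert = Get-AutomationCertificate -Name 'MyRunAsCertificate'

# Open the LocalMachine\My (Personal) store and add the certificate.
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store(
    'My', 'LocalMachine')
$store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]::ReadWrite)
$store.Add($cert)
$store.Close()

Write-Output "Installed certificate with thumbprint $($cert.Thumbprint)"
```

Note that `Get-AutomationCertificate` is an internal Automation cmdlet, so this only works when run as a runbook, not from an ordinary local PowerShell session.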
Multi-cloud is the IT service model du jour, but it comes with a set of challenges that many IT departments aren’t yet ready to tackle. There are many reasons to go with more than one cloud provider, including the use of specific services or abilities, backing up storage across various vendors, maintaining availability or minimizing latency, and even using different cloud vendors as bargaining chips for pricing negotiation.
A managed services partner might be the best way for you to take advantage of multi-cloud IT infrastructure and services, especially if you face the all-too-common cloud skills gap that many organizations encounter.
Read on for statistics on multi-cloud adoption and cloud skills difficulties, as well as ways in which a partner can help you alleviate the top multi-cloud obstacles.
Azure Automation is a cloud-based configuration service that automatically manages your Azure and non-Azure environments based on your runbooks, update management features, and shared capabilities such as access controls, centrally stored credentials and certificates, tags, and more.
Azure Automation also includes the option to extend your libraries. You can import sets of libraries called Modules into your automation account, either from the preexisting list in the Gallery or by uploading script files of your own.
Below you’ll see where to upload or choose these Modules.
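The same import can be scripted with the Az.Automation module. A minimal sketch, assuming hypothetical resource names and a storage URL you control (the .zip must package the module's .psm1/.psd1 files and be reachable by the Automation service):

```powershell
# Sketch: import a custom module into an Automation account using
# Az.Automation. All names and the URL below are hypothetical.
New-AzAutomationModule `
    -ResourceGroupName 'MyResourceGroup' `
    -AutomationAccountName 'MyAutomationAccount' `
    -Name 'MyCustomModule' `
    -ContentLinkUri 'https://mystorage.blob.core.windows.net/modules/MyCustomModule.zip'
```

Scripting the import is handy when you need to seed several automation accounts with the same module set.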
Automated server application patching can alleviate a lot of work for IT management teams. It shifts the patching and updating process outside of business hours. In an ideal world, Microsoft’s System Center Configuration Manager (SCCM) would flawlessly execute server application patches.
However, there are some gaps in SCCM patching functionality, especially when it comes to orchestration, validation, and report logs. These can cause issues with QA and risk mitigation and can drive frustrations among your IT staff.
When you work with Azure Automation, you might find yourself coding locally, putting all the initial logic into the script, copying and pasting the code to the web to run it, and then testing the code from the portal.
Usually this practice takes longer and requires printing out variables or adding comments to follow the code execution, since you are not actually debugging your script.
There is another, possibly better, way to get your PowerShell code into Azure.
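One such approach — sketched here with hypothetical names and paths, not prescribed by the article — is to keep developing locally and push the finished script into Azure Automation as a runbook with the Az.Automation cmdlets, instead of pasting it into the portal:

```powershell
# Sketch: import a locally developed script as an Automation runbook.
# Resource names and the file path are hypothetical.
Import-AzAutomationRunbook `
    -ResourceGroupName 'MyResourceGroup' `
    -AutomationAccountName 'MyAutomationAccount' `
    -Name 'My-LocalScript' `
    -Path 'C:\Scripts\My-LocalScript.ps1' `
    -Type PowerShell

# Publish the imported draft so it can be started or scheduled.
Publish-AzAutomationRunbook `
    -ResourceGroupName 'MyResourceGroup' `
    -AutomationAccountName 'MyAutomationAccount' `
    -Name 'My-LocalScript'
```

This keeps your editor, source control, and local debugger in the loop, with the portal used only to run the published result.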