I spend a lot of time working at the Powershell console, so it is common for me to type (or retype) the same command multiple times; of course I tend not to actually retype the same command every time, but rather use Get-History and related cmdlets to re-execute previous commands: The quickest way to re-execute a previous command is to use Invoke-History followed by the Id from Get-History: I like to use shortcuts (or aliases, as you can see from my history above 😊), so I would normally use h (Get-History) and r <id> (Invoke-History <id>). Sometimes I need to slightly modify the command before running it again, and there’s an easy solution for that as well: simply copy the CommandLine value from Get-History to the clipboard, paste it into the console and change what’s needed. PSReadLine improves the history search capability; here’s a list of bound key handlers related to history management: For example, using Ctrl+R (search history backwards) I can type part of a string (“keyhan” in this example) and PSReadLine shows the first matching command; I can either hit TAB to accept the command and run it, or use CTRL+R to cycle through other commands matching the…
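A minimal sketch of the workflow described above (the history Id 42 is just an example, and Set-Clipboard assumes a Windows console):

```powershell
Get-History                           # or its alias: h
Invoke-History -Id 42                 # or: r 42  (re-runs command 42 from the history)

# Copy a previous command line to the clipboard so it can be pasted and edited
(Get-History -Id 42).CommandLine | Set-Clipboard

# List the PSReadLine key handlers related to history management
Get-PSReadLineKeyHandler | Where-Object Function -like '*History*'
```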
-
Publish module to Powershell Gallery from Azure Pipelines
Now that I have my module on Github and built the module on Azure Pipelines, I want to publish it to the Powershell Gallery. The first step is of course to create a free account; once you have the account you can head to your profile, click on API Keys and then Create: The Create dialog allows you to choose the key name, the expiration and, very importantly, the scope (what permissions the key will have on the Gallery) and which packages the key is allowed to control through a Glob Pattern. In my case I want the key to expire every year, I want the key to grant permission to publish new packages and update existing ones, and I want this key to be used only for LSE modules: notice I used LSE* as the Glob Pattern; this way the same key will allow me to publish and manage new packages as long as their name begins with LSE. Azure Pipelines allows you to securely store secrets (passwords and keys) as variables; if you want to do so you can use the Variables tab in your Pipeline and then click the padlock icon: Anyway, I prefer to store the key in Azure KeyVault since…
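For reference, a hedged sketch of the publish step itself, assuming the API key has been exposed to the build as a secret pipeline variable (the variable name PSGalleryApiKey and the module path are placeholders, not values from the post):

```powershell
# Secret variables are not mapped to environment variables automatically,
# so the pipeline task is assumed to map PSGalleryApiKey explicitly.
$apiKey     = $env:PSGalleryApiKey
$modulePath = Join-Path $env:BUILD_SOURCESDIRECTORY 'LSECosmos'

Publish-Module -Path $modulePath -NuGetApiKey $apiKey -Repository PSGallery -Verbose
```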
-
X509Certificate is immutable on this platform. Use the equivalent constructor instead
Quick tip today. Recently I decided to switch to Powershell Core as the default on all my machines for my daily work and it’s working great (except for a few corner cases where I’m forced to go back to Powershell Desktop due to some old module incompatibility). To do so, over the last few weeks I had to go through the modules and scripts I use the most and port them to Pwsh. One of my cmdlets is meant to convert a certificate to and from its Base64 representation (this is useful to export certificates from Azure KeyVault, for example), and the heart of the code where the transformation happens looks like this: Unfortunately though, while testing the code on Powershell Core I got this error: It turns out the problem is with how I was creating the certificate object and loading its data. To avoid the exception the solution is to go from this: To this: I didn’t spend too much time figuring out why the exception is thrown (especially considering the Import() method is available on .NET Core/Pwsh) but at least I hope this will save someone else some time (and a headache 😉). I have never met a man so ignorant that…
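Since the code blocks are not included in this excerpt, here is a hedged reconstruction of the change (paths and variable names are illustrative): the X509Certificate2 object is effectively immutable on .NET Core, so instead of calling Import() on an empty object, pass the raw bytes to the constructor.

```powershell
$base64 = Get-Content -Path .\certificate.txt -Raw          # placeholder path
$bytes  = [System.Convert]::FromBase64String($base64)

# Before - works on Powershell Desktop, throws on Powershell Core:
# $cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
# $cert.Import($bytes)

# After - use the equivalent constructor instead:
$cert = [System.Security.Cryptography.X509Certificates.X509Certificate2]::new($bytes)
$cert.Subject
```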
-
Run Pester tests on Azure Pipelines
Now that I have some simple functions and some tests in my master branch on Github, I want to make sure the tests are executed (and pass!) every time I merge some new code. At this point I do not care too much about my dev branch because I know I’ll commit often and things will not always be stable, but I’ll make sure all my tests pass before I merge from dev to master (and later I may want to publish from master to the Powershell Gallery, but only if all tests pass of course!). The first thing I need is an account on Azure DevOps (you can start with a free account) and when ready, head to Pipelines and then Builds: Since my code is in Github, that’s what I’ll choose: The first time we set up this integration, Azure Pipelines must be authorized to access Github: Since I don’t have a yaml file already, I’ll select Starter pipeline. At this point in my tests things got a bit murky. The Azure DevOps Marketplace has (as of this post) two free extensions to run Pester tasks, so I decided to try them. I installed both extensions and added them…
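As an alternative to the marketplace extensions, a minimal sketch of what a plain Powershell pipeline step could run (Pester v4 syntax assumed; paths are placeholders): install Pester, run the tests and produce an NUnit results file that a Publish Test Results task can pick up.

```powershell
Install-Module Pester -Force -Scope CurrentUser -SkipPublisherCheck
Invoke-Pester -Path .\Tests -OutputFile .\TestResults.xml -OutputFormat NUnitXml -EnableExit
```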
-
Test Azure custom modules with Pester
Before I go too far along with building my LSECosmos module I must add proper tests. Just as a quick refresher (or to get some context if you’re not familiar with the concept), here are some pointers about Test Driven Development and Unit Testing: Test Driven Development (Wikipedia), Unit Testing (Wikipedia), Software Testing Fundamentals. While it is relatively straightforward to test simple scripts (we would likely run the script manually, testing 2-3 core scenarios to make sure nothing terrible happens), things can get complicated fairly quickly with longer scripts or modules, especially when they use a variety of cmdlets to take actions (think about Azure resources for example, or any other system-wide on-prem operation), need to pass data and objects back and forth between calls, and so on. If you have written enough lines of code (no matter the language/tool you use), I bet you can remember at least one occasion where you decided to make an apparently small and innocent change to a well-working piece of software and all hell broke loose 😵. I recently came across this meme on Facebook, it sums it up nicely 😅 (thanks to CodeChef for sharing): At its core proper…
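To make the idea concrete, here is a minimal Pester sketch of the kind of test I have in mind (the function name Get-LSECosmosKey is hypothetical, not taken from the post); mocking the Azure cmdlet inside the module keeps the test from touching real resources.

```powershell
Import-Module .\LSECosmos\LSECosmos.psd1 -Force

Describe 'Get-LSECosmosKey' {
    # Replace the real Azure call made inside the module with a canned response
    Mock Invoke-AzResourceAction { @{ primaryMasterKey = 'fake-key' } } -ModuleName LSECosmos

    It 'returns the primary key from the resource provider' {
        (Get-LSECosmosKey -AccountName 'demo').primaryMasterKey | Should -Be 'fake-key'
    }
}
```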
-
CosmosDb module
CosmosDb is a non-relational, distributed database you can use to build highly scalable, global applications in Azure. It is:
– Always on
– Offers throughput scalability
– Low latency, guaranteed
– No schema or index management
– Globally distributed
– Enterprise ready
– Uses popular NoSQL APIs
https://docs.microsoft.com/en-us/azure/cosmos-db/introduction
It’s all great from a development perspective, but when it comes to management things are a bit different. CosmosDb can be managed to a certain extent through Azure CLI, but as of this writing there is no official Powershell module available: I admit I have only superficially explored the existing community modules and, while it is great to see the Powershell Community sharing modules and scripts, it would probably be nice to have an official module by Microsoft as many other Resource Providers offer. This is a good opportunity though to continue exploring the topic I introduced in my previous post about calling Resource Provider actions, and since these scripts will likely be reused (I can actually use them at work), why not build a reusable module?
Module basics
Writing a Windows Powershell Module is a good starting point to get an overview of this topic; I’ll write a script module and I’ll…
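As a minimal sketch of the script-module layout I’m aiming for (function and file names are placeholders, not the final LSECosmos ones): the functions go in a .psm1 file and a manifest (.psd1) describes the module.

```powershell
# LSECosmos.psm1 (sketch)
function Get-LSECosmosKey {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory)][string] $ResourceGroupName,
        [Parameter(Mandatory)][string] $AccountName
    )
    # ... call the Resource Provider action here ...
}
Export-ModuleMember -Function Get-LSECosmosKey

# One-time manifest creation, run from the module folder:
# New-ModuleManifest -Path .\LSECosmos.psd1 -RootModule 'LSECosmos.psm1' -ModuleVersion '0.1.0'
```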
-
Invoke Azure Resource Provider actions with Powershell
Recently I started to convert all my scripts and modules to run on Powershell Core and I soon realized I have a problem. When it comes to Azure resources, I work with a combination of both ARM and RDFE, and all is good in Powershell Desktop on Windows: just load (or let Powershell load for me) both the Azure module and the combination of Az.<RPName> components I need. I have now switched to Powershell Core as my default on Windows (on macOS/Linux I don’t really have a choice 😉) but I encountered compatibility and runtime errors with the Azure and Azure.Storage modules, even if I import the Windows Compatibility module first. Typically Powershell Core complains about duplicate assemblies already loaded, but I also got weird runtime errors trying to run basic cmdlets. Since I want to move to Powershell Core anyway, I decided not to try to figure out how to solve the problem but rather move to ARM completely, writing my own cmdlets and functions where not otherwise available. Luckily the Resource Providers I am interested in (Cloud Services for example) expose APIs and actions for Classic (RDFE) resources, so to get started I just need to find the right one 🤓.…
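For example, a hedged sketch of calling a Resource Provider action through the Az module (the resource type, action and names are placeholders; the actual post may target the Classic Cloud Services provider instead):

```powershell
Invoke-AzResourceAction -ResourceGroupName 'my-rg' `
                        -ResourceType 'Microsoft.DocumentDB/databaseAccounts' `
                        -ResourceName 'my-cosmos-account' `
                        -Action 'listKeys' `
                        -ApiVersion '2019-08-01' `
                        -Force
```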
-
RTM (it does not mean what you think)
Well… it actually stands for… Read The Manual (no cuss words please 😉). I realized it while I was experimenting with some more Dynamic Parameters scenarios and playing with filters. It is a basic scenario: Get-ChildItem (or one of its forms) is a fairly common cmdlet I use every day without thinking too much about it, but it still surprised me. This is how I (and, I guess, most Powershell users) use Get-ChildItem: And if I’m looking for some specific files I can do: Easy enough. Anyway, Get-ChildItem offers more advanced filtering capabilities, so let’s say I want to get the list of txt files but also exclude file1.txt from the output: No files returned? 😲 Ok, let’s try to qualify the path (even though Get-ChildItem by default takes the current directory as -Path): Again no output, no matter if I pass “.” (current folder) or explicitly pass the full folder path to the -Path parameter 🤔 Well, let me try to explicitly include the files I want then: No matter the parameter combination I try, I cannot get the output I expect. Time to admit defeat and go back to the basics:…
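For the record, a hedged sketch of the behaviour in question: per the Get-ChildItem documentation, -Include only applies when the path ends with a wildcard (or -Recurse is used), which is consistent with what I was seeing (file names are just examples).

```powershell
Get-ChildItem -Path .   -Include *.txt -Exclude file1.txt            # no output
Get-ChildItem -Path .\* -Include *.txt -Exclude file1.txt            # works
Get-ChildItem -Path .   -Include *.txt -Exclude file1.txt -Recurse   # works, but recurses
```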
-
Dynamic Parameters discoverability
In my previous post About Dynamic Parameters I forgot to mention an important point about discoverability. When I come across a new script or module, usually the first thing I do is check its syntax to get an idea of the kind of arguments it can accept, like this: This concise syntax tells me, for example, that all parameters in the first ParameterSet are optional (each parameter and its type is enclosed in square brackets), meaning I can simply run Get-AzResource on an Azure Subscription and get the list of all available Resources. The second ParameterSet, on the other hand, requires at least the ResourceId parameter since it is not enclosed in square brackets; the other parameters are optional though, so I may or may not use them. And so on. Get-Help shows the script’s syntax too, along with additional help details if available: Dynamic Parameters are special though: As you can see, FolderPath is displayed as an optional parameter (expected) but there is no sign of FileName, which we know will be created at runtime. That is the core of the matter: FileName does not appear in the param declaration, therefore Powershell does not see this as a…
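A minimal sketch of the discoverability checks mentioned above (New-TestFile.ps1 is a placeholder name for the script with the dynamic FileName parameter, not necessarily the one used in the post):

```powershell
Get-Command Get-AzResource -Syntax        # concise syntax, one line per ParameterSet
Get-Command .\New-TestFile.ps1 -Syntax    # same check against a script
Get-Help .\New-TestFile.ps1 -Detailed     # full help, including parameter descriptions
```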
-
About Dynamic Parameters
A fundamental best practice for any programming or scripting language is: do not trust your input parameters; always validate the data users (but also other pieces of automation) pass into your program. It is easy to imagine how things can go badly wrong when a user by mistake passes a string where you are expecting an integer, or an array in place of a boolean, not to mention the security implications (and potential disaster) of, for example, accepting and running things such as a sql command or other shell commands malicious users may try to use to exploit your system. In Powershell we can use Parameter Validation attributes to check the format or the type of an input parameter, or check for null or empty strings, or that the passed value falls within a certain range, or force the user to pass only a value selected from a restricted list. This last type is called ValidateSet and allows the script author to decide the list of values the user can choose from and have Powershell throw an error if this constraint is not respected. I have used it often in my scripts and modules; this is what a very simple script looks like: [CmdletBinding()]param (…
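A minimal sketch of such a script, with illustrative parameter names and values (the excerpt cuts off before showing the original one):

```powershell
[CmdletBinding()]
param (
    [Parameter(Mandatory)]
    [ValidateSet('Dev', 'Test', 'Prod')]
    [string] $Environment
)

"Deploying to $Environment"
# Passing any value outside the set (e.g. -Environment Staging) makes Powershell
# throw a parameter-validation error before the script body even runs.
```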