PowerShell Core Web Cmdlets in Depth (Part 3)


Part 3 Intro

In Part 1, I covered the primary changes in the actual code base of the PowerShell Core Web Cmdlets Invoke-RestMethod and Invoke-WebRequest and how those changes manifest themselves in the PowerShell user experience.

In Part 2, I covered outstanding issues as well missing and/or deprecated features.

In Part 3, I will cover new features available in PowerShell Core 6.0.0 Invoke-RestMethod & Invoke-WebRequest. I will also cover future plans for the cmdlets.

If you have not read Part 1 and Part 2, please do so before reading Part 3. This blog series goes in depth and requires a great many words to do so. To save space I will not repeat some information and will assume the reader has read Part 1 and Part 2.

A quick bit of news: PowerShell Core v6.0.0-rc.2 was released. Unless any blocking issues are discovered, this will be the final RC release and the next release will be GA in January 2018.


PowerShell Core Web Cmdlets in Depth (Part 2)


Part 2 Intro

In Part 1, I covered the primary changes in the actual code base of the PowerShell Core Web Cmdlets Invoke-RestMethod and Invoke-WebRequest and how those changes manifest themselves in the PowerShell user experience.

In Part 2, I will cover outstanding issues as well missing and/or deprecated features. Some of this will be an extension of Part 1 because some of the feature reduction is due mostly to the switch to HttpClient. Any plans or fixes I mention in this section are tentative and may change when and/or if they are implemented.

If you have not read Part 1, please do so before reading Part 2. This blog series goes in depth and requires a great many words to do so. To save space I will not repeat some information and will assume the reader has read Part 1.

Also, this will be my first blog post as a new Microsoft MVP. I wanted to take a quick moment to thank the people who nominated me and the PowerShell community for your support in getting me here. It is a great honor and I am overjoyed! Thank you!


PowerShell Core Web Cmdlets in Depth (Part 1)



I recently spoke at the North Texas PC Users Group's PowerShell Special Interest Group on the topic of the Web Cmdlets in PowerShell Core. I spoke for a full hour because there is just so much new and different about Invoke-RestMethod and Invoke-WebRequest between Windows PowerShell 5.1 and PowerShell Core 6.0.0. In fact, because I was limited to an hour, I couldn't go as in depth or cover as many things as I would have liked. This blog series will cover what I covered in that presentation and more. At this time I plan to have 3 parts, with a possible 4th as an addendum should anything change between now and GA.


PowerSMells: PowerShell Code Smells (Part 1.5)



I am writing this next blog post in the series much sooner than I had anticipated. I wanted to do so in order to clarify a few things about PowerSMells and to address some of the negative feedback I have received. I will still include a PowerSMell, but this post is mostly to address shortcomings in Part 1 of the series.

Table of Contents for this series


PowerSMells: PowerShell Code Smells (Part 1)



A few weeks ago, someone in the Slack channel asked what some of the Code Smells are in PowerShell. If you are not familiar with the term Code Smell, a simple definition is code that, when you see it, you know will cause issues, just as when you smell something rotten you know you will find something unpleasant. After some joking around, the term PowerSMell was coined to describe a Code Smell particular to PowerShell.

This is the first in an ongoing series of Code Smells in PowerShell. Rather than make a massive single entry, I will try to cover only a few at a time and spread them out. Unlike my other series, this one won't be dumped all at once, and I will work on this series between other blog posts. I may never truly be done with this series, and I have no clue how many parts it will be. There are many Code Smells in PowerShell, and it seems like I'm finding more each week.

One possible plan for the future is to begin a community PowerSMell project with PSScriptAnalyzer rules for linting these PowerSMells. I know there are several projects out there that already have some of these, but it would be nice to get them all together. If you are interested in working together on this let me know and we can begin coordinating.

Each PowerSMell will include a "The Smell" and a "The Source" section. The "The Smell" section will have a code example of the PowerSMell. The "The Source" section will have information about the problems that the smell hints at. Some PowerSMells are not always PowerSMells. When a certain PowerSMell has a non-smelly usage, it will include a "The Pleasant Aroma" section providing an example of the non-smelly usage.

Table of Contents for this series


Why pwsh Was Chosen For PowerShell Core



With the release of PowerShell Core 6.0.0-beta.9 the binary for PowerShell Core was renamed from powershell.exe and powershell on Windows and Linux/Unix/macOS respectively to pwsh.exe and pwsh. This major change has drained the pitchfork and torch emporiums of their stock as PowerShell users from all over the globe have taken to internet forums, twitter, slack, and the PowerShell repo to voice their opposition and disbelief in this change. The responses range from minor annoyance to expletive heavy diatribes.

It's clear that many PowerShell users are not happy with this change.

So why did the PowerShell Team decide to make such an unpopular change? There are several reasons why the change was necessary and why pwsh was chosen. I will do my best to explain them all here.

As you read, please keep in mind that I'm not a Microsoft employee nor am I a member of the PowerShell Team. I'm just a community contributor with collaborator access to the GitHub repo. I don't have any special insider knowledge, but I do follow the repo and PowerShell Core very closely. I am speaking to you as a fellow member of the community as someone with a decent insight into the project, not as an outright authority on all things PowerShell.


New PowerShell Core Feature: Basic and OAuth Authentication for Invoke-WebRequest and Invoke-RestMethod



PowerShell has always supported Basic authentication on Invoke-WebRequest and Invoke-RestMethod via the -Credential parameter. Those of you who have tried to use it against any modern APIs are probably scratching your heads at what I just wrote. You probably know full well that the -Credential parameter does not send Basic authentication credentials when you expect it to. But it's true: Basic authentication has been there all along. The problem is that PowerShell will only send credentials if challenged.

This is a problem for accessing modern APIs, especially for requesting OAuth tokens from authentication endpoints. Modern APIs often do not issue 401 status codes with WWW-Authenticate headers. This results in the Web Cmdlets not sending credentials to the remote server even though they were supplied. Modern APIs expect you to present your credentials explicitly without being challenged first. This has resulted in a common frustration and feature request.

Additionally, the Web Cmdlets did not have any native support for OAuth bearer tokens. This means that all calls to APIs requiring OAuth tokens required passing an Authorization header. Since many OAuth grant flows require the client ID and client secret be sent as Basic authentication without a challenge, the entire OAuth process becomes a manual task.

I have run into this myself with the PSRAW and PSMSGraph projects. In both projects I have created wrapper functions around Invoke-WebRequest and Invoke-RestMethod that provide OAuth capabilities and handle the conversion of PSCredential to Basic authentication. I have long wished this functionality were native to the cmdlets.

I'm pleased to announce that beginning with PowerShell Core 6.0.0-Beta.9, Invoke-WebRequest and Invoke-RestMethod natively support explicit Basic and OAuth authentication.

If you want this functionality now, build the current master branch or pick up the nightly build. This functionality was added in Pull Request #5052.
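As a quick taste, here is a minimal sketch of the new parameters. The endpoint URLs below are placeholders, not real APIs; note that the cmdlets require HTTPS for explicit authentication unless you also pass -AllowUnencryptedAuthentication.

```powershell
# Explicit Basic authentication: the credentials are sent with the request
# immediately, without waiting for a 401 challenge.
$Credential = Get-Credential
Invoke-RestMethod -Uri 'https://httpbin.org/basic-auth/user/passwd' -Authentication Basic -Credential $Credential

# Explicit OAuth bearer token: -Token takes a SecureString
$Token = Read-Host -AsSecureString -Prompt 'Access Token'
Invoke-RestMethod -Uri 'https://api.example.com/v1/me' -Authentication Bearer -Token $Token
```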


-NoTypeInformation is Now Default For Export-Csv and ConvertTo-Csv in PowerShell Core



On Monday morning I opened the PowerShell slack and started to catch up on the conversations I missed. The most recent conversation that morning was someone, yet again, asking how to get rid of the annoying first line in the output from Export-Csv. This is such a common question and an issue that trips up many PowerShell novices. I've probably answered this question several dozen times on /r/PowerShell. It also caused me grief when I was just starting out with the language.

This prompted me to ask myself and the slack channel "Does anyone actually use the default behavior of including the type information?" My gut and the few users online at the time told me no. So I created an Issue in the PowerShell repo then asked for feedback from /r/PowerShell, Twitter, and Slack. The result was 175 thumbs up, no thumbs down, and not a single person coming out in support of the default behavior.

Since the change is a breaking change, the issue went before the PowerShell Committee Wednesday and was ultimately approved. I quickly made my PR and early this morning the PR was merged to master. That's right: Starting with PowerShell 6.0.0-Beta.9 -NoTypeInformation will be the default behavior on Export-Csv and ConvertTo-Csv.

If you want this feature sooner, rather than later, you can either build the current master or pick up tomorrow's nightly build.

I use ConvertTo-Csv for all of my examples in this post for convenience. There is little difference between it and Export-Csv besides the fact that one puts a string in the output stream and the other in a file. They both share the same base class where this change was implemented so they both share the same new behavior.
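Here is the change in practice; the object below is just a stand-in. (For anyone who wants the old header back, an -IncludeTypeInformation switch was, to my knowledge, also added during the 6.0 previews to opt back in.)

```powershell
$Object = [pscustomobject]@{ Name = 'pwsh'; Id = 1234 }

# Windows PowerShell 5.1 default output:
#   #TYPE System.Management.Automation.PSCustomObject
#   "Name","Id"
#   "pwsh","1234"

# PowerShell Core 6.0.0-Beta.9+ default output (no #TYPE line):
$Object | ConvertTo-Csv
# "Name","Id"
# "pwsh","1234"
```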


Hacktoberfest 2017 and PowerShell

Image source: https://hacktoberfest.digitalocean.com/


Hacktoberfest 2017 is upon us! Hacktoberfest is a month-long celebration of open source software held by DigitalOcean in partnership with GitHub. The gist is this:

  1. Go to https://hacktoberfest.digitalocean.com/
  2. Register with your GitHub account
  3. Make Pull requests in GitHub Repositories
  4. Make open source better
  5. Get a t-shirt (see site for details)

I posted about this recently on Reddit and I had many questions about what the bar is to become a contributor. I figured this was a fitting blog topic so I will cover it here.



New PowerShell Core Feature: Invoke-RestMethod -ResponseHeadersVariable

Pictured: The new -ResponseHeadersVariable parameter in action


I wanted to share with you another new feature added to the PowerShell Core Invoke-RestMethod Cmdlet: The -ResponseHeadersVariable parameter.

This feature will be available starting with PowerShell Core v6.0.0-beta.8 and is available now in the nightly builds or if you build it yourself from master.
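A minimal sketch of the parameter in use (api.github.com here is just a convenient public endpoint; the header names returned will vary by server):

```powershell
# -ResponseHeadersVariable takes a variable *name* without the $,
# just like -ErrorVariable and friends.
$Body = Invoke-RestMethod -Uri 'https://api.github.com/' -ResponseHeadersVariable 'Headers'

# $Headers is a dictionary mapping each header name to its value collection
$Headers['Content-Type']    # e.g. application/json; charset=utf-8
$Headers.Keys
```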


Multipart/form-data Support for Invoke-WebRequest and Invoke-RestMethod in PowerShell Core

Pictured: A packet capture of a current build of PowerShell Core submitting a multipart/form-data POST request from Invoke-WebRequest.


Over the past few months I have been donating a generous portion of my spare time to help improve the Web Cmdlets (Invoke-WebRequest and Invoke-RestMethod) in PowerShell Core. This is partly because I want and need certain functionality for both personal and work related projects. It is also because I have had some minor gripes about these Cmdlets for some time.

One common ask I have seen repeated in just about every PowerShell forum is multipart/form-data support. It seems like a reasonable thing to ask when there are many endpoints that will only work with a multipart/form-data submission. There is an open issue (#2112) on the PowerShell GitHub echoing the same request. It was brought to my attention and I decided to give it a serious look.

The result is that PowerShell Core now has partial multipart/form-data support in both Web Cmdlets. This change didn't make the cut for 6.0.0-beta.7 but it will be available starting in 6.0.0-beta.8 and is available now if you build it manually or grab the latest nightly build.

This blog will cover some of the challenges involved in supporting multipart/form-data, how to make use of this new feature, and about future plans for additional support.

Because typing multipart/form-data is annoying, I will be shortening it to just multipart. Please don't let this be mistaken for other multipart submission methods.

Also, I will be referring collectively to Invoke-WebRequest and Invoke-RestMethod as Web Cmdlets. In this case, there is no need to call out each command as they offer the same base functionality for multipart support.
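To set expectations for what "partial support" means in this first iteration: you build a System.Net.Http.MultipartFormDataContent object yourself and hand it to -Body. The field names, file path, and endpoint below are placeholders of my own choosing, not anything mandated by the cmdlets.

```powershell
# Build the multipart container and add a simple string field
$Multipart = [System.Net.Http.MultipartFormDataContent]::new()
$StringContent = [System.Net.Http.StringContent]::new('Mark')
$Multipart.Add($StringContent, 'Name')

# Add a file field from a stream
$FileStream = [System.IO.FileStream]::new('C:\temp\upload.txt', [System.IO.FileMode]::Open)
$FileContent = [System.Net.Http.StreamContent]::new($FileStream)
$FileContent.Headers.ContentType = [System.Net.Http.Headers.MediaTypeHeaderValue]::Parse('text/plain')
$Multipart.Add($FileContent, 'File', 'upload.txt')

# Hand the whole thing to -Body; the Web Cmdlets handle the rest
Invoke-WebRequest -Uri 'https://httpbin.org/post' -Method Post -Body $Multipart
$FileStream.Dispose()
```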


Porting PSRAW to PowerShell Core: Lessons Learned



I took a significant break from commits to my PSRAW project. I have spent that time learning more about Open Source projects and Object Oriented Programming. Before I could move the project forward I had some architectural decisions to make and I don't have quite enough knowledge to make those decisions yet, but I'm getting there.

On July 14th, this blog from Microsoft dropped a bit of a bombshell on the PowerShell community by making it clear the path forward is PowerShell Core. Windows PowerShell will still be developed/maintained, but the primary focus will be PowerShell Core. There was also a call to test out PowerShell Gallery modules. I put off the leap to Core for a while, but it seemed that now was the time. My module is still young and flexible, and I suspected that most of it would work on Core.

Shortly after that blog post, I unleashed the Kraken: I created a new branch in my local PSRAW repo named CoreRefactor, installed PowerShell Core 6.0.0-beta.4, switched VS Code to use PowerShell Core for the integrated terminal, and fired up my Pester tests. Thus flowed a glorious sea of red failures and errors. This kicked off 2 weeks of refactoring. I wanted to share what I have learned from the experience.

At the time of this post, PowerShell Core 6.0.0-beta.5 has just been released. I have just completed my testing against it, so everything in this post is at least relevant to that release.


Bye Bye Backtick: Natural Line Continuations in PowerShell


PowerShell is a language which lends itself to lengthy lines of code. Recommended Best Practice (RBP) is that functions and cmdlets, along with their parameters, be named descriptively and without abbreviation. It is also RBP to not use aliases for them in your code. Additionally, RBP is to use full names for variables in Pascal Case or Camel Case. The point of this practice is to make the code read more like a story and less like a cheap furniture construction manual. As far as RBPs go, I wholeheartedly agree with these. In addition to the verbosity of keywords, combining multiple commands in a pipeline will also greatly increase the number of characters per line.

PowerShell offers several ways to break up your lengthy lines into multiple shorter lines. Some of these are well known and others not so much. One of them many PowerShell users will encounter very early and it is the worst option. Most often this is the first method for multi-line commands a PowerShell novice learns and sometimes it is the only one. Thus, this worst possible method has ingrained itself into the annals of PowerShell history and permeates like a cancer.

I’m talking, here, about the backtick (or backquote, or grave accent). You know, this hard to see character:


Line continuation, word wrapping, line wrapping, and line breaking are all terms used to describe the process of splitting a single command across multiple lines. I prefer the term Line Continuation because I have only seen it used in the context of programming and not in typesetting or other crafts.

In this blog post I will cover why line length matters, why we should avoid the backtick for line continuations, and various alternative methods and strategies for reducing line length and splitting lines of code. This post should be suitable for new and experienced PowerShell users. Some terminology may be foreign to new users. The PowerShell major versions targeted by this post are 5 & 6. The methods described in this post may work in later versions or earlier versions but have not been tested or verified in those versions.
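As a preview of where this post is headed, compare the backtick with two of the alternatives it will advocate. The paths and commands here are illustrative only.

```powershell
# Backtick continuation: fragile, because a single trailing space
# after the backtick silently breaks the command.
Get-ChildItem -Path 'C:\Temp' `
    -Filter '*.log' `
    -Recurse

# Splatting: parameters live in a hashtable, no continuation character needed
$GetChildItemParams = @{
    Path    = 'C:\Temp'
    Filter  = '*.log'
    Recurse = $true
}
Get-ChildItem @GetChildItemParams

# Pipelines continue naturally after the | character
Get-ChildItem -Path 'C:\Temp' -Filter '*.log' -Recurse |
    Sort-Object -Property Length -Descending |
    Select-Object -First 5
```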


How Safe Are Your Strings?


Source: https://www.facebook.com/knitsforlife


A question about using plain text strings for passwords was recently asked on /r/PowerShell. The poster was making a wrapper for LastPass’s CLI and wanted to know if they should be using [System.Security.SecureString] objects. This question gets asked often and my stock answer is to always use [SecureString] objects to house secrets in memory regardless of how frequently the secret is converted from or to a plain text string.

My stock answer has had some pushback in the past. The problem is that when you do convert a [SecureString] to a normal string, that string object now exists in memory as plain text. If you know anything about how the CLR garbage collector works, you will know that the string may even hang around in memory long after the variable that housed it has been destroyed or overwritten in the code. Effectively, once you convert a [SecureString] to a normal string, the plain text secret can reside in memory until the program/script exits. The argument against using [SecureString] objects that will be converted to and from plain text is that it adds a level of complexity to the code for no effective gain.

This argument makes my eye twitch every time I see it. I’m a huge proponent of layered security and believe that security should be baked into every level of the stack at every chance possible. The idea is that we never know where our code will end up, and we do not want our code to be the weak link in the chain. While the blame rests with the person who uses your insecure code in their sensitive environment, I don’t think we are totally without fault if we didn’t make an effort to be secure in the first place.

This is especially true with password manager wrappers. I have reviewed no less than 40 PowerShell based password manager wrapper modules and scripts in the past 2 years. The overwhelming majority of them are not using [SecureString] objects. When the lack of [SecureString] usage is pointed out, that “inefficient complexity” argument invariably rears its head.

“Mark, [SecureString] objects should never be converted to plain text in the first place!” Let me remind you: PowerShell is glue. It is being used to glue together various APIs. Many of these APIs are not Windows native or local and therefore don’t accept [SecureString] objects. This makes it necessary to convert the [SecureString] to plain text and either submit it as plain text or encode it in some way. Also, sometimes we are accepting secrets from APIs, such as OAuth access tokens, and we don’t want these sitting around as plain text. It becomes necessary to convert back and forth quite a bit.
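Since this conversion dance comes up throughout the post, here is the shape it usually takes. The literal secret below is for demonstration only; real code would get its [SecureString] from Get-Credential or Read-Host -AsSecureString.

```powershell
# Demo only: in real code the SecureString would never come from a literal.
$SecureString = ConvertTo-SecureString -String 'P@ssw0rd' -AsPlainText -Force

# Decrypt into an unmanaged BSTR, copy to a managed string, then zero
# and free the unmanaged copy as soon as possible.
$BSTR = [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($SecureString)
try {
    $PlainText = [System.Runtime.InteropServices.Marshal]::PtrToStringBSTR($BSTR)
    # ... use $PlainText with the API that demands plain text ...
}
finally {
    # The managed $PlainText string still lingers until garbage collection,
    # which is exactly the caveat this post explores.
    [System.Runtime.InteropServices.Marshal]::ZeroFreeBSTR($BSTR)
}
```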

I wanted to see what can be done about this and to get a deeper understanding of the problem myself. In this post I will go in-depth with when and where [SecureString] objects give up their plain text secrets and how we can add some security around that process. This post will likely be a stretch for those new to PowerShell and is not intended as an introductory how-to.


Let’s Kill Write-Output


I think it is time we put down our old friend Write-Output for good. That’s right, Write-Output and not Write-Host. You should definitely not be using Write-Host outside of functions beginning with the Show- verb in console driven PowerShell apps, but I’m not here to kill off Write-Host. No, my target is Write-Output.

What is this craziness I’m speaking? Well, if I went back in time 10 months and told myself not to use Write-Output, I would think my future self had gone mad. In fact, my journey towards murdering Write-Output began around that time. There was a post on /r/PowerShell (sadly, I can’t find it in my history) where I was “correcting” someone for not using Write-Output. This prompted another user to link me to this discussion. At first I rejected the notion, but over time I came to embrace it. And today, I think I was crazy for not killing it off sooner.

What I’m saying here won’t be new or revolutionary. My only hope with this post is to expose others to the issues with Write-Output and contribute to its demise.

What’s my beef with Write-Output? There are three concerns I have about using Write-Output: Performance, “Security”, and a “False Sense of Security”.

Before you begin, please understand that this is not an introduction-level PowerShell topic. This post was written with experienced PowerShell users and those who provide guidance and training to novice PowerShell users in mind. Also, this post is not specifically about console output, but about the use of the Write-Output command in general and its common use for “putting objects in the pipeline” (e.g. to “return” objects from a function). Most of it applies whether the objects ultimately go to the console or not. The only section dealing with console output is the Text Output section. Even then, the output for the targeted scenarios is more often a log file than it is a console.
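To ground the discussion before diving in: the core observation is that Write-Output is redundant for putting objects in the pipeline. The function names below are my own for illustration.

```powershell
function Get-GreetingExplicit {
    Write-Output 'Hello, World!'   # routes the string through a full cmdlet invocation
}

function Get-GreetingImplicit {
    'Hello, World!'                # the object lands in the pipeline anyway
}

Get-GreetingExplicit   # Hello, World!
Get-GreetingImplicit   # Hello, World!
```

Both functions emit the identical object to the pipeline; the explicit version simply pays the overhead of an extra command invocation for every object.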


Permission Granted: Using SharePoint Online, Flow, Azure Automation and PowerShell to Automate OneDrive for Business Permission Requests


A few weeks ago a request came through to create a group that would have full access to all OneDrive for Business accounts in our Office 365 Tenant. I am patently against blanket access to things, even for administrators. It turns out the goal was to enable our Service Desk staff to manage users’ OneDrives as we ramp up our adoption rate through various “to the cloud” projects in the works.

We have a very small team who have admin rights to our SharePoint Online, and we are wary of granting frontline technicians admin rights to it, as it is a complex beast and there is sensitive data to consider. Currently, all requests to access another user’s OneDrive require an escalation to that small team. This creates a constraint that isn’t much of a problem today but will become one as our adoption rate grows.

I identified two user stories we needed to support:

  1. IT Technicians needing temporary access to a user’s OneDrive to assist them with various tasks
  2. Users needing to access the OneDrive of another user permanently (e.g. a manager needing access to the OneDrive of an employee on Extended leave)

We already have automation built into our leaver process which grants managers access to their leaver subordinate’s OneDrive for 30 days. Unfortunately, that functionality is tightly coupled with the leaver process so it can’t really be used for these two user stories.

I am a big fan of SharePoint lists. They make it really easy to build a web-based form and tracking mechanism that can support RBAC. I’m also big on PowerShell automation, and I have quite a bit of automation involving both in production. One thing I don’t like is that most of this automation runs on a schedule rather than on a trigger. I noticed that SharePoint Online now has triggers for Microsoft Flow and that Microsoft Flow added the ability to run Azure Automation Runbooks. So that’s where I decided to go.

This blog will cover the temporary admin access solution. It’s intended to be more of an overview and not a deep dive or tutorial. This is a PowerShell blog, but most of this post will be taken up by Flow as it is the glue of the solution. However, I won’t go into great detail of Flow either.


The Classy PlatyPS: Automated Class, Enum, and Private Function Documentation for PowerShell Module CI/CD Pipelines

 Picture source: http://newproductions.deviantart.com/art/platypus-even-more-gentlemanly-261415442


I have spent the past month working feverishly to release the core functionality of my PSRAW module. I started several blog posts to try and cover some of the things I was doing as I was doing them, but I ended up changing directions so often and quickly that trying to "blog as I go" became a wasted effort. But, I’m really excited about much of what the project has shaped up to be.

One feature of the project I am most proud of is what I’m calling "The Classy PlatyPS" auto-documentation feature. This is basically a total revamp of the auto-documentation system in my 4 Part Blog series. I planned to do another series on the entire new pipeline, and I likely will, but I’m just too excited about this feature to wait!

What is The Classy PlatyPS?

As a quick review, PlatyPS is a PowerShell module that makes it easy to maintain module help as markdown. This works great for your module's public functions but offers nothing in the way of generating and maintaining documentation for your module’s Classes, Enums, or Private Functions. It does, however, offer the ability to create and integrate about_ topics into your help system. But this would be a manual step to generate an about_ topic for each Class and Enum and then another manual task to update that topic when you add or remove properties, methods, constructors and fields.

The Classy PlatyPS adds automatic about_ topics for the Classes and Enums in your module. It also will add and remove sections for properties, methods, constructors, and fields as you add and remove the same from your Classes and Enums. It works in concert with PlatyPS so that these markdown files are added to the modules and thus available via Get-Help.  It also adds and maintains online documentation for Module Private Functions via PlatyPS.

On Module Classes

This really deserves a blog post of its own. I have started such a blog post, but it is more of a series because there are just so many possibilities and issues surrounding Classes in modules. So, rather than go into all the intricacies here, I will just promise that at some point I will deliver a post/series on this topic at length, and you will just have to take my word here for now.

For now, what you need to know is that The Classy PlatyPS requires a “new” way of using Classes and Enums in your module. First, the Class and Enum definitions must be in separate .ps1 files. The Class definition files will also need to begin with a 3-digit number to indicate their load order. Lower numbers will be loaded first, so if one class depends on another, the dependency needs to be loaded first. Second, the Class and Enum files need to be “exported” by the module by adding them to the ScriptsToProcess setting in the module manifest. Finally, they also need to be dot sourced in the .psm1.

Folder Layout:

```
.\PSRAW
│   PSRAW.psd1
│   PSRAW.psm1
├───Classes
│       001-RedditOAuthScope.ps1
│       002-RedditApplication.ps1
│       003-RedditOAuthCode.ps1
│       004-RedditOAuthToken.ps1
│       005-RedditApiResponse.ps1
└───Enums
        RedditApplicationType.ps1
        RedditOAuthDuration.ps1
        RedditOAuthGrantType.ps1
        RedditOAuthResponseType.ps1
```


```powershell
# ...Snip
ScriptsToProcess = @(
    'Enums\RedditApplicationType.ps1',
    'Enums\RedditOAuthDuration.ps1',
    'Enums\RedditOAuthGrantType.ps1',
    'Enums\RedditOAuthResponseType.ps1',
    'Classes\001-RedditOAuthScope.ps1',
    'Classes\002-RedditApplication.ps1',
    'Classes\003-RedditOAuthCode.ps1',
    'Classes\004-RedditOAuthToken.ps1',
    'Classes\005-RedditApiResponse.ps1'
)
# ...Snip
```


```powershell
# ...Snip
$functionFolders = @('Enums', 'Classes', 'Public', 'Private')
ForEach ($folder in $functionFolders) {
    $folderPath = Join-Path -Path $PSScriptRoot -ChildPath $folder
    If (Test-Path -Path $folderPath) {
        Write-Verbose -Message "Importing from $folder"
        $FunctionFiles = Get-ChildItem -Path $folderPath -Filter '*.ps1' -Recurse |
            Where-Object { $_.Name -notmatch '\.tests{0,1}\.ps1' }
        ForEach ($FunctionFile in $FunctionFiles) {
            Write-Verbose -Message "  Importing $($FunctionFile.BaseName)"
            . $($FunctionFile.FullName)
        }
    }
}
# ...Snip
```

One cool thing about this method is that the Classes and Enums will be automatically imported into the calling scope when Import-Module is called. Again, I plan to cover this more in depth in another blog post/series, as this too is another feature I'm pretty excited about. But it's necessary to understand that for The Classy PlatyPS to do its magic, you need to use this method.

The Classy PlatyPS BuildDocs

The Classy PlatyPS Believes Markdown is the One True God

If you recall from my 4-part series on auto-documentation, I was using a psake task named BuildDocs. I discovered some really interesting bugs while trying to use the same process from PSMSGraph (which used the dynamic type system instead of classes). First, I discovered that PlatyPS was not meant to use Comment Based Help as the source of truth. With PlatyPS, you are supposed to use the markdown as the source of truth. My first order of business was to adjust my process around that. You will notice a complete absence of Comment Based Help from function definitions. This is necessary if you want the external help generated by PlatyPS to work. Comment Based Help trumps External Help, so its presence causes some maintenance issues.

The Classy PlatyPS Gets a Job

The next bug I ran into is that PlatyPS has some weird scope issues where classes defined in the module aren't visible when it is parsing the module functions, but only when it is called from a psake task. I don't even know where to begin with this issue, but the gist of it is that since PlatyPS is ignorant of the classes, it starts to throw errors trying to process arguments for the public functions that take the module classes as parameters. Again, this only happens within the psake context. If you run the same exact code as a standalone script, it works fine.

The solution, then, was to create a separate script and run it as a job from psake. Dot sourcing, invoking with &, and Invoke-Command all have the same weird scope issue, for which I have no explanation. But Jobs run in a completely separate PowerShell context, so we now call the separate BuildDocs.ps1 script as a job.


```powershell
# ...Snip
Task BuildDocs -depends CodeCoverage {
    $lines
    Start-Job -FilePath "$ProjectRoot\BuildTools\BuildDocs.ps1" -ArgumentList @(
        $env:BHPSModuleManifest
        $ModuleName
        $MkdcosYmlHeader
        $ChangeLog
        $ProjectRoot
        $ModuleFolder
        $ReleaseNotes
        $true
        $true
        $true
    ) | Wait-Job | Receive-Job
    "`n"
}
# ...Snip
```

Putting the "Class" in The Classy PlatyPS

After I had the BuildDocs process separated out, I wanted to get the auto-documentation of Classes and Enums added to the process. The first problem I needed to solve was figuring out which Classes and Enums are being exported by the module. Like most things related to PowerShell v5 classes, there is no documentation on this, and I was pioneering new territory.

After some digging around in the types exported from my module, I discovered something that sets PowerShell v5 classes apart from regular .NET classes. It's hard to explain, so here is the RedditApplicationType Enum from my module as an example:

```powershell
PS C:\> [redditApplicationType].Assembly.Modules[0]

MDStreamVersion    : 131072
FullyQualifiedName :
ModuleVersionId    : 7b468bd7-15ff-4669-bbac-1213a8e12afb
MetadataToken      : 1
ScopeName          : C:\PSRAW\PSRAW\Enums\RedditApplicationType.ps1
Name               :
Assembly           : C:\PSRAW\PSRAW\Enums\RedditApplicationType.ps1, Version=, Culture=neutral, PublicKeyToken=null
CustomAttributes   : {}
ModuleHandle       : System.ModuleHandle
```

You can see that the ScopeName has a funky version of the script path where the Enum is defined. If you look at the ScopeName of other .NET classes, you get something closer to the DLL they are in:

```powershell
PS C:\> [datetime].Assembly.Modules[0]

MDStreamVersion    : 131072
FullyQualifiedName : C:\Windows\Microsoft.NET\Framework64\v4.0.30319\mscorlib.dll
ModuleVersionId    : d401a7e1-a31e-47e3-87e3-cb92c12b437f
MetadataToken      : 1
ScopeName          : CommonLanguageRuntimeLibrary
Name               : mscorlib.dll
Assembly           : mscorlib, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089
CustomAttributes   : {[System.Security.UnverifiableCodeAttribute()]}
ModuleHandle       : System.ModuleHandle
```

PS C:\> [System.Web.HttpUtility].Assembly.Modules[0]
MDStreamVersion    : 131072
FullyQualifiedName : C:\WINDOWS\Microsoft.Net\assembly\GAC_64\System.Web\v4.0_4.0.0.0__b03f5f7f11d50a3a\System.Web.dll
ModuleVersionId    : f914f5be-9ef1-4be0-afc3-3c88d39da087
MetadataToken      : 1
ScopeName          : System.Web.dll
Name               : System.Web.dll
Assembly           : System.Web, Version=, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a
CustomAttributes   : {[System.Security.UnverifiableCodeAttribute()]}
ModuleHandle       : System.ModuleHandle

This means that it is possible to find the module Classes and Enums using reflection. This is great! This means that not only can we detect the module classes for the purposes of documenting them, but also for testing. I created a Get-ModuleClass function to help with this:

function Get-ModuleClass {
    [CmdletBinding()]
    param (
        [Parameter(
            Mandatory = $true,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true
        )]
        [string[]]
        $ModuleName
    )
    process {
        foreach ($Name in $ModuleName) {
            $Module = $null
            Write-Verbose "Processing Module '$Name'"
            $Module = Get-Module -Name $Name -ErrorAction SilentlyContinue
            if (-not $Module) {
                Write-Error "Module '$Name' not found"
                continue
            }
            [System.AppDomain]::CurrentDomain.
                GetAssemblies().
                GetTypes().
                Where( { $_.Assembly.Modules[0].ScopeName -match "$Name" -and $_.IsPublic })
        }
    }
}

You supply the module name and it returns all the Classes and Enums associated with the module. This does require that export method I detailed before is used and that the module is already imported in the calling scope.
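As a quick sketch of what calling it looks like (the property selection here is illustrative, not part of the function):

```powershell
# The module must be imported first so its dynamic assemblies are loaded
Import-Module PSRAW

# List the Classes and Enums associated with the module
Get-ModuleClass -ModuleName 'PSRAW' |
    Select-Object Name, IsClass, IsEnum
```

Because the output is plain System.Type objects, the same call feeds both the documentation pipeline and any type-based Pester tests.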

The Classy PlatyPS Embraces Change

Need to add a new Class or Enum? Need to add a new Property, Method, Constructor, or Field? Sick of manually documenting all those changes? Never fear, The Classy PlatyPS will cover all those bases.

The Classy PlatyPS Creates Templates for New Classes

PlatyPS includes a template for about_ topics. This became the template for my templates. I had originally planned to use Plaster for this, but there are too many moving parts for Plaster to handle. Using reflection, The Classy PlatyPS looks at Classes and Enums and generates a PlatyPS-compatible Markdown template containing entries for all Properties, Methods, and Constructors. It provides information such as their data types, whether they are hidden or static, their definitions, etc. It looks very similar to MSDN documentation for .NET classes. The best example of this is the RedditApplication class.

The code for the reflection is a bit too lengthy to post here. For the full code, see BuildDocs-Helper.ps1. As an example, here is the MethodText function:

Function MethodText {
    [cmdletbinding()]
    param(
        [Parameter(
            Mandatory = $true,
            Position = 0,
            ValueFromPipeline = $true
        )]
        [Object[]]
        $Method
    )
    process {
        foreach ($MyMethod in ($Method | Sort-Object Name)) {
            $Params = ($MyMethod.GetParameters() | ForEach-Object {
                    $Type = $_.ParameterType -replace '^System\.([^.]*)$', '$1'
                    "{0} {1}" -f $Type, $_.Name
                }) -Join ", "
            $Access = ''
            $IsHidden = $False
            if ($MyMethod.CustomAttributes.AttributeType.Name -contains 'HiddenAttribute') {
                $Access = 'hidden '
                $IsHidden = $true
            }
            $IsStatic = $MyMethod.IsStatic
            $Scope = ''
            if ($MyMethod.IsStatic) {
                $Scope = 'static '
            }
            $Name = $MyMethod.Name
            $ReturnType = $MyMethod.ReturnType.FullName -replace '^System\.([^.]*)$', '$1'
            $Definition = "{0}{1}{2} {3}({4})" -f $Scope, $Access, $ReturnType, $Name, $Params
            $Heading = MethodHeading $MyMethod
            $ExecutionContext.InvokeCommand.ExpandString($MethodTemplate)
        }
    }
}

It takes a System.Reflection.RuntimeMethodInfo object. You can get the methods of a class with something like this:

$Class = [RedditApplication]
$Methods = $Class.GetMethods() | Where-Object { $_.IsSpecialName -eq $false }

There is some logic in place to sort methods, properties, fields, constructors alphabetically and by number of arguments.
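As a minimal sketch of the arity-based ordering (this is a standalone illustration, not the exact sort used in the build script):

```powershell
# Sort a class's constructors by number of parameters, parameterless first
$Class = [RedditApplication]
$Class.GetConstructors() |
    Sort-Object { $_.GetParameters().Count } |
    ForEach-Object { $_.ToString() }
```

The same Sort-Object pattern works for methods and properties by swapping in GetMethods() or GetProperties().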

The Classy PlatyPS Adds and Removes Documentation as Code Changes

This piece is new. I added it after the PSRAW release, but I began working on it a few weeks ago. I wanted to make it easy to keep the Class and Enum documentation up to date as I make changes. Some of these classes are still not in their final form and will have more properties and methods added as the project grows. The problem, however, is parsing Markdown.

The Classy PlatyPS Puts the Smack-down on Markdown

I first thought of approaching this problem by re-engineering PlatyPS, as it contains code to convert Markdown to something code-readable. But after about 3 hours in the PlatyPS code, I realized I was in over my head and that it would be quicker to use either a ready-made PowerShell Markdown parsing module or a C# dll. My searches revealed that while there are tools out there, they all lack documentation. So, it was time to make my own purpose-built Markdown parsing engine.

I created a ConvertFrom-Markdown function which returns a PowerShell object representation of a Markdown file. It's not an overly complex representation like the HTML DOM or XML; it just nests headings, subheadings, and their inner text.

function ConvertFrom-MarkDown {
    [CmdletBinding()]
    param (
        [Parameter(
            Mandatory = $true,
            Position = 0,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true,
            HelpMessage = "Path to one or more locations."
        )]
        [Alias("PSPath")]
        [ValidateNotNullOrEmpty()]
        [string[]]
        $Path
    )
    process {
        foreach ($MyPath in $Path) {
            $Object = [System.Collections.Generic.List[System.Collections.Hashtable]]::new()
            $Headingindex = -1
            Get-Content -Path $MyPath | ForEach-Object {
                $Line = $_
                if ($Line -match '^\s*#[^#]') {
                    $Object.add(@{
                            Heading     = $Line -replace '^\s*#\s*'
                            Text        = [System.Collections.Generic.List[System.String]]::new()
                            Subheadings = [System.Collections.Generic.List[System.Collections.Hashtable]]::new()
                        })
                    $SubheadingIndex = -1
                    $Headingindex++
                    return
                }
                if ($Line -match '^\s*##[^#]') {
                    $Object[$Headingindex].Subheadings.add(@{
                            Heading = $Line -replace '^\s*##\s*'
                            Text    = [System.Collections.Generic.List[System.String]]::new()
                        })
                    $SubheadingIndex++
                    return
                }
                if ($SubheadingIndex -ge 0) {
                    $Object[$Headingindex].Subheadings[$SubheadingIndex].Text += $Line
                    return
                }
                $Object[$Headingindex].Text += $Line
            }
            $Object
        }
    }
}

If you had the following Markdown:

# Heading1
Heading1 text
## Subheading1
subheading1 text
## Subheading2
subheading2 text
# Heading2
Heading2 text
## Subheading3
subheading3 text
## Subheading4
subheading4 text

It would create the following PowerShell object:

@(
    @{
        Heading     = 'Heading1'
        Text        = 'Heading1 text'
        Subheadings = @(
            @{
                Heading     = 'Subheading1'
                Text        = 'subheading1 text'
                Subheadings = @()
            }
            @{
                Heading     = 'Subheading2'
                Text        = 'subheading2 text'
                Subheadings = @()
            }
        )
    }
    @{
        Heading     = 'Heading2'
        Text        = 'Heading2 text'
        Subheadings = @(
            @{
                Heading     = 'Subheading3'
                Text        = 'subheading3 text'
                Subheadings = @()
            }
            @{
                Heading     = 'Subheading4'
                Text        = 'subheading4 text'
                Subheadings = @()
            }
        )
    }
)

Conversely, I made a ConvertTo-Markdown function which converts such an object back into Markdown. It also handles the sorting of properties, methods, constructors, and fields.

function ConvertTo-MarkDown {
    [CmdletBinding()]
    param (
        [Parameter(
            Mandatory = $true,
            Position = 0,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true
        )]
        [ValidateNotNullOrEmpty()]
        [System.Collections.Generic.List[System.Collections.Hashtable]]
        $InputObject
    )
    process {
        foreach ($Heading in $InputObject) {
            "# {0}" -f $Heading.Heading
            $Heading.Text
            $Subheadings = $Heading.Subheadings
            if ($Heading.Heading -match 'Properties|Methods|Fields') {
                $Subheadings = $Heading.Subheadings | Sort-Object { $_.Heading }
            }
            if ($Heading.Heading -match 'Constructors') {
                $Subheadings = $Heading.Subheadings | Sort-Object -Property @{
                    Expression = {
                        if ($_.Heading -match '\(\)') { -1 }
                        else { ($_.Heading -split ',').count }
                    }
                    Ascending  = $true
                }
            }
            foreach ($Subheading in $Subheadings) {
                "## {0}" -f $Subheading.Heading
                $Subheading.Text
            }
        }
    }
}
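The two functions are designed to round-trip. A sketch of the edit-in-object-form workflow might look like this (the file path is illustrative; the edit itself is just an example, not part of the build):

```powershell
# Parse an existing about_ topic into the nested heading object
$Doc = ConvertFrom-MarkDown -Path '.\docs\Module\about_RedditApplication.md'

# Tweak it in object form, e.g. append a line to the first heading's text
$Doc[0].Text += 'A new sentence added in object form.'

# Render it back to Markdown and save it
ConvertTo-MarkDown -InputObject $Doc |
    Set-Content -Path '.\docs\Module\about_RedditApplication.md'
```

Working at the object level like this is what makes comparing and merging documentation changes tractable.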

The Classy PlatyPS Puts It All Together

Being able to manage Markdown as an object makes it easy to see what has and has not been put in the documentation. We just need to compare "what was" with "what is" and generate "what ought to be". For that I created an Update-ClassMarkdown function. It's too lengthy to post here, but you can see it in BuildDocs-Helper.ps1.

Finally, BuildDocs combines it all:

if ($ClassDocs) {
    "Processing Classes..."
    $Classes = Get-ModuleClass -ModuleName $ModuleName | Where-Object { $_.IsClass }
    $AboutHelpDocs = Get-ChildItem -Path $ModuleHelpPath -Filter 'about_*.md'
    foreach ($Class in $Classes) {
        $HelpDoc = $AboutHelpDocs | Where-Object { $_.basename -like "about_$($Class.Name)" }
        if ($HelpDoc) {
            Update-ClassMarkdown -Class $Class -Path $HelpDoc.FullName
            continue
        }
        $AboutPath = Join-Path $ModuleHelpPath "about_$($Class.Name).md"
        Classtext $Class | Set-Content -Path $AboutPath
    }
}

There are similar patterns in place for Enums. They are only separated from Classes so that the templates can say "Enum" instead of "Class". Enum objects have Fields instead of Properties and, of course, those fields don't have any special data type. So the Enum code is pretty boring compared to the Class code, but it uses much of the same reflection techniques.

The Classy PlatyPS Scoffs at Privacy

PSRAW will be a community project. As such, we need good documentation for the module's Private Functions just as much as for its Public Functions. That way contributors can hit the ground running without having to dig deeply into the source to figure out the inner workings. PlatyPS does not include any out-of-the-box support for module Private Functions, but it does support automated updating of documentation for functions in general.

The Classy PlatyPS Demands to See Your Privates

The first problem to overcome is getting a list of private functions. Most projects, including mine, use a Private folder in the module root to store them. But some projects may have private functions defined in other locations, including the .psm1. I wanted a universal way to get private functions from a module. I didn't know where to start, but then I realized that Pester has this functionality for mocking functions. Digging around the source for Mock, I discovered something really neat I did not know about.

If you store the result of Get-Module into a variable you can execute commands in the module's context like this:

$Module = Get-Module PSRAW
& $Module { Get-Command -CommandType Function } | Where-Object { $_.Source -like 'PSRAW' }

The result will include all the functions the module sees in its own context. That includes both public and private functions! I created a Get-ModulePrivateFunction function to make this easy.

function Get-ModulePrivateFunction {
    [CmdletBinding()]
    [OutputType([System.Management.Automation.FunctionInfo])]
    param (
        [Parameter(
            Mandatory = $true,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true
        )]
        [string[]]
        $ModuleName
    )
    process {
        foreach ($Name in $ModuleName) {
            $Module = $null
            Write-Verbose "Processing Module '$Name'"
            $Module = Get-Module -Name $Name -ErrorAction SilentlyContinue
            if (-not $Module) {
                Write-Error "Module '$Name' not found"
                continue
            }
            $ScriptBlock = {
                $ExecutionContext.InvokeCommand.GetCommands('*', 'Function', $true)
            }
            $PublicFunctions = $Module.ExportedCommands.GetEnumerator() |
                Select-Object -ExpandProperty Value |
                Select-Object -ExpandProperty Name
            & $Module $ScriptBlock |
                Where-Object { $_.Source -eq $Name -and $_.Name -notin $PublicFunctions }
        }
    }
}

This means we can now use them in documentation and testing!!! I put both Get-ModuleClass and Get-ModulePrivateFunction in the ModuleData-Helper.ps1.
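As a quick illustration of what the two helpers make possible together (the output formatting here is my own example, not part of the build):

```powershell
# Inventory a module's public/private function split (module assumed imported)
$Private = Get-ModulePrivateFunction -ModuleName 'PSRAW'
$Public  = (Get-Module 'PSRAW').ExportedFunctions.Keys

"{0} public functions, {1} private functions" -f $Public.Count, $Private.Count
```

Because both helpers return real FunctionInfo and Type objects, the same calls can drive generated Pester tests as easily as generated documentation.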

The Classy PlatyPS Documents Your Privates

Now that we have a way to get a list of private functions in a module, we need to make PlatyPS able to document them. Unfortunately, we can't just pass functions returned from Get-Command to PlatyPS and get documentation; the functions need to exist in memory. I had hoped that I could call PlatyPS from within the module scope, but it didn't work. The simple solution is to load the definitions returned by Get-ModulePrivateFunction into memory and then call PlatyPS.


# ...Snip
If ($PrivateDocs) {
    "Processing Private Functions..."
    $PrivateFunctions = Get-ModulePrivateFunction -ModuleName $ModuleName
    $PrivateHelp = Get-ChildItem $PrivateHelpPath -Filter '*.md' -ErrorAction SilentlyContinue
    foreach ($PrivateFunction in $PrivateFunctions) {
        $HelpDoc = $PrivateHelp | Where-Object { $_.basename -like $PrivateFunction.Name }
        $FunctionDefinition = "Function {0} {{ {1} }}" -f $PrivateFunction.name, $PrivateFunction.Definition
        . ([scriptblock]::Create($FunctionDefinition))
        if (-not $HelpDoc) {
            $Params = @{
                Command               = $PrivateFunction.name
                Force                 = $true
                AlphabeticParamsOrder = $true
                OutputFolder          = $PrivateHelpPath
                WarningAction         = 'SilentlyContinue'
            }
            New-MarkdownHelp @Params
        }
        $Params = @{
            Path                  = "$PrivateHelpPath\{0}.md" -f $PrivateFunction.Name
            AlphabeticParamsOrder = $true
            WarningAction         = 'SilentlyContinue'
        }
        Update-MarkdownHelp @Params
        Remove-Item "function:\$($PrivateFunction.name)" -ErrorAction SilentlyContinue
    }
}
# Snip...

Why not just dot source the files? Well, how would we find what functions were exported without also doing an AST scan of the files? Also, dot sourcing the .psm1 could be troublesome depending on what all it does. We don't want to execute code so much as we want to create function definitions and get them loaded into memory.

The Classy PlatyPS Results

The BuildDocs script will create several files under the project's \docs folder. The \docs\Module folder contains the Markdown documentation for Classes, Enums, and Module Public Functions. The \docs\PrivateFunctions folder contains the Markdown documentation for Module Private Functions.

\DOCS
├───Module
│       about_RedditApiResponse.md
│       about_RedditApplication.md
│       about_RedditApplicationType.md
│       about_RedditOAuthCode.md
│       about_RedditOAuthDuration.md
│       about_RedditOAuthGrantType.md
│       about_RedditOAuthResponseType.md
│       about_RedditOAuthScope.md
│       about_RedditOAuthToken.md
│       Export-RedditApplication.md
│       Export-RedditOAuthToken.md
│       Get-RedditOAuthScope.md
│       Import-RedditApplication.md
│       Import-RedditOAuthToken.md
│       Invoke-RedditRequest.md
│       New-RedditApplication.md
│       PSRAW.md
│       Request-RedditOAuthToken.md
│       Update-RedditOAuthToken.md
└───PrivateFunctions
        Get-AuthorizationHeader.md
        Request-RedditOAuthCode.md
        Request-RedditOAuthTokenClient.md
        Request-RedditOAuthTokenCode.md
        Request-RedditOAuthTokenImplicit.md
        Request-RedditOAuthTokeninstalled.md
        Request-RedditOAuthTokenPassword.md
        Request-RedditOAuthTokenRefresh.md
        Show-RedditOAuthWindow.md
        Wait-RedditApiRateLimit.md

It will also generate the external help. I'm using the en-US culture, so this is created under the \PSRAW\en-US folder.

\PSRAW\EN-US
        about_RedditApiResponse.help.txt
        about_RedditApplication.help.txt
        about_RedditApplicationType.help.txt
        about_RedditOAuthCode.help.txt
        about_RedditOAuthDuration.help.txt
        about_RedditOAuthGrantType.help.txt
        about_RedditOAuthResponseType.help.txt
        about_RedditOAuthScope.help.txt
        about_RedditOAuthToken.help.txt
        PSRAW-help.xml

The about_ topics each have their own .txt file, and the Public Functions are in MAML format in PSRAW-help.xml. Private Functions do not get added to the module's external help, as they are not normally accessible from the console and therefore don't need in-console documentation.


This has been a blast to work on. Trying to figure this all out really expanded my understanding of PowerShell. It also exposed some painful shortcomings, both within PowerShell and within my own comprehension. In any event, I have been excited to share this piece of my CI/CD pipeline. It's not something I'm seeing done in any other projects.

It's still a bit immature. The sorting can definitely use some work; I skimped on it because I was rushing to get the release out. However, as it stands now, most of the auto-documentation of Classes, Enums, and Private Functions "just works"! It may not be perfect, but it is functional beyond my initial expectations.

I only hope that this makes contributing to the PSRAW project easier. I have a full documentation requirement that will fail tests if any Function (public or private), Parameter, Class, Enum, Property, Method, Constructor, Field, or link is missing. The Classy PlatyPS should aid contributors by providing the scaffolding. All they have to do is fill in the blanks!