Azure-functions-host: Feature planning: first class PowerShell support

Created on 5 May 2016 · 19 comments · Source: Azure/azure-functions-host

This is a tracking item for First Class PowerShell support. We'll be tracking the requirements and issues for first class PowerShell support via this item. Feel free to leave comments about features/design patterns.

Function format

For the Azure Function PowerShell format, we'll keep the existing "scripting" model and limit the run file to the .ps1 format. A future feature will address .psm1 support with a proper "function" model.

This means we use the existing pattern of communicating data via files & environment variables. It also means we don't "cache" the environment the script runs in, which carries an inherent performance overhead; that overhead is likely acceptable for the scenarios where PowerShell scripting will be used. More advanced scenarios will need to be addressed with .psm1 support.

The scripting format, as it stands, looks like this:

# Read the queue message from the file referenced by the 'input' binding
$in = Get-Content $Env:input

[Console]::WriteLine("PowerShell script processed queue message '$in'")

# Build a table entity from the message and write it to the file
# referenced by the 'output' binding
$output = $Env:output
$json = $in | ConvertFrom-Json
$entity = [string]::Format('{{ "timestamp": "{0}", "status": 0, "title": "PowerShell Table Entity for message {1}" }}', $(Get-Date -Format MM-dd-yyyy_HH_mm_ss), $json.id)
$entity | Out-File -Encoding Ascii $output

Breaking this down: data coming in (via triggers and input bindings) is passed along in files whose paths are handed to the script through environment variables named after the name property of the corresponding binding in function.json. Data going out works the same way: anything destined for an output binding is written to a local file whose path comes from the environment variable matching that binding's name in function.json.
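
For illustration only, a minimal function.json for the sample script above might look roughly like this (the trigger and output types are incidental; the point is that the binding name values, input and output, surface as the $Env:input and $Env:output variables):

{
  "bindings": [
    {
      "type": "queueTrigger",
      "direction": "in",
      "name": "input",
      "queueName": "samples-powershell",
      "connection": "AzureWebJobsStorage"
    },
    {
      "type": "table",
      "direction": "out",
      "name": "output",
      "tableName": "samples",
      "connection": "AzureWebJobsStorage"
    }
  ]
}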

Data Type formats

All data is transferred via files, so it is up to the user to parse the file contents into the right format.

Assuming the user knows what format the data in the file is in, all formats should be supportable.

  • String

    • File contents as is, assuming UTF-8 encoding

    • input example: [string]$str = Get-Content $Env:input

    • output example: $str > $Env:output

  • Int

    • File contents, assuming the file only contains a number

    • input example: [int]$int = Get-Content $Env:input

    • output example: $int > $Env:output

  • Bool

    • File contents, assuming the file only contains 0 or 1

    • input example: [bool]$bool = [int](Get-Content $Env:input)

    • output example: $bool > $Env:output

  • Object/JSON

    • File contents, assuming it's valid JSON

    • input example: $json = Get-Content $Env:input | ConvertFrom-Json

    • output example: $json | ConvertTo-Json > $Env:output

  • Binary/Buffer

    • File contents

    • input example: [byte[]] $byte = Get-Content $Env:input -Encoding Byte

    • output example: $byte | Set-Content $Env:output -Encoding Byte

  • Stream (via file stream)

    • File contents

    • input example: $reader = [System.IO.File]::OpenText($Env:input)

    • output example: $writer = [System.IO.StreamWriter] $Env:output (a fuller sketch follows this list)

  • HTTP

    • _TBD_
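
Tying the Stream case together, here is a minimal sketch (assuming the same $Env:input / $Env:output convention as the other examples) that reads the input file line by line and writes a transformed copy to the output file, disposing of both streams when done:

$reader = [System.IO.File]::OpenText($Env:input)
$writer = [System.IO.StreamWriter] $Env:output
try {
    while ($null -ne ($line = $reader.ReadLine())) {
        # Illustrative transformation: upper-case each line before writing it out
        $writer.WriteLine($line.ToUpper())
    }
}
finally {
    # Release the file handles so the host can pick up the output file
    $reader.Dispose()
    $writer.Dispose()
}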

Version & Package management

TBD

Testing/CI

TBD

Change log

  • 5/5 - Initial plan

All 19 comments

Need to make sure that the new model solves the log streaming issue described in this thread.

@christopheranderson first sentence talks about both PowerShell & Python. Copy/paste error I assume? :)

Yes. That's what happened. Nice catch.

I noticed the current version runs on PowerShell v4. I'd really like to see that get bumped to V5.

Need to have a means of reporting a failed execution. Related issue: https://github.com/Azure/azure-webjobs-sdk-script/issues/371

Need docs around external .dlls/modules/etc. for powershell - Related issue: https://github.com/Azure/azure-webjobs-sdk-script/issues/372

@tohling Does your new PowerShell support address the issues @trondhindenes is raising?

Per @christopheranderson's point, I would love more docs. Currently experiencing problems with (a) executing an exe and (b) using a module.

I'm a bit confused. This is clearly not First Class PowerShell support -- it appears that you've created a way to support "any executable" by routing IO through the file system. That is, it looks like you're trying to create a single method for providing minimal support for PowerShell, Python, Perl, Ruby, et al.

Compared to, say, the C# implementation, there's nothing "first class" about this -- you might as well just offer "command-line app" support and let us provide files and command lines.

Why can't you use the PowerShell API, host it, and pass a parameter with the request object, collecting the output the same as you would from C# (example in the closed 309 above)?

@Jaykul, your observation is correct. This feature is intended to support customers who want to write Azure Functions using PowerShell cmdlets. As such, the user experience will try to mimic as close as possible the command-line and scripting workflows on the PowerShell.exe console.

When you mentioned your preference for the PowerShell API, are you referring to using _System.Management.Automation.dll_ and writing C# code that calls PowerShell APIs, as shown in the example here? If that is the case, you can author your code in a C# Function.

Here is a simplified example of a C# HTTP-triggered Function that uses the PowerShell API.

Sample code:

#r "System.Management.Automation"

using System.Net;
using System.Collections.ObjectModel;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info($"C# HTTP trigger function processed a request. RequestUri={req.RequestUri}");

    // parse query parameter
    string name = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
        .Value;

    var ps = System.Management.Automation.PowerShell.Create();
    var results = ps.AddCommand("Get-Process").Invoke();
    foreach (var result in results)
    {
        log.Info($"Get-Process returned: {result}");
    }

    ps.Commands.Clear();
    results = ps.AddCommand("Get-Date").Invoke();
    var date = results.First();
    log.Info($"Get-Date returned: {date}");
    ps.Commands.Clear();

    // Get request body
    dynamic data = await req.Content.ReadAsAsync<object>();

    // Set name to query string or body data
    name = name ?? data?.name;

    return name == null
        ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string or in the request body")
        : req.CreateResponse(HttpStatusCode.OK, "Hello " + name + ". Today's date is: " + date);
}

Sample log output:

2016-09-30T09:25:12  Welcome, you are now connected to log-streaming service.
2016-09-30T09:25:15.758 Function started (Id=092a6ae3-99b0-46bb-a938-ce0c5bcfb4c6)
2016-09-30T09:25:15.758 C# HTTP trigger function processed a request. RequestUri=https://functions56772475.azurewebsites.net/api/powershellcsharp
2016-09-30T09:25:17.071 Get-Process returned: System.Diagnostics.Process (w3wp)
2016-09-30T09:25:17.071 Get-Date returned: 9/30/2016 9:25:17 AM
2016-09-30T09:25:17.084 Function completed (Success, Id=092a6ae3-99b0-46bb-a938-ce0c5bcfb4c6)

Sample HTTP response message:
"Hello Azure. Today's date is: 9/30/2016 9:25:17 AM"

I was thinking more about _azure_ doing the C# part, and letting people write PowerShell functions that would, to quote you:

mimic as close as possible the command-line and scripting workflows on the PowerShell.exe console.

As a general rule, when people are writing functions for the PowerShell console, it would be an anti-pattern to use file IO for parameters _or_ output. Instead, functions should take parameters and output objects.

In my ideal world, your sample PowerShell template function would look more like this:

param($request)
Write-Verbose "PowerShell HTTP trigger function processed a request. RequestUri=$($request.RequestUri)"
<# ... do stuff #>
$name = $request.GetQueryNameValuePairs().Where({$_.Key -eq "Name" }).Value

return $request.CreateResponse("OK", "Hello $name")

But obviously that requires some code a little like what you wrote in your C# example -- I don't think people should have to write that themselves _nor_ settle for having to serialize and deserialize through JSON on disk ...

@Jaykul, thank you for the clarification. I understand now that your expectation was that the user experience would be more akin to a PowerShell function. The term "PowerShell function" is overloaded at this point. We will keep this in mind and update our documentation to make the distinction clearer.

Unfortunately, the ideal workflow you suggested is not a supported scenario at this 'Experimental' stage. We appreciate this feedback and will add it as a consideration for our future planning.

I agree with @Jaykul. Their example is much more compelling than yours, @tohling

@aharpervc, thank you for your input. We have added this request to our list and hope to support it in our future release.

Just throwing in my 2c. I write PowerShell to work with SharePoint and Office 365. These are scripts originally intended to be run from a client PC. I then upload them (with assemblies & modules) to Azure Functions and they run just as they would locally, but now in the cloud.

So, I think mirroring this use case for the experimental stage is perfect. It works exactly as I expect. Whether or not I get pure PowerShell Functions with proper object streaming is almost secondary. As long as the documentation makes it clear what's being passed in/out, I think it's fine.

Would it be possible in v2 to support object streaming via a different binding syntax?

A different point - how would one call one PowerShell azure-function from another?

@johnnliu, thank you for trying out PowerShell in Functions.

Q1: Would it be possible in v2 to support object streaming via a different binding syntax?
I want to be sure I understand your request correctly. Could you elaborate with an example?

Q2: A different point - how would one call one PowerShell azure-function from another?
All PowerShell Functions can be activated to execute with one of the supported triggers. For instance, Function B could be set up as an HTTP-triggered Function and Function A could call Function B with an HTTP request.
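
For illustration, a call from Function A to an HTTP-triggered Function B might look like the sketch below; the URL and the code query parameter (Function B's function key) are placeholders, not real values:

# Hypothetical endpoint and function key for Function B -- replace with real values
$uri  = "https://myfunctionapp.azurewebsites.net/api/FunctionB?code=<function-key>"
$body = @{ name = "Azure" } | ConvertTo-Json

# Invoke Function B over HTTP and capture its response
$response = Invoke-RestMethod -Method Post -Uri $uri -Body $body -ContentType "application/json"
Write-Output "Function B replied: $response"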

Q2: calling Azure Function from another Azure Function is a conversation for another thread I think.

Q1: object streaming as parameter or output is the same issue that was raised earlier by @Jaykul - PowerShell, in contrast to Bash, is built around piping objects natively from one script to the next.

So in PowerShell, $req should be the request object, not the path to a temporary file that holds the contents of the request. This isn't a top priority for me personally in the experimental stage, but I'm wondering aloud whether it can be added later. The trouble is, similar to when $env:req was renamed to $req - lots of PowerShell scripts will break if this changes.
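
To make the contrast concrete, here is a rough sketch of the two shapes being discussed (only the file-based form reflects the current experimental behavior described above; the object-based form is hypothetical):

# Today (experimental): $req holds the path to a temp file containing the request contents
$body = Get-Content $req -Raw | ConvertFrom-Json   # assuming a JSON body
Write-Output "Received name: $($body.name)"

# Desired (hypothetical): $req would be the request object itself
# $name = $req.GetQueryNameValuePairs().Where({ $_.Key -eq 'name' }).Value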

@johnnliu, thank you for the clarification. Yes, we will be adding that to our list of items to support in future updates.

Closing this as all the work is now tracked at https://github.com/azure/azure-functions-powershell-worker
