PowerShell: Dash as a parameter crashes pwsh

Created on 15 Apr 2019 · 19 Comments · Source: PowerShell/PowerShell

PowerShell 6.2.0 on Windows 10

Steps to reproduce

test.ps1

[CmdletBinding()]param(
     [string]$p1,
     [string]$p2,
    [Parameter(ValueFromPipeline)][string]$InputObject
)
process{
    $input.replace($p1, $p2)
}

Run the following from CMD:

echo hello world | powershell -f test.ps1 e E
hEllo world
echo hello world | powershell -f test.ps1 e -
CRA$H@#*
echo hello world | powershell -f test.ps1 e "-"
CRA$H@#*

CRA$H@#* = PowerShell crashes

Expected behavior

h-llo world

Actual behavior

CRASH of pwsh.exe!

Environment data

Name                           Value
----                           -----
PSVersion                      6.2.0
PSEdition                      Core
GitCommitId                    6.2.0
OS                             Microsoft Windows 10.0.16299
Platform                       Win32NT
PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0…}
PSRemotingProtocolVersion      2.3
SerializationVersion           1.1.0.1
WSManStackVersion              3.0
Labels: Issue-Bug · Resolution-Fixed · WG-Engine

Most helpful comment

Responded to the comment.

Also, that would be a new issue. Please open a new issue for new issues. It is VERY confusing to use the same issue for multiple issues.

All 19 comments

NB:

  • This also crashes PowerShell v5
  • It seems to be the dash (-) which crashes things
  • It only crashes from CMD; run from within the PowerShell shell itself, things work fine:
echo "hello world" | .\test.ps1 e -
h-llo world

@cawoodm Please share the content of test.ps1 file.

I've updated the original question with the code.

Just double-checked that it crashes PowerShell 6/Core (pwsh.exe) - it does:

C:>echo hello world | pwsh .\test.ps1 e -


Simple repro from cmd and pwsh:

pwsh any.ps1 -
any.ps1 : Cannot process argument because the value of argument "name" is not valid. Change the value of the "name" argument and run the operation again.
+ CategoryInfo          : InvalidArgument: (:) [any.ps1], PSArgumentException
+ FullyQualifiedErrorId : Argument,any.ps1

I debugged the code and my guess is that this is "by design". With "pwsh any.ps1 -" we interpret the dash as a parameter-name prefix and get null as the parameter name. If we run "> .\any.ps1 -" we interpret the dash as an argument value.

I don't know whether we can/should fix this or not.

/cc @SteveL-MSFT @daxian-dbw @mklement0

It happens only with (implied) -File, not with -Command.

With -Command, it behaves as expected: the - is interpreted as a _positional argument_ rather than the start of an (invalid) parameter name.

While there are - regrettable[1] - by-design differences between argument parsing with -File vs. -Command (literals vs. as-code), they should behave the same in this case (and, obviously, there shouldn't be a crash).


[1] In short: for conceptual clarity and predictable parameter passing (in line with POSIX-like shells), only ever the _first_ argument following -Command should be interpreted as a snippet of PowerShell code, whereas all remaining arguments should be treated as _literal_ arguments _to pass as arguments to that first argument_, as with -File.

It seems the problem comes from #4019 or #7449 (@TravisEz13, could you please look into this too?)
https://github.com/PowerShell/PowerShell/blob/ceed73d7375cb98a199ceb19796bb2311b0b0002/src/Microsoft.PowerShell.ConsoleHost/host/msh/CommandLineParameterParser.cs#L1133-L1141

Debugging shows that "i" has the value 0, but I think it should be 1.

@mklement0 See the comment in the code.

@iSazonov: looking at it purely from a behavioral perspective:

- is only special as the _first_ (and then _only_ supported) argument passed to -File, which is not the case here.

Since -f (-File) is used _explicitly_, #4019 shouldn't come into play - don't know anything about #7449.

As an aside: both -File and -Command are currently badly broken (the // Process interactive input... comment provides a hint); see #3223

Based on this comment https://github.com/PowerShell/PowerShell/issues/9362#issuecomment-483235798, this was not introduced in either of those PRs.

@TravisEz13 I checked that Windows PowerShell 5.1 (10.0.17763.437) does not crash.

I think @mklement0 is correct here. There is a logic error in the code: it should verify that the immediately preceding parameter was -File or -Command.

I have a fix

Responded to the comment.

Also, that would be a new issue. Please open a new issue for new issues. It is VERY confusing to use the same issue for multiple issues.

Unfortunately, @cawoodm's code snippet points to a larger (separate) problem:

If you look closely, the process block doesn't actually use the declared parameter variable, $InputObject, but the automatic $Input variable.

Sadly, this is actually _necessary_ in order for the script to receive outside stdin input _directly_, and while you can apply this hack in your own scripts, it is not an option with standard cmdlets.

In short: outside stdin input doesn't automatically translate to pipeline input - explicit use of $Input is required - see #9497

@mklement0 My code came about from hacking around, trying to find a variable that consistently represented the piped stream of input (stdin). What I came up with is that in a begin{} block it's $InputObject, in an end{} block it's $Input, and in the process{} block it's $_, which seems crazy, but it's what I found.

I'd be eternally grateful for a resource describing when $_, $InputObject and $Input are to be used.

@cawoodm - here's my $0.02:

Ideally, all you should use in your advanced script is $InputObject (whose freely chosen parameter-variable name is a _convention_) - from within PowerShell, this will work as expected.

The lack of support for automatic mapping of external stdin input to PowerShell pipeline input detailed in #9497 prompted your use of $Input as a workaround.

A workaround that wouldn't require modification of your script is:

echo hello world | powershell -command  '$Input | ./test.ps1 e E'

That is the primary use of $Input, I'd say: enumerating the lines of external stdin input; see below for a secondary use.
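As a (hypothetical) illustration of that primary use - this is my own example, not a command from the thread - the following, run from cmd.exe, upper-cases each stdin line by enumerating $Input, and should print HELLO WORLD:

echo hello world | pwsh -Command "$Input | ForEach-Object { $_.ToUpper() }"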

Generally (a minimal sketch follows this list):

  • If you have explicitly declared pipeline-binding parameters, use only them - no need for $_ or $Input.

    • Your pipeline-binding parameter, say $InputObject, has no meaningful value in begin - unless you've passed a value as an _argument_ instead, though even then you should perform processing in the process block.
    • In the process block, $InputObject represents the input object at hand (or a _property_ value of it, if the parameter is declared with ValueFromPipelineByPropertyName).
    • Because external stdin input isn't mapped to pipeline input, $InputObject is not bound in your scenario (if you just used $InputObject, the process block would be entered _once_, with $InputObject containing the type-specific default value)
    • That using $Input actually works in this case is surprising, and I wouldn't rely on it; even more obscurely, you could alternatively place $null = $input in your begin(!) block, and then use $InputObject in your process block, as you normally would - I have no idea why that works.
    • In the end block, $InputObject still contains the _last_ input object.
  • If not:

    • use $_ in the process block ($Input works too, but is wrapped in a single-element ArrayList in advanced scripts / functions).
    • if you don't have a process block and you just want to collect all objects first and process them in the (possibly implied) end block, use $Input there. (If you do have a process block, $Input will be empty in the end block.)
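To tie these points together, here is a minimal sketch of an advanced script that relies only on its declared pipeline-binding parameter. The file name Replace-Text.ps1, the $Old/$New parameters and the Write-Verbose messages are illustrative choices of mine, not code from this issue:

# Replace-Text.ps1 (hypothetical) - uses only the declared pipeline-binding parameter.
[CmdletBinding()]
param(
    [string] $Old,
    [string] $New,
    [Parameter(ValueFromPipeline)] [string] $InputObject
)
begin {
    # Not yet bound here (unless a value was passed as an argument).
    Write-Verbose "begin: `$InputObject = '$InputObject'"
}
process {
    # The pipeline object at hand - no need for $_ or $Input.
    $InputObject.Replace($Old, $New)
}
end {
    # Still holds the *last* input object.
    Write-Verbose "end: `$InputObject = '$InputObject'"
}

From within PowerShell, 'hello world' | .\Replace-Text.ps1 -Old e -New '-' yields h-llo world; fed external stdin from cmd.exe via -File, $InputObject is not bound, which is exactly the gap tracked in #9497.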

It would appear, however, that $Input is not available in an end{} block if a process{} block is present. So I see no way of making a function/script which can work in both modes - like being able to filter objects in the stream (via process{}) OR, in a different mode, summarize/sort the entire stream (via end{}).

Consider the following function/script, which should convert an object or stream of objects to JSON (producing "{object}, {object2}..." output) and, optionally, with the -AsArray switch, pass the entire stream through in one go (producing "[{object},{object2}...]" output):

[CmdLetBinding()]
[OutputType([String])]
param(
  [Parameter(Mandatory,ValueFromPipeline)]
  [object]$InputObject,
  [switch]$Compress,
  [switch]$AsArray
)
process {
  if (-not $AsArray) {
    return $InputObject | ConvertTo-Json -Compress:$Compress
  }
}
end {
  if ($AsArray) {
    Write-Host $Input # Nothing, unless we remove the process block above
    if ($Input -is [array] -and $Input.length -gt 1) {
      # We have an array, pass to convertto-json as is
      return $Input | ConvertTo-Json -Compress:$Compress
    } else {
      # Coerce single object to an array
      return ConvertTo-Json -InputObject @($Input) -Compress:$Compress
    }
  }
}

AFAIK this can only be solved with 2 different scripts/functions.

$Input is not available in an end{} block if a process{} block is present

Correct (it's what I tried to say with "if you don't have a process block ...").

If you do have a process block, you must collect all input manually, which is best done by instantiating a list type such as [System.Collections.Generic.List[object]] in begin and appending to it with .Add() in the process block; see the sketch below.

While I can't speak directly to the design intent, it makes sense to me to _consume_ the input in a process block, as it is being received, so as to enable _streaming_ processing that doesn't require collecting all input in memory. Collecting therefore only happens in the absence of a process block.

As an aside: $Input is _always_ an array; what you need to guard against is that array having only 1 element, and, if so, use , $Input, not @($Input).
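As a concrete sketch of that List-based collection approach applied to the ConvertTo-Json script above - my own adaptation under the assumptions already discussed in this thread, not code posted by anyone here - buffering into a List[object] lets the streaming mode and the -AsArray mode coexist in one script:

[CmdletBinding()]
[OutputType([string])]
param(
  [Parameter(Mandatory, ValueFromPipeline)]
  [object] $InputObject,
  [switch] $Compress,
  [switch] $AsArray
)
begin {
  # Only buffer when the whole stream is needed at once.
  if ($AsArray) { $collected = [System.Collections.Generic.List[object]]::new() }
}
process {
  if ($AsArray) {
    $collected.Add($InputObject)                       # collect for the end block
  } else {
    $InputObject | ConvertTo-Json -Compress:$Compress  # stream one object at a time
  }
}
end {
  if ($AsArray) {
    # -InputObject (unlike piping) does not enumerate, so even a single
    # buffered object is serialized as a JSON array.
    ConvertTo-Json -InputObject @($collected) -Compress:$Compress
  }
}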
