aws-cdk: [core] unable to run CDK diff when upgrading from 1.51.0 to 1.62.0: Assets must be defined indirectly within a "Stage" or an "App" scope

Created on 11 Sep 2020 · 21 comments · Source: aws/aws-cdk


I just upgraded my CDK version from 1.51.0 to 1.62.0 and updated the code so that npm build runs successfully for the project. I then tried to diff my stacks with cdk diff <stackname>, and it failed with this error:

unable to determine cloud assembly output directory. Assets must be defined indirectly within a "Stage" or an "App" scope
Subprocess exited with error 1

In googling that I came across https://github.com/aws/aws-cdk/issues/9546, and I followed the directions to make sure all my CDK packages were pinned at the exact same version. After doing that I still ran into the error. I do use custom Lambda functions, packaged like so:

const statusLambda = new lambda.Function(this, 'StatusLambda', {
  code: lambda.Code.asset(path.join(__dirname, '../lambdas/codepipeline_status_lambda')),
  handler: 'index.handler',
  timeout: cdk.Duration.seconds(300),
  runtime: lambda.Runtime.NODEJS_10_X,
  environment: {
    ACCESS_TOKEN: cdk.SecretValue.secretsManager(props.gitHubSecretArn).toString(),
    DATADOG_API_KEY: cdk.SecretValue.secretsManager(props.dataDogSecretArn).toString(),
    ENV_TYPE: props.env_type,
    STAGE_TYPE: props.stage_type,
    APP_URL: props.url,
  },
});

This worked just fine before the upgrade, but it is now failing with the same error. The issue I linked above mentions that using Lambda assets could cause this. I am unsure how to proceed to fix it, and also why this suddenly became an issue.

Reproduction Steps

Use a custom Lambda asset like so:

const statusLambda = new lambda.Function(this, 'StatusLambda', {
  code: lambda.Code.asset(path.join(__dirname, '../lambdas/codepipeline_status_lambda')),
  handler: 'index.handler',
  timeout: cdk.Duration.seconds(300),
  runtime: lambda.Runtime.NODEJS_10_X,
  environment: {
    ACCESS_TOKEN: cdk.SecretValue.secretsManager(props.gitHubSecretArn).toString(),
    DATADOG_API_KEY: cdk.SecretValue.secretsManager(props.dataDogSecretArn).toString(),
    ENV_TYPE: props.env_type,
    STAGE_TYPE: props.stage_type,
    APP_URL: props.url,
  },
});

What did you expect to happen?

cdk diff to pass without error

What actually happened?

cdk diff returned this error:

unable to determine cloud assembly output directory. Assets must be defined indirectly within a "Stage" or an "App" scope
Subprocess exited with error 1

Verbose output:

CDK toolkit version: 1.62.0 (build 8c2d7fc)
Command line arguments: {
  _: [ 'diff' ],
  profile: 'outline-dev',
  v: 4,
  verbose: 4,
  'ignore-errors': false,
  ignoreErrors: false,
  json: false,
  j: false,
  ec2creds: undefined,
  i: undefined,
  'version-reporting': undefined,
  versionReporting: undefined,
  'path-metadata': true,
  pathMetadata: true,
  'asset-metadata': true,
  assetMetadata: true,
  'role-arn': undefined,
  r: undefined,
  roleArn: undefined,
  staging: true,
  'no-color': false,
  noColor: false,
  fail: false,
  'context-lines': 3,
  contextLines: 3,
  strict: false,
  '$0': 'cdk',
  STACKS: [ 'DevCoreStack' ],
  stacks: [ 'DevCoreStack' ]
}
cdk.json: {
  "app": "npx ts-node bin/corestack.ts"
}
cdk.context.json: {
  "availability-zones:account=349142687622:region=us-west-2": [
    "us-west-2a",
    "us-west-2b",
    "us-west-2c",
    "us-west-2d"
  ],
  "availability-zones:account=349142687622:region=us-east-1": [
    "us-east-1a",
    "us-east-1b",
    "us-east-1c",
    "us-east-1d",
    "us-east-1e",
    "us-east-1f"
  ],
  "availability-zones:account=703994937577:region=us-east-1": [
    "us-east-1a",
    "us-east-1b",
    "us-east-1c",
    "us-east-1d",
    "us-east-1e",
    "us-east-1f"
  ],
  "availability-zones:account=703994937577:region=us-east-2": [
    "us-east-2a",
    "us-east-2b",
    "us-east-2c"
  ]
}
merged settings: {
  versionReporting: true,
  pathMetadata: true,
  output: 'cdk.out',
  app: 'npx ts-node bin/corestack.ts',
  context: {},
  assetMetadata: true,
  profile: 'outline-dev',
  toolkitBucket: {},
  staging: true
}
Determining if we're on an EC2 instance.
Does not look like an EC2 instance.
Toolkit stack: CDKToolkit
Setting "CDK_DEFAULT_REGION" environment variable to us-east-1
Resolving default credentials
Retrieved account ID 349142687622 from disk cache
Setting "CDK_DEFAULT_ACCOUNT" environment variable to 349142687622
context: {
  'availability-zones:account=349142687622:region=us-west-2': [ 'us-west-2a', 'us-west-2b', 'us-west-2c', 'us-west-2d' ],
  'availability-zones:account=349142687622:region=us-east-1': [
    'us-east-1a',
    'us-east-1b',
    'us-east-1c',
    'us-east-1d',
    'us-east-1e',
    'us-east-1f'
  ],
  'availability-zones:account=703994937577:region=us-east-1': [
    'us-east-1a',
    'us-east-1b',
    'us-east-1c',
    'us-east-1d',
    'us-east-1e',
    'us-east-1f'
  ],
  'availability-zones:account=703994937577:region=us-east-2': [ 'us-east-2a', 'us-east-2b', 'us-east-2c' ],
  'aws:cdk:enable-path-metadata': true,
  'aws:cdk:enable-asset-metadata': true
}
outdir: cdk.out
env: {
  CDK_DEFAULT_REGION: 'us-east-1',
  CDK_DEFAULT_ACCOUNT: '349142687622',
  CDK_CONTEXT_JSON: '{"availability-zones:account=349142687622:region=us-west-2":["us-west-2a","us-west-2b","us-west-2c","us-west-2d"],"availability-zones:account=349142687622:region=us-east-1":["us-east-1a","us-east-1b","us-east-1c","us-east-1d","us-east-1e","us-east-1f"],"availability-zones:account=703994937577:region=us-east-1":["us-east-1a","us-east-1b","us-east-1c","us-east-1d","us-east-1e","us-east-1f"],"availability-zones:account=703994937577:region=us-east-2":["us-east-2a","us-east-2b","us-east-2c"],"aws:cdk:enable-path-metadata":true,"aws:cdk:enable-asset-metadata":true}',
  CDK_OUTDIR: 'cdk.out',
  CDK_CLI_ASM_VERSION: '5.0.0',
  CDK_CLI_VERSION: '1.62.0'
}
unable to determine cloud assembly output directory. Assets must be defined indirectly within a "Stage" or an "App" scope
Subprocess exited with error 1
Error: Subprocess exited with error 1
    at ChildProcess.<anonymous> (/Users/grahambooth/.nvm/versions/node/v12.18.3/lib/node_modules/aws-cdk/lib/api/cxapp/exec.ts:118:23)
    at ChildProcess.emit (events.js:315:20)
    at ChildProcess.EventEmitter.emit (domain.js:483:12)
    at Process.ChildProcess._handle.onexit (internal/child_process.js:275:12)

Environment

  • CLI Version: 1.62.0
  • Framework Version:
  • Node.js Version: v12.18.3
  • OS: macOS
  • Language (Version): TypeScript (3.8.3)

Other


This is a :bug: Bug Report

Labels: @aws-cdk/core, bug, in-progress

Most helpful comment

I think I've tracked this issue down.

As @jogold pointed out, it has to do with nesting different copies of @aws-cdk/core. However I politely disagree and would argue it is a very valid use case, especially with monorepos. It occurs whenever packages aren't hoisted to the top. Lerna provides a mode for this and even npm-v7 workspaces seem to encourage this node_modules structure.

Generally speaking, I don't think it sits well for CDK to enforce a single loaded copy of the code, rather than just a single version number.

Therefore the example repo @Shogan created is still a good starting point. The issue occurs with any kind of asset. Lambda functions with code assets are an obvious one, as they are often pulled out into a reusable library. However, as @flochaz reports, this also appears to happen when the code asset is just a parameter to a Lambda lib. The reason is that a Lambda with logRetention causes assets for the custom resource to be staged, which will then fail.


Enough of the prelude. The issue surfaces because Stages are enforced for asset staging as of 1.57.0, specifically by this change from @eladb.

Now, Stages have been around for a long time. However, they have been pretty much optional and, I guess, were only used with the new pipelines. Long story short, they use a different mechanism to determine whether a thing is a Stage object.

The Stage class uses instanceof, which returns false for objects that come from a different copy of the code.

All other "global" packages use a mechanism based on Symbol.for: App, Stack, Aspect.


Unless there is a good reason for this difference, I'd suggest to use the same mechanism for Stages which will fix this issue. If someone could confirm the direction, I'm happy to provide a PR as it is holding us back.

I'd also suggest to include a test for nested code copies.

All 21 comments

Can you try to also remove your lock file (rm package-lock.json) and then reinstall (npm i)?

I would also guess it's a version mismatch somewhere. Make sure all @aws-cdk packages in your package.json have the same version (and make sure you don't have ^1.51.0 caret versions in there).

> I would also guess it's a version mismatch somewhere. Make sure all @aws-cdk packages in your package.json have the same version (and make sure you don't have ^1.51.0 caret versions in there).

@rix0rrr there is also the "multiple copies of @aws-cdk/core issue", have a look at https://github.com/aws/aws-cdk/issues/10210#issuecomment-690093916

Oh, good find. In that case, running npm dedupe might also help.
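To check whether a dedupe is even needed, you can look for duplicate physical copies of @aws-cdk/core on disk. The fake tree below is fabricated only so the snippet is self-contained; in a real project you would run just the find from the repo root:

```shell
# Fabricated project layout with two nested copies of @aws-cdk/core.
mkdir -p demo/node_modules/@aws-cdk/core
mkdir -p demo/packages/app/node_modules/@aws-cdk/core

# List every physical copy; more than one line of output means nested
# copies, which can break instanceof checks even when versions match.
find demo -type d -path '*node_modules/@aws-cdk/core' -prune | sort
```

`npm ls @aws-cdk/core` gives a similar view from npm's perspective, and `npm dedupe` collapses compatible duplicates where the dependency tree allows it.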

I had this, and deleting package-lock.json and the node_modules folder fixed it for me. Must've been the package-lock.json, as just clearing out node_modules didn't do it for me.

Hmm, yeah, I tried the node_modules + package-lock deletion steps, as well as making sure all my versions were pinned to 1.62.0, but no luck. I'll try npm dedupe to see what that shows. I eventually just gave up and only upgraded from 1.51.0 to 1.56.0, and none of these issues were present.

I am hitting this bug with CDK versions 1.64.0 and 1.64.1. There are two ways of triggering this bug:

  1. Mismatched versions of cdk modules in a typescript project
  2. Creating a Lambda function using source code that is nested in a different folder, e.g. up in the parent directory and then within a sub-directory of that, or in a child directory of the current working directory.

I am experiencing number 2 above (lambda with Code.fromAsset).

Here is an example:

let lambdaFunc = new lambda.Function(this, 'MyFunc', {
  runtime: lambda.Runtime.NODEJS_12_X,
  code: lambda.Code.fromAsset(path.join(__dirname, 'lambda/myfunc')),
  handler: 'index.handler',
  memorySize: 128,
  timeout: Duration.seconds(5)
});

I've tried deleting node_modules, package-lock.json, and cdk.out, and have triple-checked that all @aws-cdk packages are at the exact same, pinned version. Then I ran npm install && tsc. cdk diff, cdk synth, and cdk deploy all result in the error:

unable to determine cloud assembly output directory. Assets must be defined indirectly within a "Stage" or an "App" scope
Subprocess exited with error 1

As soon as I downgrade to 1.52.0 (or 1.57.0), everything works fine.
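On trigger 2: the relative asset path itself is nothing exotic, since Node collapses the `..` segment before CDK ever stages anything. A quick sketch with the path module (directory names made up); as later comments narrow down, the failure comes from nested copies of @aws-cdk/core rather than from the path:

```typescript
import * as path from 'path';

// A construct library at this (made-up) location bundles its Lambda
// source from one level up, mirroring the layouts described above.
const libDir = '/repo/my-cdk-lib/lib';
const assetDir = path.join(libDir, '../lambdas/myfunc');

// The '..' is resolved away, leaving an ordinary absolute path.
console.log(assetDir); // → /repo/my-cdk-lib/lambdas/myfunc
```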

I am hitting this bug with CDK versions 1.64.0 and 1.64.1.

@Shogan Can you share a repro?

@Shogan I have been running into the same issue. On a whim, I added my 'assets' directory to my package.json file, in the "directories" property and both cdk diff and cdk synth started working again!

{
    ...
    "directories": {
        "assets": "assets"
    }
   ...
}


Thanks @jamiepeloquin, but I'm not exactly sure which assets directory to specify here. Do you mean the cdk.out assets directory? I tried with just 'assets' but didn't see any difference.

@jogold I've homed in on the issue now. Initially I set up a simple AWS CDK app and could not reproduce it, but my existing project still exhibited the problem.

It happens when you have a CDK library project and then create a nested CDK app project within that library, referencing the library itself via the parent directory path, e.g. importing the library from the app with the path ../../index. Also, importantly, you must be creating Lambda functions with Code.fromAsset, where the Lambda function assets are created from code at a path inside the library.

Here is a cut-down minimal reproduction project. See README.md for reproducing steps once cloned.

https://github.com/Shogan/aws-cdk-issue-10314-repro

Hope that helps!

@Shogan Sorry for the hazy advice… I have a directory named "assets" in the root of my CDK project. This is where I put my Lambda code (each Lambda function gets its own subdirectory). I hope the example below helps to clarify my finding.

Example:

Project structure (abbreviated)

├── Jenkinsfile
├── README.md
├── assets
│   ├── my-fargate-task
│   └── my-lambda-function
│       └── index.js
├── bin
│   └── index.js
├── cdk.context.json
├── cdk.json
├── cdk.out
├── lib
│   └── stack-prototype.js
├── node_modules
├── package-lock.json
└── package.json

My Lambda Resource

const myLambdaFunction = new lambda.Function(this, 'MyLambdaFunction', {
  code: lambda.Code.fromAsset(path.join(__dirname,'../assets/my-lambda-function/')),
  handler: 'index.handler',
  runtime: lambda.Runtime.NODEJS_12_X,
  timeout: cdk.Duration.seconds(5)
});

My package.json

{
  "directories": {
    "assets": "assets",
    "lib": "lib"
  }
}

@Shogan you are nesting different copies of @aws-cdk/core (same version, but different copies); this is the problem. See https://github.com/aws/aws-cdk/issues/10210#issuecomment-690093916 for an explanation.

You should use a monorepo/lerna for this kind of architecture.

Thanks for this @jamiepeloquin. I did give it a try as per your workaround but that didn't help in my situation. Probably down to the particular nesting strategy I was using of an app inside a library repo.

@jogold, thanks for the link and explanation. This makes sense now. I had nested things in this way just for local development. I think it's important to note though, that with previous versions of CDK this setup worked fine. (E.g. if I switch back to 1.52.0 or prior it is fine).

Getting the same issue as @Shogan, but not with my own Lambdas: it happens when using AwsCustomResource. I'll try to replace them with my own custom resource implementation (same as above, it works fine with version 1.52.0).

I think I've tracked this issue down.

As @jogold pointed out, it has to do with nesting different copies of @aws-cdk/core. However I politely disagree and would argue it is a very valid use case, especially with monorepos. It occurs whenever packages aren't hoisted to the top. Lerna provides a mode for this and even npm-v7 workspaces seem to encourage this node_modules structure.

Generally speaking, I don't think it sits well for CDK to enforce a single loaded copy of the code, rather than just a single version number.

Therefore the example repo @Shogan created is still a good starting point. The issue occurs with any kind of asset. Lambda functions with code assets are an obvious one, as they are often pulled out into a reusable library. However, as @flochaz reports, this also appears to happen when the code asset is just a parameter to a Lambda lib. The reason is that a Lambda with logRetention causes assets for the custom resource to be staged, which will then fail.


Enough of the prelude. The issue surfaces because Stages are enforced for asset staging as of 1.57.0, specifically by this change from @eladb.

Now, Stages have been around for a long time. However, they have been pretty much optional and, I guess, were only used with the new pipelines. Long story short, they use a different mechanism to determine whether a thing is a Stage object.

The Stage class uses instanceof, which returns false for objects that come from a different copy of the code.

All other "global" packages use a mechanism based on Symbol.for: App, Stack, Aspect.
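The difference between the two checks can be sketched in plain TypeScript. Everything below is illustrative, not the actual aws-cdk source:

```typescript
// Symbol.for() consults a process-wide registry, so every copy of a
// module that asks for 'example.Stage' receives the identical symbol.
const STAGE_MARKER = Symbol.for('example.Stage');

// Two classes standing in for the "same" class loaded from two separate
// copies of a package (e.g. two node_modules nestings of @aws-cdk/core).
class StageCopyA {
  constructor() {
    Object.defineProperty(this, STAGE_MARKER, { value: true });
  }
}
class StageCopyB {
  constructor() {
    Object.defineProperty(this, STAGE_MARKER, { value: true });
  }
}

// Registry-based check: recognizes instances from either copy.
function isStage(x: unknown): boolean {
  return typeof x === 'object' && x !== null && STAGE_MARKER in x;
}

const a = new StageCopyA();
const b = new StageCopyB();

console.log(b instanceof StageCopyA); // false: distinct class objects
console.log(isStage(a), isStage(b)); // true true
```

Because each copy of a module gets its own class object, instanceof fails across copies even when the version numbers match, while the registry symbol is shared process-wide; applying the Symbol.for pattern to Stage is exactly what this comment proposes.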


Unless there is a good reason for this difference, I'd suggest to use the same mechanism for Stages which will fix this issue. If someone could confirm the direction, I'm happy to provide a PR as it is holding us back.

I'd also suggest to include a test for nested code copies.

> I am hitting this bug with CDK versions 1.64.0 and 1.64.1.
>
> @Shogan Can you share a repro?

I've created the simplest example I can think of, reproducing the problem here:
https://github.com/gwriss/upgrade_bug

Additional useful information about the issue can be found here:
https://github.com/aws/aws-cdk/issues/10977

Please let me know if there is anything I can do to help solve the issue

⚠️ COMMENT VISIBILITY WARNING ⚠️

Comments on closed issues are hard for our team to see.
If you need more assistance, please either tag a team member or open a new issue that references this one.
If you wish to keep having a conversation with other community members under this issue feel free to do so.

@eladb is this fix already part of a release? I don't know where to look it up. I'm facing this issue too with TypeScript / Lerna packages and CDK code (especially Lambdas) spread across different Lerna packages, even with the CDK packages pinned at one exact version (in my case 1.70.0).

I think this will be released in 1.72.0, so you need to be patient 😃

@logemann We faced a similar problem but found no satisfying solution with Lerna (see: https://github.com/aws/aws-cdk/issues/10977#issuecomment-721049591)

My conclusion is that there are 4 solutions to this:
1) Use pnpm
2) Use Rush (with yarn/npm) (can fail for indirect dependencies, but should not be a problem in CDK projects?)
3) Use lerna + yarn workspaces (be aware of phantom dependencies)
4) Publish all packages, don't link dependencies (not very satisfying while developing)

Please comment if other solutions exist. We are still evaluating which path to take going forward.
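For option 3, the root-level wiring is small. A sketch with an illustrative package layout: lerna.json points Lerna at yarn workspaces ("npmClient": "yarn", "useWorkspaces": true), and the root package.json declares the workspaces so yarn hoists a single copy of @aws-cdk/core:

```json
{
  "private": true,
  "workspaces": [
    "packages/*"
  ]
}
```

Phantom dependencies remain the hazard noted above: with hoisting, a package can accidentally import something pulled in by a sibling without declaring it itself.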

We've just updated a bunch of our Lerna stacks to 1.72.0 and it finally works again! 🎉

Yup, just updated to 1.72.0 and it works. Thanks!

