Stencil: Browser loader file's [hash] is not changing when a child is updated

Created on 1 Nov 2019  ·  6 comments  ·  Source: ionic-team/stencil

Stencil version:

 @stencil/[email protected]

I'm submitting a:

[x] bug report
[ ] feature request
[ ] support request => Please do not submit support requests here, use one of these channels: https://stencil-worldwide.herokuapp.com/ or https://forum.ionicframework.com/

Current behavior:
Here is a scenario:

  • We deploy our generated assets to S3 + CF
  • We fix a bug in one of the components
  • We deploy again - but some [hash] values are still the same (even though a dependency changed). As a result, customers do not get the right version of the file, since it is cached by the CDN.

When I generate a build for the first time, I get the following file dependencies for the system files (nomodule):

music.js -> p-6f5c5f8f.system.js -> p-psgpjyfa.system.js

Now I make a CSS change to one of my components and get the following file dependencies:

music.js -> p-6f5c5f8f.system.js -> p-slsd14o.system.js

Expected behavior:

I expect the browser loader files (p-6f5c5f8f.system.js) to have a new hash since a file it depends on has changed.

p-6f5c5f8f.system.js is an entry file, and according to Rollup, its [hash] should be based on the content of its dependencies (quoting the docs: "[hash]: A hash based on the content of the entry point and the content of all its dependencies."), but that's not what I'm seeing.
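For context, the [hash] placeholder the Rollup docs describe appears in output file name patterns. A minimal rollup.config.js sketch (the input path and file name patterns here are illustrative, not Stencil's actual internal config):

```javascript
// Minimal sketch only; not Stencil's internal configuration.
export default {
  input: 'src/index.js',
  output: {
    dir: 'dist',
    format: 'system',
    // Per the Rollup docs, [hash] is derived from the chunk's content
    // and the content of all its dependencies - so a change in any
    // dependency should produce a new file name here.
    entryFileNames: 'p-[hash].system.js',
    chunkFileNames: 'p-[hash].system.js',
  },
};
```

The bug report is that, in practice, the entry chunk's hash does not change when only a dependency chunk changes.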

triage

Most helpful comment

I'm also facing this issue using @stencil/[email protected].

We're self-hosting the files and caching them "forever", which breaks because the hashed filename of the "namespace" file doesn't update correctly even though a child component has changed - and, with it, the contents of the "namespace" file.

⚠️ I think this is a critical issue, please prioritize it :pray:

All 6 comments

I've also run into the same caching issue with S3 + CF and have considered adding a hash to the entry file as a workaround, e.g. music-h4shplz.js.

I already do that, actually, but because the second file (p-6f5c5f8f.system.js) keeps the same hash it does not matter.
In the short term I will deploy each build to a new S3 folder.
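That per-build-folder workaround might look something like the following sketch (the bucket name, CDN domain, and version variable are assumptions, using the AWS CLI):

```shell
# Hypothetical sketch: derive a unique prefix per deploy, e.g. from the git commit.
VERSION=$(git rev-parse --short HEAD)

# Upload the whole build to a fresh folder so stale, identically-named
# files can never collide in the CDN cache.
aws s3 sync ./www/build "s3://my-bucket/builds/${VERSION}/" \
  --cache-control "public, max-age=31536000"

# Then point the page at the new folder, e.g.:
# <script type="module" src="https://cdn.example.com/builds/${VERSION}/music.esm.js"></script>
```

This sidesteps the hashing bug entirely at the cost of duplicating unchanged files across folders.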

Ah okay, I understand the issue now. Yeah, I agree the hash should change if its dependency has changed.

I'm also facing this issue using @stencil/[email protected].

We're self-hosting the files and caching them "forever", which breaks because the hashed filename of the "namespace" file doesn't update correctly even though a child component has changed - and, with it, the contents of the "namespace" file.

⚠️ I think this is a critical issue, please prioritize it :pray:

Our workaround is a custom script that generates a hash from the file content (sha256, the same as Rollup) and compares it with the hash extracted from the entry point's file name.

If they differ, the script renames the entry point file to the newly generated hash and then updates the reference in the system loader file.

My custom script

const fs = require('fs-extra');
const path = require('path');
const crypto = require('crypto');

const buildDir = path.join('www', 'build');
const systemLoaderPath = path.join(buildDir, 'music.js');
const ENTRY_REG_EXP = /p-(.*)\.system\.js/;

fs.readFile(systemLoaderPath, 'utf8', (err, systemLoaderContent) => {
  if (err) throw err;

  // Find the entry chunk referenced by the system loader.
  const entryMatches = systemLoaderContent.match(ENTRY_REG_EXP);
  if (!entryMatches) {
    return;
  }
  const entryPath = path.join(buildDir, entryMatches[0]);

  fs.readFile(entryPath, 'utf8', (err, entryContent) => {
    if (err) throw err;

    // Recompute the hash from the file's current content (sha256, like Rollup).
    const hash = generateContentHash(entryContent, 8);
    const oldHash = entryMatches[1];
    if (hash !== oldHash) {
      // Rename the entry chunk and point the loader at the new file name.
      const newEntryPath = `p-${hash}.system.js`;
      fs.renameSync(entryPath, path.join(buildDir, newEntryPath));
      systemLoaderContent = systemLoaderContent.replace(ENTRY_REG_EXP, newEntryPath);
      fs.writeFileSync(systemLoaderPath, systemLoaderContent, 'utf8');
      console.log(`Regenerated entry point hash from ${oldHash} to ${hash}. New entry file: ${newEntryPath}.`);
    }
  });
});

function generateContentHash(content, length) {
  let hash = crypto.createHash('sha256').update(content).digest('hex').toLowerCase();

  if (typeof length === 'number') {
    hash = hash.slice(0, length);
  }

  return hash;
}

Ran into this issue today, with users receiving old files from our CDN. I know the team is busy, but 10 months without a response kinda sucks.
