This is going to be done in several steps:

- A `googleapis-common` npm module that just contains shared bits that will be used by all packages
- An `npm publish` process so that each individual package is published along with the larger meta-package

These steps are of course subject to change as we discover things through the process :)
--- UPDATE ---
We're down to the last step of actually publishing the individual modules. @alexander-fenster, @jkwlui, and @JustinBeckwith last got together to chat about this, and came to a few conclusions:

- Rather than running `npm run generate` and then submitting a PR with like 1344247584 changes, generation would wipe the `src/apis` directory and re-create every API from scratch every time. This makes it easy to detect new APIs, and easy to detect removed APIs. We'd use `synthtool` for this.
- `releasetool` and `semantic-release` could be used to cut individual releases. A scoped tag would be used for each package release from the mono-repo.

There's still a lot to figure out, but I suspect @bcoe is gonna love this problem.
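The scoped-tag idea above could be sketched roughly as follows; the exact tag format (`<package>-v<major>.<minor>.<patch>`) is my assumption for illustration, not a decided convention:

```typescript
// Hypothetical parser for a per-package release tag cut from the mono-repo,
// e.g. "drive-v1.2.3" -> {pkg: "drive", version: "1.2.3"}.
function parseReleaseTag(tag: string): {pkg: string; version: string} | null {
  const m = /^(.+)-v(\d+\.\d+\.\d+)$/.exec(tag);
  return m ? {pkg: m[1], version: m[2]} : null;
}
```

A release job could then match the tag against a directory under `src/apis` and publish only that package.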
@lukesneeringer #1167 is already merged so it can be marked as done I believe.
Have tasks been created for all the pending steps, so that this feature can be implemented?
👋 We are at the stage where each individual API can have `npm pack` run against it, and the result theoretically pushed to individual modules. Honestly, the problem at this stage is the release pipeline and process. We could start cutting these today, but it would mean a semver-major release every time we cut a build (sorta like it is today). With 232 APIs, we can't realistically look at the changelog for each and manually cut a build every time there's a change. We need to build automation tools.
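The automation being described would derive the release type from commit history instead of a human reading changelogs. A minimal sketch, assuming simplified conventional-commit rules (the real tooling, e.g. `semantic-release`, is far more thorough):

```typescript
type Bump = 'major' | 'minor' | 'patch' | 'none';

// Classify a set of commit messages into a semver bump: any breaking change
// wins outright, otherwise a feature implies minor and a fix implies patch.
function bumpFor(commits: string[]): Bump {
  let bump: Bump = 'none';
  for (const msg of commits) {
    if (/^[a-z]+(\(.+\))?!:/.test(msg) || msg.includes('BREAKING CHANGE')) {
      return 'major';
    }
    if (/^feat(\(.+\))?:/.test(msg)) bump = 'minor';
    else if (/^fix(\(.+\))?:/.test(msg) && bump === 'none') bump = 'patch';
  }
  return bump;
}
```

Run per API directory, something like this would let 232 packages release independently without anyone reading 232 changelogs.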
We're really blocked on https://github.com/googleapis/google-api-nodejs-client/issues/525. There are a variety of ways to approach it, but it hasn't been pressing just yet.
Just to add to the list: since each API can now be used in a browser, we need to be able to publish each individual API as a webpack bundle, versioned (e.g. drive-v1.2.3.min.js). This is a separate (but related) problem to autoreleases.
Is webpack the better choice here? I think other bundlers such as rollup should be considered, so you don't have to ship the full API to the browser, just what the user actually uses.
👋 Some hot takes from this conversation so far:

- Rather than a single `googleapis-common` module, we could divide up shared libraries along whatever logical lines make sense, and use a resolution tool to determine when new publications of dependents are necessary. `synthtool` could help with this.
- `webpack` vs. `rollup`, etc.: I think this is probably a separate conversation topic, although we should make sure that as we refactor the codebase we don't make decisions that are hostile to bundlers, e.g., lazy loading dependencies, or using compiled dependencies.

@nicoabie @bcoe re: webpack, there is no need to bundle all APIs, since we now support webpacking just one API: you need to run webpack from inside the API folder. E.g., see the README for the Drive API: https://github.com/googleapis/google-api-nodejs-client/tree/master/src/apis/drive#building-a-browser-bundle (the same applies to all other APIs).

(I'm not arguing webpack vs. rollup vs. whatever here; webpack is just what I used and know works, and other bundlers will probably work as well.)
@alexander-fenster fair point, I think this issue is almost resolved then.
The thing with webpack is that you are shipping the full library (in this example, all of `drive`) to the browser when the user is only using `{drive, auth}`. Building the library with Rollup instead of webpack would allow the user's bundler (probably webpack) to tree-shake the parts of the library that are not being used, lowering the amount of data sent to the browser unnecessarily.
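A small illustration (not googleapis code) of why export shape matters for this: a bundler can drop unused named exports, but a single namespace object forces it to keep everything reachable through that object:

```typescript
// Two hypothetical API surfaces.
const drive = () => 'drive result';
const sheets = () => 'sheets result';

// Monolith-style surface: importing `google` just to call `google.drive()`
// still pulls in `sheets`, because property access on an object is hard for
// a bundler to prove unused.
export const google = {drive, sheets};

// Named exports: a tree-shaking bundler (Rollup, or webpack in production
// mode) can drop `sheets` entirely if the consumer only imports `drive`.
export {drive, sheets};
```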
any updates on this?
https://bundlephobia.com/[email protected]
https://packagephobia.now.sh/result?p=googleapis
It says install size is 45MB
Could you also make it side effects free so that it's tree shakable?
Any updates would be appreciated. The current version of the npm package is costly: 42.6 MB, 846 files, and 165 folders.
In case it's useful to others finding this thread… my use of `googleapis` was quite light (just one endpoint in the Google Sheets API) and I got good results by depending directly on `googleapis-common` and `google-auth-library` and copying over the methods I was interested in.
Before, using the `googleapis` monolith:

```typescript
import {google, sheets_v4} from 'googleapis';
import {promises as fs} from 'fs';

// `credentialsFile` and `sheetId` are defined elsewhere.
const creds = JSON.parse(await fs.readFile(credentialsFile, 'utf8'));
const client = new google.auth.JWT({
  email: creds.client_email,
  key: creds.private_key,
  scopes: ['https://www.googleapis.com/auth/spreadsheets.readonly'],
});
await client.authorize();
const sheets = google.sheets({version: 'v4', auth: client});
const result = await sheets.spreadsheets.values.get({
  spreadsheetId: sheetId,
  range: 'A:ZZ',
});
```
After, using the underlying libraries:

```typescript
import {promises as fs} from 'fs';
import {JWT} from 'google-auth-library';
import {AuthPlus, createAPIRequest} from 'googleapis-common';

interface GetParams {
  spreadsheetId: string;
  range: string;
}

export interface ValueRange {
  majorDimension?: string | null;
  range?: string | null;
  values?: any[][] | null;
}

// This is adapted from google-api-nodejs-client:
// https://github.com/googleapis/google-api-nodejs-client/blob/master/src/apis/sheets/v4.ts#L6483
// See https://github.com/googleapis/google-api-nodejs-client/issues/806
function getValues(auth: JWT, params: GetParams) {
  return createAPIRequest<ValueRange>({
    options: {
      url: 'https://sheets.googleapis.com/v4/spreadsheets/{spreadsheetId}/values/{range}',
      method: 'GET',
    },
    params,
    requiredParams: ['spreadsheetId', 'range'],
    pathParams: ['range', 'spreadsheetId'],
    context: {
      _options: {
        auth,
      },
    },
  });
}

// `credentialsFile` and `sheetId` are defined elsewhere.
const creds = JSON.parse(await fs.readFile(credentialsFile, 'utf8'));
const auth = new AuthPlus();
const client = new auth.JWT({
  email: creds.client_email,
  key: creds.private_key,
  scopes: ['https://www.googleapis.com/auth/spreadsheets.readonly'],
});
await client.authorize();
const result = await getValues(client, {
  spreadsheetId: sheetId,
  range: 'A:ZZ',
});
```
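For readers wondering what `createAPIRequest` does with `pathParams` in the snippet above: it substitutes each `{name}` placeholder in the URL template. Here is a hypothetical re-implementation for illustration only, not the actual `googleapis-common` code:

```typescript
// Expand `{name}` placeholders in a URL template with encoded parameter values.
function expandPath(
  template: string,
  params: Record<string, string>,
  pathParams: string[]
): string {
  let url = template;
  for (const name of pathParams) {
    url = url.replace(`{${name}}`, encodeURIComponent(params[name]));
  }
  return url;
}
```

For example, expanding the Sheets template with `{spreadsheetId: 'abc123', range: 'A:ZZ'}` yields `.../spreadsheets/abc123/values/A%3AZZ`.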
I ran across this issue in the context of TypeScript getting slow. Removing `googleapis` let me remove 300+ `.d.ts` files from my project.
Hello, is there any update on this?
Having trouble tree-shaking googleapis. My particular use case is Google Cloud Functions (Firebase Functions). The cold start of functions using googleapis is extremely slow, since the function needs to load the whole package at cold start, and I'm only using calendar and auth out of all the APIs.
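One mitigation for cold starts, independent of the package split: defer loading the heavy module until the first request that actually needs it. A generic sketch of the pattern (`loader` stands in for something like `() => require('googleapis')`; the names here are illustrative, not a recommended API):

```typescript
// Memoize a loader so the expensive work happens on the first call that needs
// it, not at function cold start, and only happens once.
function lazy<T>(loader: () => T): () => T {
  let cached: T | undefined;
  return () => {
    if (cached === undefined) {
      cached = loader();
    }
    return cached;
  };
}
```

In a Cloud Function, requests that never touch the API skip the load entirely, and the first request that does pays the cost once per instance.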
@nermaljcat just a heads up, I edited your comment to remove a bit of sass. I'll refer you to our code of conduct, in case there are any questions.
ok @JustinBeckwith - I've deleted my comments. I reject censorship and prefer not to participate in a censored forum.
It was delightful to read that your work on the automated release process hit its next step. I recall that was blocking this issue here. Thank you for the great work! https://github.com/googleapis/google-api-nodejs-client/issues/525#issuecomment-639891200
Just to keep y'all in the loop: one of the big remaining concerns over the split was creating confusion between the packages for cloud-focused APIs in `google-cloud-node`. In many cases here, there will be two very similar packages.
For example, as laid out today we'd have a `@googleapis/datastore` and a `@google-cloud/datastore` for about 40 APIs. This is wiiiiiildly confusing if you don't understand the differences. I'm interested in how folks think we can make this more clear.
As a first step, we landed https://github.com/googleapis/google-api-nodejs-client/pull/2242 which adds a specific callout on the individual READMEs.
The next adventure is https://github.com/googleapis/release-please/issues/471
> I'm interested in how folks think we can make this more clear.
@JustinBeckwith wouldn't the cloud-focused packages be able to depend on the `@googleapis/*` ones, if those were split into a separate artefact? If yes, that could simply be explained as a low-level interface (just the raw types, slim client) vs. a high-level one (higher-level wrapper, fat client with friendly DX).
Sorta, not really :/ With a few exceptions, the majority of cloud packages in the `@google-cloud` scope are based on gRPC/proto-based interfaces. There is an entirely different generator that creates those packages.

We're starting to come around to the idea of `@googleapis/datastore-rest` as the naming convention for these, while keeping `@google-cloud/datastore` for the higher-level modules.
Something to consider might be to start by splitting the APIs that are not cloud-first?