Hi,
JSON is an important data exchange format. At present there is no explicit way to annotate an object as pure JSON. We were expecting union types to solve this problem via the following:
interface Json {
[x: string]: string|number|boolean|Date|Json|JsonArray;
}
interface JsonArray extends Array<string|number|boolean|Date|Json|JsonArray> { }
There are currently a number of problems with this.
```ts
interface Id extends Json {
id: number;
}
var z = (): Id => ({id: 'foo'}); // Error: Missing index signature
```
class Base {
f(): Id { return undefined; }
}
// Error: missing index signature
class Foo extends Base {
f() { return { id: 10 }}
}
interface Foo extends Json {
foo: { val: string }; // Error: Not assignable
}
Other problems:
// Error: Missing index signature
var result: Id[] = 'a|b'.split('|').map(item => {
return { id: 0 };
});
The first two problems look likely to be resolved, but not the last two.
This motivates the introduction of a natively supported json type that conforms to the JSON spec. It should be possible to derive a custom JSON object from the json type with none of the problems above. Furthermore, since JSON is a data exchange format, it should be possible to mark fields as nullable on types deriving from json (which would trigger a type guard).
I am of course aware that this is all rather vague!
What kind of help could the compiler provide with this?
In the case where you're _receiving_ JSON, the fact that you know everything is a string, number, boolean, array, null, or object isn't terribly useful? You could prevent yourself from passing a JSON object into a function expecting a Function, but in all other cases it's within spitting distance of any.
In the case where you're _emitting_ JSON, the type system doesn't really encapsulate all the aspects of JSON that are important. JSON.stringify skips properties from the prototype and properties of type Function, both of which are going to be common when doing class-based programs. I think people wouldn't like it if JSON.stringify(myClass) were an error, but now we're really back in the case where it's practically any (minus calling JSON.stringify(someFunc), which is hopefully rare and a quick error to spot). And obviously we have no way to warn you about circular data structures that will cause an error.
For the record (on master): the discussion is with respect to the case where both the client and server are written in JavaScript. For this case the JSON is shared and can be quite large and complex.
There are use-cases even where Ajax is not involved where data transfer only works when the payload is a string. An example of this is the HTML5 dataTransfer object. For this case, having the json type ensures the payload is correct on both ends and that JSON.parse is not going to fail.
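As a minimal sketch of that dataTransfer case, assuming the Json interface from the top of the thread (DragPayload and the MIME type here are only illustrative):

```ts
// Sketch only: a drag-and-drop payload constrained to the Json interface above,
// so the JSON.stringify/JSON.parse round trip cannot fail on it.
interface DragPayload extends Json {
    id: number;
    label: string;
}

function onDragStart(event: DragEvent, payload: DragPayload): void {
    // DataTransfer.setData only accepts strings, so the payload must serialize cleanly.
    if (event.dataTransfer) {
        event.dataTransfer.setData("application/json", JSON.stringify(payload));
    }
}

function onDrop(event: DragEvent): DragPayload {
    const raw = event.dataTransfer ? event.dataTransfer.getData("application/json") : "{}";
    return JSON.parse(raw);
}
```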
In the case where you're receiving JSON, the fact that you know everything is a string, number, boolean, array, null, or object isn't terribly useful
True if we were simply looking at how the received JSON is subsequently used. As I mentioned above, the json annotation helps to ensure invalid data structures are not created in the first place, e.g.
interface Foo extends json {
image: File; // error
}
In the case where you're emitting JSON, the type system doesn't really encapsulate all the aspects of JSON that are important. JSON.stringify skips properties from the prototype and properties of type Function, both of which are going to be common when doing class-based programs.
The point is classes encapsulate _behaviour_. JSON represents _data_. People who .stringify classes are a special breed - rather like the dodo :smile:.
Edit: Ignore the following in the light of strict-null-checks
I am also very interested in exploring what (if anything) can be done to describe nullability in JSON types. Pretty much all the null reference problems in my experience occur around the use of JSON. Since null is explicitly a JSON value, perhaps it should be an error to access a property on a json-derived type without a type guard:
interface Foo extends json {
bar: string;
}
var x: Foo;
x.bar.toString(); // Error
x.bar && x.bar.toString(); // okay
@jbondc, I'm not sure that I understand. How does that help to provide compile-time safety for JSON?
Seems like not-null types
Yes, but it shouldn't break existing code ;) I'm trying to explore the implications of defining a json type, same as any or int. As I noted above, this type would be an ideal candidate to have default nullable properties, firstly because that's part of the JSON spec and secondly, since we use JSON to transfer data, that's where a lot of the null reference problems occur.
JSON is a structured object, so an interface should be able to describe with 100% fidelity the data it contains.
Yes, but with int, for example, the following is an error:
var x: int = "10";
But not so for interfaces:
interface Foo extends json {
html: HTMLElement; // We would like this to be an error
}
Since when is JSON an object?
type Json = string;
@AlicanC,
JSON is a string over the wire, but when one calls JSON.parse it becomes an object. This object is a subset of regular JavaScript and that is what we'd like to model.
Yes, the JSON becomes an object and it's not JSON anymore. JSON is what you give to JSON.parse(), not what you get from it. What you get is a value. Maybe you guys should rename your types to JsonValue.
Maybe you guys should rename your types to JsonValue.
Can you elaborate? Which guys and what types?
Seems like the problems that you described above are resolved now? Trying out this approach and it looks like it's working just fine.
Also: I think this has deeper utility than just representing some arbitrary format like JSON. It enables us to more clearly distinguish data from operations on data. That's quite a fundamental piece of our domain as software developers; I don't think it would be a totally useless concept to have available and tangible in code, even if OOP still might be the norm.
https://github.com/electricessence/TypeScript.NET/blob/master/source/JSON.d.ts
This is what I have, but I'm not sure if it's really adding any value. I still have to constantly add <type> constraints everywhere to make sure it's correct.
I'm wondering if it's simply more effective to do run-time validation than be concerned with compile time constraints.
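For what it's worth, the run-time side is easy enough to sketch (illustrative only; isJsonValue below is not from any library):

```ts
// Illustrative run-time check: accepts null, primitives, and plain arrays/objects
// whose members are themselves JSON values. Not exhaustive (e.g. NaN still slips through).
function isJsonValue(value: any): boolean {
    if (value === null) return true;
    const t = typeof value;
    if (t === "string" || t === "number" || t === "boolean") return true;
    if (Array.isArray(value)) return value.every(isJsonValue);
    if (t === "object") {
        return Object.keys(value).every(key => isJsonValue(value[key]));
    }
    return false;
}
```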
I think part of the challenge is that a JSON blob can consist of either a map or an array and in those cases the indexers are different.
So for example, I've had to use <T extends JsonMap | JsonArray> for the expected output parameter.
https://github.com/electricessence/TypeScript.NET/blob/master/_utility/file-promise.ts#L59-L80
And therefore I still have to pass one or the other in order for it to work, as shown in the first link.
So again, I wish there were more examples of where typing JSON helps.
This type would be useful for data that must be able to roundtrip through JSON serialization and deserialization, such as data to be stored or otherwise faithfully reproduced across a remoting boundary.
It seems to me that this simple definition of Json works to provide some guidance for a library that works with arbitrary user data in that way, but there are a few problems. I don't know if there is already a solution to them, or if not, what the solution ought to be. I'm just saying it would be useful. Not sure why the type system needs an explicit extends JsonMap here to recognize the compatibility. I don't think there's any way to disallow classes from extending Json.
export interface JsonMap { [member: string]: string | number | boolean | null | JsonArray | JsonMap };
export interface JsonArray extends Array<string | number | boolean | null | JsonArray | JsonMap> {}
export type Json = JsonMap | JsonArray | string | number | boolean | null;
interface Document extends JsonMap {
one: string;
two: boolean;
3.141592: "pi" | boolean;
}
function clone<T extends Json>(data : T) { return <T>(JSON.parse(JSON.stringify(data))); }
var a : Json = "Hello"; clone(a).toLowerCase();
var b : Json = 42; clone(b).toExponential();
var c : Json = true; { let t = clone(c); }
var d : Json = null; { let t = clone(d); }
var e : Json = [1, 2, ""]; clone(e).length;
var f : Json = {}; { let t = clone(f); } // f : JsonMap
var g : Json = { a: "Hello" }; clone(g).a; // g .. l : JsonMap, not especially useful as written
var h : Json = { b: 42 };
var i : Json = { c: true };
var j : Json = { d: null };
var k : Json = { e: [1, 2, ""] };
var l : Json = { e: { 5.4: "foo", mixed: "key types" } };
var m : Json = () => "baz"; // ERROR
var n : JsonMap = { a : "bar", fn: () => "baz" }; // ERROR
var o : JsonMap = [{ a : "bar", fn: () => "baz" }]; // ERROR
{
let p : Document = { one : "foo", two : false, 3.141592: "pi" };
let t = clone(p); // t : Document
t.one; // : string
t.two; // : boolean
let tt = t[3.141592]; // tt : "pi" | boolean
let tu = t[3.1415]; // tu : string | number | boolean | null | JsonArray | JsonMap
}
var q = () => "baz"; clone(q); // ERROR
var r = { a : "bar", fn: () => "baz" }; clone(r); // ERROR
var s = [{ a : "bar", fn: () => "baz" }]; clone(s); // ERROR
class MyClass implements JsonMap { [key : string] : string }
clone(new MyClass()) instanceof MyClass; // problem: false
// Structural typing isn't enough for otherwise-compatible interfaces
interface OtherLibDocument {
one: string;
two: boolean;
3.141592: "pi" | boolean;
}
{
let p2 : OtherLibDocument = { one : "foo", two : false, 3.141592: "pi" };
let t = clone(p2); // problem: Property 'includes' is missing in type 'OtherLibDocument'
}
interface OtherLibDocument2 {
one: string;
two: boolean;
3.141592: "pi" | boolean;
}
// Definition-Merge other libraries' compatible interfaces
interface OtherLibDocument2 extends JsonMap { }
{
let p3 : OtherLibDocument2 = { one : "foo", two : false, 3.141592: "pi" };
let t = clone(p3); // t : OtherLibDocument2
t.one; // : string
t.two; // : boolean
let tt = t[3.141592]; // tt : "pi" | boolean
let tu = t[3.1415]; // tu : string | number | boolean | null | JsonArray | JsonMap
}
@NoelAbrahams
The point is classes encapsulate behaviour. JSON represents data. People who .stringify classes are a special breed - rather like the dodo :smile:.
I think you are on to something here, but judging by the number of Angular users on Stack Overflow who expect the simple _use_ of TypeScript to automagically transform JSON.parse into Newtonsoft.Json.JsonConvert.DeserializeObject, they have no idea they are destined for extinction.
Another "me too." Here's what we ended up with:
export type JSONPrimitive = string | number | boolean | null;
export type JSONValue = JSONPrimitive | JSONObject | JSONArray;
export type JSONObject = { [member: string]: JSONValue };
export interface JSONArray extends Array<JSONValue> {}
Unfortunately, usage on arbitrary types requires a type assertion like unmarshal(jsonType as {}).
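To make the assertion point concrete, a sketch (unmarshal here is just a hypothetical signature, not a real API):

```ts
// Hypothetical consumer of the JSONValue type above.
declare function unmarshal(value: JSONValue): void;

interface Point { x: number; y: number; }
declare const p: Point;

// unmarshal(p);    // error: Index signature is missing in type 'Point'
unmarshal(p as {}); // compiles, but only because the assertion widens the type away
```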
Looks like I'm not too late to the party. A Google search didn't take long to hit on this issue and what others suggest. What I came up with is basically the same as @niedzielski's. Despite caveats, it's simple enough to be useful and sits a comfortable distance from settling for just any.
In contrast to @niedzielski's version, I'm preferring Json to the all-caps JSON prefix, and JsonMap instead of JsonObject (as others also have above), as "Map" feels less overloaded than "Object". Also going with AnyJson rather than JsonValue.
The end result is:
type AnyJson = boolean | number | string | null | JsonArray | JsonMap;
interface JsonMap { [key: string]: AnyJson; }
interface JsonArray extends Array<AnyJson> {}
Ambivalent about whether or not to separate out JsonPrimitive.
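For reference, the variant with the primitives split out into their own alias would just be:

```ts
type JsonPrimitive = boolean | number | string | null;
type AnyJson = JsonPrimitive | JsonArray | JsonMap;
interface JsonMap { [key: string]: AnyJson; }
interface JsonArray extends Array<AnyJson> {}
```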
For those finding this issue after the release of TypeScript 2.9, TypeScript now has Support for well-typed JSON imports.
TypeScript is now able to import JSON files as input files when using the node strategy for moduleResolution. This means you can use .json files as part of your project, and they'll be well-typed!
These JSON files will also carry over to your output directory so that things "just work" at runtime.
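For anyone wanting to try it, a minimal setup looks roughly like this (the settings.json file and its contents are made up for illustration):

```ts
// tsconfig.json (excerpt):
// {
//   "compilerOptions": {
//     "module": "commonjs",
//     "moduleResolution": "node",
//     "resolveJsonModule": true,
//     "esModuleInterop": true
//   }
// }

// ./settings.json contains: { "debug": true, "retries": 3 }
import settings from "./settings.json";

settings.debug;   // inferred as boolean
settings.retries; // inferred as number
```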
This would still be quite useful. For example, I would like to describe an RPC interface via TypeScript, but assert that inputs/outputs are JSON-serializable. Unfortunately, even with the JSON* interfaces described above, that doesn't seem feasible with current TypeScript:
export type JSONPrimitive = string | number | boolean | null;
export type JSONValue = JSONPrimitive | JSONObject | JSONArray;
export type JSONObject = { [member: string]: JSONValue };
export interface JSONArray extends Array<JSONValue> {}
export interface ServiceDeclaration {
[key: string]: (params?: JSONObject) => Promise<JSONValue>;
}
// Expected: No errors.
interface MyService extends ServiceDeclaration {
// Error: Property 'doThing' of type '(params?: { id: string; } | undefined) => Promise<string>' is not assignable to string index type '(params?: JSONObject | undefined) => Promise<JSONValue>'.
doThing(params?: { id: string }): Promise<string>;
}
One trick that gets me closer is to have template types that ask for keys explicitly, to drop the index type (e.g. the same as TypeScript's built-in Record type):
export type IsJSONObject<TKeys extends string = string> = { [Key in TKeys]: JSONValue };
interface Foo {
id: string;
}
function doThing(params: JSONObject) {}
function doThing2<T>(params: IsJSONObject<keyof T>) {}
const foo: Foo = null as any;
// Error: Argument of type 'Foo' is not assignable to parameter of type 'JSONObject'.
// Index signature is missing in type 'Foo'.
doThing(foo);
// No error!
doThing2(foo)
I'm not sure how to express that in terms of ServiceDeclaration, though…
Just wanted to mention that this type would also be very useful for me in light of the TS 3.1 breaking change "narrowing functions now intersects {}, Object, and unconstrained generic type parameters". My existing code essentially has a type that could be either something JSON-serializable or a function; before TS 3.1, the former was blissfully expressed as an unconstrained generic, but now I need to figure out how to narrow its type, and doing that would be much easier if a JSON type existed. (On the other hand, I might be going about solving this problem in entirely the wrong way.)
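A sketch of the kind of narrowing that a standard Json type would make easy (the type name and shape here are assumed, following the definitions earlier in the thread):

```ts
type Json = null | boolean | number | string | Json[] | { [prop: string]: Json };

// Either plain data or a lazy producer of it; typeof narrows cleanly because
// no Json constituent can be a function.
function resolve(value: Json | (() => Json)): Json {
    if (typeof value === "function") {
        return value(); // value: () => Json
    }
    return value;       // value: Json
}
```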
My use case is strictly typing cloneable values that can be passed to WebWorkers through postMessage(). If an object contains a method, for example, it will throw a runtime error when trying to send it to the Worker. This should be possible to catch at compile time.
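A rough sketch of how that constraint could look as a wrapper, again assuming a structural Json type (postJson is just an illustrative name):

```ts
type Json = null | boolean | number | string | Json[] | { [prop: string]: Json };

// Only accept messages that are plain JSON-shaped data, so nothing with methods
// ever reaches the structured-clone step of postMessage.
function postJson(worker: Worker, message: Json): void {
    worker.postMessage(message);
}

declare const worker: Worker;
postJson(worker, { kind: "start", items: [1, 2, 3] });        // OK
// postJson(worker, { kind: "start", onDone: () => void 0 }); // error: functions are not Json
```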
I think @indiescripter got it right. That code snippet is pretty much standard boilerplate for me.
Perhaps that could get added to the TS standard type library? That would solve the issue for me, at least.
I'm in the same boat. Typing an RPC interface, need to describe "arguments must be JSON". Would also love if this was in the standard library.
Another use case is typing JSON Patch (RFC 6902) request objects. For "add", "replace", and "test", the value property must be valid JSON but otherwise can be any arbitrary value (or null). Using the example JSONValue type above (thank you!), I have:
export interface IJsonPatchWithValue {
op: "add" | "replace" | "test";
path: string;
value: JSONValue; // ideally would be "value: json;"
}
export interface IJsonPatchWithFrom {
op: "copy" | "move";
path: string;
from: string;
}
export interface IJsonPatchRemove {
op: "remove";
path: string;
}
export type JsonPatch = IJsonPatchWithValue | IJsonPatchWithFrom | IJsonPatchRemove;
FYI: Just stumbled upon this PR https://github.com/microsoft/TypeScript/pull/33050. I believe it would address this issue.
Yep, I think this can be classed as done in 3.7.
Looking at the nightly playground, it doesn't look like we include a Json type in the global namespace; maybe this issue could represent that now.
I made a playground link on nightly with the example of the issue thread (and added one more). 3 problems are solved now, but 2 are still unsolved.
What about supporting types that have a toJSON function? So we should have two new types in the standard library, Json and JsonSerializable, where the latter expands the former to include complex types which contain a toJSON method.
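A rough sketch of what those two types might look like (names and shape are just one possibility):

```ts
type Json = null | boolean | number | string | Json[] | { [prop: string]: Json };

// Anything that JSON.stringify can handle: plain Json values, values with a
// toJSON method (such as Date), and containers of either.
type JsonSerializable =
    | Json
    | { toJSON(): JsonSerializable }
    | JsonSerializable[]
    | { [prop: string]: JsonSerializable };

const a: Json = { id: 1 };              // OK
const b: JsonSerializable = new Date(); // OK: Date#toJSON returns a string
// const c: Json = new Date();          // error: Date is not plain Json
```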
If this feature were introduced, the TS compiler could generate a JSON.parse of the JSON as a string literal, because it was recently measured to be faster in all JS engines:
https://v8.dev/blog/cost-of-javascript-2019#json
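Purely as an illustration of that idea (this is hypothetical compiler output, not anything TypeScript emits today):

```ts
// Source: a constant the compiler could prove is pure JSON.
const config = { locale: "en", retries: 3, features: ["a", "b"] };

// Hypothetical emit: materialise it via JSON.parse of a string literal instead,
// which the linked V8 article measured as faster for large payloads.
// const config = JSON.parse('{"locale":"en","retries":3,"features":["a","b"]}');
```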
I found this issue when searching for a way to type an arbitrary JSON object that comes from a third party library.
The solution described by @niedzielski works nicely, except that typescript-eslint was giving me a warning when using interface JSONArray extends Array<JSONValue> {}:
"An interface declaring no members is equivalent to its supertype." (@typescript-eslint/no-empty-interface)
So here is my slightly modified version:
export type JSONPrimitive = string | number | boolean | null;
export type JSONValue = JSONPrimitive | JSONObject | JSONArray;
export type JSONObject = { [member: string]: JSONValue };
export type JSONArray = JSONValue[];
Thanks @niedzielski!
Is there any way to use this in a more restrictive way for objects? I want to achieve two things:
1. Extend from something like JSONObject to define interfaces with specific properties, so I can ensure that the actual objects are always JSON compatible and still comply with my definition of properties.
2. Access the properties in the expected way.
The problem with implementations like the above is that if my custom interface extends JSONObject, I can still access object members that should not exist according to my definition, because { [member: string]: JSONValue } still applies to my custom type.
Example:
interface Car extends JSONObject {
make: string;
}
const mycar: Car = {
make: 'Xxx',
price: 1000, // This should not be allowed, but it is.
};
const {
make,
colour, // This should not be allowed, but it is.
} = mycar;
Now colour is recognised as a JSONValue, but I really want this to be an error. Also, setting the price property should not be allowed, because it's not part of my definition. I tried a few things with Pick<…> etc. but I can't solve it. In my opinion, the JSONObject-like type should behave more like a native object in TypeScript. Any ideas?
My use case: I already have a pretty large number of interfaces for a JSON-based web API, with functions for both client and server relying on the data being JSON compatible, most of them extending something like the above-mentioned JSONObject. When I have to change one of the defined properties, let's say I remove one, it is very difficult to find all the places in the code base that still rely on that property, and it won't show up anywhere as an error, neither in the IDE nor at compile time.
@Manc
I have a similar use case to yours. Here is one (ugly) workaround I came up with: declare a dummy interface that extends both Car and JSONObject. If Car is no longer compatible with JSONObject, the dummy interface errors:
interface CarJsonCheck extends Car, JSONObject {}
interface Car {
make: string;
}
This has obvious drawbacks, so I will wait until there is something better ...
I think I got it!
If you are wondering about the Pick / Required thing in the middle, it was necessary to accept optional properties, but otherwise disallow undefined. I got the idea from this blog post.
type Json =
| null
| boolean
| number
| string
| Json[]
| { [prop: string]: Json };
type JsonCompatible<T> = {
[P in keyof T]: T[P] extends Json
? T[P]
: Pick<T, P> extends Required<Pick<T, P>>
? never
: T[P] extends (() => any) | undefined
? never
: JsonCompatible<T[P]>;
};
function test<T extends JsonCompatible<T>>(json: T): T {
return null as any;
}
interface A {
a: number;
}
class B {
a!: number;
}
interface C {
a?: number;
}
interface X {
a: Date;
}
interface Y {
a?: Date;
}
interface Z {
a: number | undefined;
}
interface W {
a?: () => any;
}
const any = null as any;
// compiler OK
test(null);
test(false);
test(0);
test("");
test([]);
test({});
test([0]);
test({ a: 0 });
test(any as A);
test(any as B);
test(any as C);
// compiler throws
test(new Date());
test([new Date()]);
test({ a: new Date() });
test({ a: undefined });
test(any as X);
test(any as Y);
test(any as Z);
test(any as W);
@osi-oswald, This is amazing!
It seems like there is a case it doesn't catch, which is an interface with an optional function. The below should fail to type-check, but it doesn't:
interface W {
foo?: () => void,
};
declare const w: W;
test(w);
@skylerjokiel Interesting... I extended my JsonCompatible type to also catch your interface W as non-JsonCompatible.
My use case for this feature is to have the compiler preventing unsafe assumptions about JSON, both when receiving it as input:
const app = express()
app.post('/token', express.json(), (req, res, next) => {
// Compiler ensures I treat this as string | number | null | JSONArray | JSONObject
// and won't let me assume it's a string
const { grant_type } = req.body;
});
And from a service:
const response = await fetch('/.well-known/openid-configuration')
// Compiler ensures I treat this as string | number | null | JSONArray | JSONObject
// and won't let me assume it's a string
const { token_endpoint } = await response.json();
JSON.parse() returning any encourages unsafe property and method access. Now obviously both of these libraries could include a JSON typing, or I could create my own and copy/paste it into every codebase I use. But that's a bit onerous, both to write and to evangelize across a team, hence my +1.
@osi-oswald You should PR that to the DefinitelyTyped repo to get published to NPM.
I still have a use case that this fails to solve, unfortunately. I am designing a TS RPC service that looks like this:
type Json = void | Date | null | boolean | number | string | Json[] | { [prop: string]: Json }
type JsonCompatible<T> = {
[P in keyof T]: T[P] extends Json
? T[P]
: Pick<T, P> extends Required<Pick<T, P>>
? never
: T[P] extends (() => any) | undefined
? never
: JsonCompatible<T[P]>
}
type ApiModule = {
[method: string]: <T extends JsonCompatible<T>>(...args: any[]) => T
}
type ApiDefinition = {
[moduleNamespace: string]: ApiModule
}
type ValidateApiDefinition<T extends ApiDefinition> = T
and when typing out my api definition:
type Api = {
tag: {
search(tagStr: string): string[]
}
}
// validate that the api definition has the correct structure
type X = ValidateApiDefinition<Api>
gives an error
Property 'tag' is incompatible with index signature.
Type '{ search: (tagStr: string) => string[]; }' is not assignable to type 'ApiModule'.
Property 'search' is incompatible with index signature.
Type '(tagStr: string) => string[]' is not assignable to type '<T extends JsonCompatible<T>>(...args: any[]) => T'.
Type 'string[]' is not assignable to type 'T'.
'T' could be instantiated with an arbitrary type which could be unrelated to 'string[]'.
which makes sense; perhaps my error has more to do with the fact that I am attempting to use generics inside an index signature, but I still do not think I am able to type this particular structure in TypeScript.
Update: I was able to achieve a Json type that does not need to be validated with a generic!
type JsonPrimitive = string | number | boolean | null
interface JsonMap extends Record<string, JsonPrimitive | JsonArray | JsonMap> {}
interface JsonArray extends Array<JsonPrimitive | JsonArray | JsonMap> {}
type Json = JsonPrimitive | JsonMap | JsonArray
The trick was using Record and stealing the trick from this user https://github.com/microsoft/TypeScript/issues/14174#issuecomment-518944393 to avoid circular references. The prior example now compiles on the playground.
Constraining the args to be JSON as well turns out to be much harder. E.g. the following fails with the error below:
type Api = {
tag: {
search: (tagStr: string) => string[];
};
}
Type 'Api' does not satisfy the constraint 'ApiDefinition'.
Property 'tag' is incompatible with index signature.
Type '{ search: (tagStr: string) => string[]; }' is not assignable to type 'ApiModule'.
Property 'search' is incompatible with index signature.
Type '(tagStr: string) => string[]' is not assignable to type '(arg: string | number | boolean | JsonMap | JsonArray | null) => string | number | boolean | JsonMap | JsonArray | null'.
Types of parameters 'tagStr' and 'arg' are incompatible.
Type 'string | number | boolean | JsonMap | JsonArray | null' is not assignable to type 'string'.
Type 'null' is not assignable to type 'string'.(2344)
but this is still a good start!
I would argue what you really want to do is something like this:
type Json = void | Date | null | boolean | number | string | Json[] | { [prop: string]: Json }
type JsonCompatible<T> = {
[P in keyof T]: T[P] extends Json
? T[P]
: Pick<T, P> extends Required<Pick<T, P>>
? never
: T[P] extends (() => any) | undefined
? never
: JsonCompatible<T[P]>
}
type ApiModule = {
[method: string]: (...args: any[]) => Json // T // | Rx.Observable<JsonCompatible<T>>
}
type ApiDefinition = {
[moduleNamespace: string]: ApiModule
}
type ValidateApiDefinition<T extends ApiDefinition> = T
type Api = {
tag: {
search: (tagStr: string) => NotJsonAsJson
}
}
type X = ValidateApiDefinition<Api>
class NotJson {
whatAmI() {
return 'not json'
}
}
type NotJsonAsJson = JsonCompatible<NotJson>
The generics in the index signature of your ApiModule type don't really serve a purpose, and all you really want to do is validate that the methods in an ApiModule return something that is Json.
@restjohn I don't think that actually does what I want it to do. The point of the ValidateApiDefinition is to throw a TypeScript error if the return type of a function is not serializable. In reality that type will be part of a createServerRPC or createClientRPC method, and may even do some type manipulation (e.g. wrapping the function responses in promises).
const client = tsRpc.createClientRPC<Api>('/rpc-route')
await client.tags.search('something')
The update I posted above yours solves the use case I was having. I think maybe we just had a real-life race condition and posted answers at the same time :sweat_smile:
Any serializable type based on interfaces is flawed, because it doesn't allow sub-interfaces to constrain their keys. I've recently had success using type aliases, with the limitation that interface types can't be used.
export type SerializableScalar = string | number | boolean | null;
export type SerializableObject = {
[key: string]: SerializableScalar | SerializableObject | SerializableArray;
}
export type SerializableArray = Array<SerializableScalar | SerializableObject | SerializableArray>;
export type Serializable = SerializableScalar | SerializableObject | SerializableArray;
function isomorphicWidget<P extends SerializableObject>(props: P) {
// ...
}
type MyWidgetProps = {
a: number;
b: string;
};
const myWidgetProps: MyWidgetProps = {
a: 1,
b: 'hello',
};
const myIsomorphicWidget = isomorphicWidget(myWidgetProps);
However, I'd love to be able to do this with interfaces too, so I think this needs language-level support.
```ts
type Json = void | Date | null | boolean | number | string | Json[] | { [prop: string]: Json }

type JsonCompatible<T> = {
  [P in keyof T]: T[P] extends Json
    ? T[P]
    : Pick<T, P> extends Required<Pick<T, P>>
      ? never
      : T[P] extends (() => any) | undefined
        ? never
        : JsonCompatible<T[P]>
}
```
@restjohn What's missing to make this work for nested objects? For example, the following fails:
interface Bar {
v: string
}
interface Foo {
bar: Bar
}
let x: JsonCompatible<Foo> = {
bar: {v: '42'}
}
For anyone checking this out so far down the road, here is my attempt at a better solution. I found no solution online that actually seemed to work correctly (including some from this thread).
type primitive = null
| boolean
| number
| string
type DefinitelyNotJsonable = ((...args: any[]) => any) | undefined
export type IsJsonable<T> =
// Check if there are any non-jsonable types represented in the union
// Note: use of tuples in this first condition side-steps distributive conditional types
// (see https://github.com/microsoft/TypeScript/issues/29368#issuecomment-453529532)
[Extract<T, DefinitelyNotJsonable>] extends [never]
// Non-jsonable type union was found empty
? T extends primitive
// Primitive is acceptable
? T
// Otherwise check if array
: T extends (infer U)[]
// Arrays are special; just check array element type
? IsJsonable<U>[]
// Otherwise check if object
: T extends object
// It's an object
? {
// Iterate over keys in object case
[P in keyof T]:
// Recursive call for children
IsJsonable<T[P]>
}
// Otherwise any other non-object no bueno
: never
// Otherwise non-jsonable type union was found not empty
: never
I would love to be told if I'm wrong.
Here is a comparison with @osi-oswald's implementation:
_Edited Nov 5, 2020, to catch all functions per @letmaik's comment_
For anyone checking this out so far down the road, here is my attempt at a better solution. I found no solution online that actually seemed to work correctly (including some from this thread).
@grant-dennison My test case of nested objects passes now, that's great. I found a few issues though:
- Excluding undefined is suboptimal because JSON.stringify will simply omit such fields, which corresponds to being an optional field in TS. I don't see any problem in allowing those and in fact they are often needed.
- test((a: number, b: number) => 0); passes but it shouldn't. Is there a way to capture all possible functions?
- The following also passes:
class ABC {
private x: Uint8Array = new Uint8Array(1)
}
test(new ABC())
I don't think allowing classes generally makes sense, as you wouldn't get them back with JSON.parse automatically.

I think ultimately there should be two versions of this: one which only allows pure values (where JSON.stringify/JSON.parse recovers the input exactly) and another one which also allows objects with toJSON like Date and custom classes (the above would not be allowed). I guess this would then rely on passing a correct reviver to JSON.parse.
I think excluding undefined is suboptimal because JSON.stringify will simply omit such fields which corresponds to being an optional field in TS. I don't see any problem in allowing those and in fact they are often needed.
@letmaik I'd rather the following test pass for all possible objects conforming to a proposed JSON type:
expect(myObject).to.deep.equal(JSON.parse(JSON.stringify(myObject)))
@letmaik Thanks for the pointers.
- I moved undefined from the DefinitelyNotJsonable union type to the primitive union type.
- I replaced () => any with (...args: any[]) => any in DefinitelyNotJsonable.
- The ABC class only passes because x is a private member. Private members are an implementation detail, so I could argue that it doesn't matter if they get lost in the whole JSON.stringify()/JSON.parse() routine.

EDIT: Actually for the third point, you can get a bit closer (passing the scenario you pointed out) by changing the T extends object condition to { [K in keyof T]: T[K] } extends T. Because private members affect type compatibility, the mapped object type isn't compatible with the original.
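Putting those three changes together, a sketch of the revised type (a paraphrase of the edit described above, not the author's exact code):

```ts
type primitive = null | boolean | number | string | undefined;
type DefinitelyNotJsonable = (...args: any[]) => any;

export type IsJsonable<T> =
    // Reject any union member that is a function of any arity
    [Extract<T, DefinitelyNotJsonable>] extends [never]
        ? T extends primitive
            ? T
            : T extends (infer U)[]
                ? IsJsonable<U>[]
                // Mapping the keys back onto T drops private members, so class
                // instances with private state no longer pass this check
                : { [K in keyof T]: T[K] } extends T
                    ? { [P in keyof T]: IsJsonable<T[P]> }
                    : never
        : never;
```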