This leans heavily on the RFC at https://github.com/APIs-guru/graphql-constraints-spec and touches on issues covered in https://github.com/graphcool/graphcool/issues/23
We propose a constraint directive `@constraint` that can be used to annotate types and force validation of input parameters against its constraint arguments. This proposal differs from the RFC in that it lays out a streamlined implementation at Graphcool.
`@constraint`

NOTE: null values are handled by GraphQL natively; none of the constraints affect nullability.
The constraint directive covered here can be used on field definitions of model types.
Only one instance of a constraint directive per field is allowed. Additionally, the existing `@isUnique` directive can be applied to a field.
When a constraint directive contains more than one constraint, they are combined with a logical AND.
In GraphQL there are types (`String`, `Int`, `Float`, `ID`, `Boolean` and various user-defined types) and type wrappers (`List` and `Non-Null`). The `Non-Null` type wrapper in itself already places a constraint on the input values. Therefore we propose two main kinds of constraints: type constraints and list constraints. Both kinds are arguments of the `@constraint` directive, and each is only valid on certain types.
The existing `@isUnique` directive is also a constraint, but it has a special role since it takes not only the input value of the current field instance into consideration but also the existing values of other instances.
If a type constraint is applied to a field of `List` wrapper type, it is checked against every value of the list.

Type constraints do not override the semantics and runtime behavior of the GraphQL standard scalars. Moreover, each type constraint is compatible only with specific standard scalars. At a later point we could extend the available constraints to also cover the `Json` and `DateTime` types.
| Directive Type | Float | Int | Boolean | String | ID |
| ------------------- | ----- | --- | ------- | ------ | -- |
| number constraints | + | + | - | - | - |
| string constraints | - | - | - | + | + |
| boolean constraints | - | - | + | - | - |
Providing a constraint in the constraint directive that does not apply to the type of the field definition should result in an error.
`multipleOf: Float!`
An instance is valid only if division by this constraint's value results in an integer.

`max: Float!`
An instance is valid only if it is less than or exactly equal to `max`.

`min: Float!`
An instance is valid only if it is greater than or exactly equal to `min`.

`exclusiveMax: Float!`
An instance is valid only if it is strictly less than (not equal to) `exclusiveMax`.

`exclusiveMin: Float!`
An instance is valid only if it is strictly greater than (not equal to) `exclusiveMin`.

`oneOfNumber: [Float!]!` (alternatively `inNumber`, analogous to the filter syntax)
An instance is valid only if its value is equal to one of the elements in this constraint's array value.

`notOneOfNumber: [Float!]!` (alternatively `notInNumber`, analogous to the filter syntax)
An instance is valid only if its value is not equal to any of the elements in this constraint's array value.

`equalsNumber: Float!`
An instance is valid only if its value is equal to the value of the constraint.

`notEqualsNumber: Float!`
An instance is valid only if its value is not equal to the value of the constraint.
```graphql
type Foo {
  byte: Int @constraint(
    min: 0,
    max: 255
  )
}
```
Examples of valid values for the `byte` field: `155`, `255`, `0`.
Examples of invalid values for the `byte` field: `"string"`, `256`, `-1`.
```graphql
type Foo {
  bitMask: Int @constraint(
    oneOfNumber: [1, 2, 4, 8, 16, 32, 64, 128]
  )
}
```
Examples of valid values for the `bitMask` field: `1`, `16`, `128`.
Examples of invalid values for the `bitMask` field: `"string"`, `3`, `5`.
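The number constraints above can be expressed as a small runtime validator. The following TypeScript sketch is purely illustrative — the `NumberConstraints` interface and `validateNumber` function are hypothetical names, not part of the proposal. Note that `multipleOf` needs a tolerance for `Float` values, since exact division rarely holds in floating-point arithmetic.

```typescript
// Hypothetical shape of the number constraints from this proposal.
interface NumberConstraints {
  multipleOf?: number;
  max?: number;
  min?: number;
  exclusiveMax?: number;
  exclusiveMin?: number;
  oneOfNumber?: number[];
  notOneOfNumber?: number[];
  equalsNumber?: number;
  notEqualsNumber?: number;
}

// Returns the names of all violated constraints. Logical-AND semantics:
// the value is valid only if the returned list is empty.
function validateNumber(value: number, c: NumberConstraints): string[] {
  const violations: string[] = [];
  if (c.multipleOf !== undefined) {
    // Compare the quotient against the nearest integer with a tolerance
    // to absorb floating-point rounding error.
    const q = value / c.multipleOf;
    if (Math.abs(q - Math.round(q)) > 1e-9) violations.push("multipleOf");
  }
  if (c.max !== undefined && !(value <= c.max)) violations.push("max");
  if (c.min !== undefined && !(value >= c.min)) violations.push("min");
  if (c.exclusiveMax !== undefined && !(value < c.exclusiveMax)) violations.push("exclusiveMax");
  if (c.exclusiveMin !== undefined && !(value > c.exclusiveMin)) violations.push("exclusiveMin");
  if (c.oneOfNumber && !c.oneOfNumber.includes(value)) violations.push("oneOfNumber");
  if (c.notOneOfNumber && c.notOneOfNumber.includes(value)) violations.push("notOneOfNumber");
  if (c.equalsNumber !== undefined && value !== c.equalsNumber) violations.push("equalsNumber");
  if (c.notEqualsNumber !== undefined && value === c.notEqualsNumber) violations.push("notEqualsNumber");
  return violations;
}
```

With the `byte` example above, `validateNumber(256, { min: 0, max: 255 })` reports a `max` violation, while `validateNumber(155, { min: 0, max: 255 })` reports none.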
`maxLength: Int!`
The value of this constraint MUST be a non-negative integer. An instance is valid against this constraint if its length is less than or equal to `maxLength`. The length of an instance is defined as the number of its characters.

`minLength: Int!`
The value of this constraint MUST be a non-negative integer. An instance is valid against this constraint if its length is greater than or equal to `minLength`. The length of an instance is defined as the number of its characters.

`startsWith: String!`
An instance is valid if it begins with the characters of the constraint's string.

`endsWith: String!`
An instance is valid if it ends with the characters of the constraint's string.

`contains: String!`
An instance is valid if the constraint's value can be found within the instance string.

`notContains: String!`
An instance is valid if the constraint's value cannot be found within the instance string.

`regex: String!`
An instance is valid if it matches the constraint's regular expression, with `java.util.regex` semantics.

`oneOfString: [String!]!`
An instance is valid only if its value is equal to one of the elements in this constraint's array value.

`notOneOfString: [String!]!`
An instance is valid only if its value is not equal to any of the elements in this constraint's array value.

`equalsString: String!`
An instance is valid only if its value is equal to the value of the constraint.

`notEqualsString: String!`
An instance is valid only if its value is not equal to the value of the constraint.
```graphql
type Foo {
  string: String @constraint(
    regex: "^[0-9a-zA-Z]*$"
  )
}
```
Examples of valid values: `"foo1"`, `"Apollo13"`, `"123test"`.
Examples of invalid values: `3`, `"dash-dash"`, `"[email protected]"`.
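The string constraints can be sketched the same way. Again, `StringConstraints` and `validateString` are made-up names for illustration. One caveat baked into the comments: JavaScript's `RegExp` and `String.prototype.length` differ from `java.util.regex` and "number of characters" in edge cases, so this is only an approximation of the semantics described above.

```typescript
// Hypothetical shape of the string constraints from this proposal.
interface StringConstraints {
  maxLength?: number;
  minLength?: number;
  startsWith?: string;
  endsWith?: string;
  contains?: string;
  notContains?: string;
  regex?: string;
  oneOfString?: string[];
  notOneOfString?: string[];
  equalsString?: string;
  notEqualsString?: string;
}

function validateString(value: string, c: StringConstraints): string[] {
  const violations: string[] = [];
  // CAVEAT: value.length counts UTF-16 code units, not characters;
  // a character-exact count would need e.g. [...value].length.
  if (c.maxLength !== undefined && value.length > c.maxLength) violations.push("maxLength");
  if (c.minLength !== undefined && value.length < c.minLength) violations.push("minLength");
  if (c.startsWith !== undefined && !value.startsWith(c.startsWith)) violations.push("startsWith");
  if (c.endsWith !== undefined && !value.endsWith(c.endsWith)) violations.push("endsWith");
  if (c.contains !== undefined && !value.includes(c.contains)) violations.push("contains");
  if (c.notContains !== undefined && value.includes(c.notContains)) violations.push("notContains");
  // CAVEAT: JavaScript RegExp, not java.util.regex as in the proposal;
  // the common subset behaves the same, but the dialects differ.
  if (c.regex !== undefined && !new RegExp(c.regex).test(value)) violations.push("regex");
  if (c.oneOfString && !c.oneOfString.includes(value)) violations.push("oneOfString");
  if (c.notOneOfString && c.notOneOfString.includes(value)) violations.push("notOneOfString");
  if (c.equalsString !== undefined && value !== c.equalsString) violations.push("equalsString");
  if (c.notEqualsString !== undefined && value === c.notEqualsString) violations.push("notEqualsString");
  return violations;
}
```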
`equalsBoolean: Boolean!`
An instance is valid only if its value is equal to the value of the constraint.

`notEqualsBoolean: Boolean!`
An instance is valid only if its value is not equal to the value of the constraint.
```graphql
type Foo {
  boolean: Boolean @constraint(
    equalsBoolean: true
  )
}
```
Examples of valid values: `true`.
Examples of invalid values: `false`.
`oneOfEnum: [ENUM!]!`
An instance is valid only if its value is equal to one of the listed values. The listed values must belong to the enum.

`notOneOfEnum: [ENUM!]!`
An instance is valid only if its value is not equal to any of the listed values. The listed values must belong to the enum.
```graphql
type Foo {
  enum: Enum @constraint(
    oneOfEnum: [A, B, C]
  )
}

enum Enum {
  A
  B
  C
  D
}
```
Examples of valid values: `A`.
Examples of invalid values: `D`.
List constraints can be applied to list fields. Applying a list constraint to a non-list field should result in an error. List values can occur on scalar fields like `[Int!]` or `[Float!]!` or on relation fields like `[Post!]!`.
`maxItems: Int!`
The value MUST be non-negative. An instance is valid only if its size is less than or equal to the value of the constraint.

`minItems: Int!`
The value MUST be non-negative. An instance is valid only if its size is greater than or equal to the value of the constraint.

`uniqueItems: Boolean!`
If set to `true`, the instance is valid if all of its elements are unique. (For relation fields, uniqueness is determined by the partner's id. Since we only allow each unique pair once per relation, this will always succeed when set to `true` and fail when set to `false`.)
```graphql
type Foo {
  point3D: [Float!] @constraint(
    maxItems: 3,
    minItems: 3
  )
}
```
Examples of valid values for the `point3D` field: `[1, 2, 3]`, `[-10, 2.5, 100]`.
Examples of invalid values for the `point3D` field: `[-1, 0]`, `[-1, 0, 100, 0]`.
```graphql
type Foo {
  pointOnScreen: [Float!] @constraint(
    maxItems: 2,
    minItems: 2,
    min: 0.0
  )
}
```
Examples of valid values for the `pointOnScreen` field: `[1, 2.5]`, `[0, 100]`.
Examples of invalid values for the `pointOnScreen` field: `[-10, 100]`, `[100, -100]`, `[0, 0, 0]`.
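The three list constraints are cheap to check in application code. A minimal TypeScript sketch, assuming plain scalar lists (`ListConstraints` and `validateList` are illustrative names; for relation fields the proposal compares partner ids instead of element values):

```typescript
// Hypothetical shape of the list constraints from this proposal.
interface ListConstraints {
  maxItems?: number;
  minItems?: number;
  uniqueItems?: boolean;
}

// Returns the names of all violated list constraints.
function validateList<T>(items: T[], c: ListConstraints): string[] {
  const violations: string[] = [];
  if (c.maxItems !== undefined && items.length > c.maxItems) violations.push("maxItems");
  if (c.minItems !== undefined && items.length < c.minItems) violations.push("minItems");
  // A Set collapses duplicates, so a size mismatch means repeated elements.
  if (c.uniqueItems && new Set(items).size !== items.length) violations.push("uniqueItems");
  return violations;
}
```

For the `point3D` example, `validateList([-1, 0], { minItems: 3, maxItems: 3 })` reports a `minItems` violation.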
Constraint directives are part of the consistency guarantee for the stored data. Therefore, when a new constraint directive is created, it is first checked against the field it is applied to. If the constraints do not match the field definition, an error is returned. If the constraint directive fits the field, all existing values have to be checked for whether they would violate any of the constraints. If so, an error should be returned with the offending values (or ids?). Updating a constraint directive triggers the same checks.
When creating or updating a data item, the input values are checked against the existing constraint directives. Violating a constraint leads to an error message that outputs the violated constraint, the value it was set to, and the offending input value. The list constraints `minItems` and `maxItems`, when applied to relation fields, additionally need to be checked upon deletion of related data items.
```json
{"data":{"createAuthor":null},"errors":[{"locations":[{"line":1,"column":11}],"path":["createAuthor"],"code":3035,"message":"The input value violated one or more constraints: The inputvalue: 'not-identical' violated the constraint 'equalsString' with value: 'identical ","requestId":"test-request-id"}]}
```
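A sketch of how such an error entry could be assembled from a list of detected violations. The `ConstraintViolation` shape and `constraintError` helper are hypothetical; the `code` value and message wording just mirror the example response and are not a spec:

```typescript
// Hypothetical record of a single detected violation.
interface ConstraintViolation {
  constraint: string;      // e.g. "equalsString"
  constraintValue: unknown; // the value the constraint was set to
  inputValue: unknown;      // the offending input value
}

// Builds an error entry shaped like the example response above.
function constraintError(path: string, violations: ConstraintViolation[]) {
  const details = violations
    .map(v => `The input value: '${v.inputValue}' violated the constraint ` +
              `'${v.constraint}' with value: '${v.constraintValue}'`)
    .join("; ");
  return {
    path: [path],
    code: 3035, // error code taken from the example above
    message: `The input value violated one or more constraints: ${details}`,
  };
}
```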
```graphql
type Foo {
  bar: [Float!] @constraint(
    multipleOf: 0.01,
    minItems: 1,
    maxItems: 3,
    uniqueItems: true
  )
}
```
Examples of valid values for the `bar` field: `[1, 2, 3]`, `[0.01, 0.02]`, `[0.99]`.
Examples of invalid values for the `bar` field: `[0.999]`, `[]`, `[1, 2, 3, 4]`, `[1.001, 2]`, `[1, 1]`.
```graphql
type Query {
  allPersons(
    first: Int @constraint(min: 1, max: 25)
    after: String
    last: Int @constraint(min: 1, max: 25)
    before: String
  ): [Foo!]
}
```
Examples of valid values for the `first` and `last` input arguments: `1`, `25`, `10`.
Examples of invalid values for the `first` and `last` input arguments: `0`, `30`.
This is a great proposal @do4gr. Good job! 👍
Regarding "Checking of Constraint Directives": I think it's crucial that every constraint can be directly mapped to a database feature to make this feasible for large datasets. Is this currently the case for all proposed constraints?
As Johannes commented, these checks can be either implemented in our Scala backend or in the DB itself.
SQL offers CHECK constraints on columns that can be used to implement these constraints at the database level. Naive possible SQL statements for each constraint follow (without regard to speed for now).
| Constraint | Check statement |
| -------------------- | ---------------- |
| **Number constraints** | |
| multipleOf | |
| max | `value <= argument` |
| min | `value >= argument` |
| exclusiveMax | `value < argument` |
| exclusiveMin | `value > argument` |
| oneOfNumber | `value IN (argument)` |
| notOneOfNumber | `value NOT IN (argument)` |
| equalsNumber | `value = argument` |
| notEqualsNumber | `value <> argument` |
| **String constraints** | |
| maxLength | `CHAR_LENGTH(value) <= argument` |
| minLength | `CHAR_LENGTH(value) >= argument` |
| startsWith | use `REGEXP`? |
| endsWith | use `REGEXP`? |
| contains | `INSTR(value, argument) > 0` |
| notContains | `INSTR(value, argument) = 0` |
| regex | use `REGEXP` |
| oneOfString | `value IN (argument)` |
| notOneOfString | `value NOT IN (argument)` |
| equalsString | `value = argument` |
| notEqualsString | `value <> argument` |
| **Boolean constraints** | |
| equalsBoolean | `value = argument` |
| notEqualsBoolean | `value <> argument` |
| **List constraints** | |
| maxItems | these are tricky |
| minItems | these are tricky |
| uniqueItems | these are tricky |
Adding constraints to a column could be fairly simple, as in:

```sql
ALTER TABLE User
  ADD CONSTRAINT Name CHECK (CHAR_LENGTH(Name) >= argument);
```
Check constraints are attractive since they can be configured to guard updates and creates of data, and also to verify on creation that existing data is valid.
Unfortunately, MySQL, which we use, has not implemented the `CHECK` constraint, even though the bug report about it was filed 13 years ago. The possible workarounds, for example using generated columns to guard updates and creates, are a lot uglier since they add additional columns.
```sql
ALTER TABLE User
  ADD COLUMN Name_minLength CHAR(0) AS
    (CASE WHEN CHAR_LENGTH(Name) > 4 THEN '' END)
    VIRTUAL NOT NULL;
```
An additional check would be necessary upon creation or update of the constraint itself to ensure that existing data does not violate the constraint. This can be a simple select statement:

```sql
SELECT COUNT(*) AS Count
FROM User
WHERE CHAR_LENGTH(Name) <= 4;
```
Our implementation of list values poses a challenge for expressing constraints purely on the DB level, though. Since we store list values as JSON in string fields in the DB, validating existing list values with generated SQL queries would get too complicated, I think.

Since our chosen way of storing list values and our choice of MySQL do not allow a clean DB-only solution at the moment, I would propose handling the validation in the Scala layer and failing before hitting the DB with invalid values.
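To illustrate the application-layer approach for list fields: since list values are stored as JSON-encoded arrays in string columns, a check would first parse the stored string and then apply the list constraints. A TypeScript sketch of the idea (the storage format is assumed here to be a plain JSON array; `validateStoredList` is a made-up name):

```typescript
// Validate a list value as stored in the DB (a JSON-encoded array in a
// string column) against minItems/maxItems/uniqueItems before writing.
function validateStoredList(
  json: string,
  c: { minItems?: number; maxItems?: number; uniqueItems?: boolean }
): string[] {
  let items: unknown[];
  try {
    const parsed = JSON.parse(json);
    if (!Array.isArray(parsed)) return ["notAList"];
    items = parsed;
  } catch {
    return ["invalidJson"]; // malformed stored value
  }
  const violations: string[] = [];
  if (c.minItems !== undefined && items.length < c.minItems) violations.push("minItems");
  if (c.maxItems !== undefined && items.length > c.maxItems) violations.push("maxItems");
  // Serialize elements so structural duplicates are caught, not just
  // reference-equal ones.
  if (c.uniqueItems && new Set(items.map(i => JSON.stringify(i))).size !== items.length) {
    violations.push("uniqueItems");
  }
  return violations;
}
```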
Hi!
Are there any updates on this?
Michael
Any updates?
Any updates?
Hey, at the moment we are focusing on our database connector preview, but constraints are definitely still on our todo list.
Hello, this feature is awesome, looking forward to using it! @do4gr, do we have an ETA on this, now that the Postgres connector has been released (also awesome, btw)?
Thanks for the efforts!
Is this what https://github.com/confuser/graphql-constraint-directive package implements?
Critically needed feature. Any update / ETA?
Hey, sorry at the moment I cannot give you an ETA on this. We still want to implement this, but have a bunch of other things that are higher priority at the moment. Once we schedule it I'll report back.
@do4gr Any updates on schedule of this?
Validators from validator.js look very useful too.
BTW, @confuser, are there any plans to implement them in https://github.com/confuser/graphql-constraint-directive?
@FluorescentHallucinogen Some of them are already implemented under the `format` argument. If any further validators are needed, feel free to open an issue (the package already uses `validator`, `graphql` and `graphql-tools` as dependencies).

@confuser Just found a comment that `graphql-constraint-directive` doesn't work with Apollo Server 2: https://medium.com/@koresar/please-note-the-constraint-directive-does-not-work-if-you-use-schema-stitching-c0eb0567111f. Is this true? If so, are there any plans to fix it?
I think it would be great to support `json-schema` for the `JSON` and `JSONB` types.
Please take into account that constraints can affect multiple fields in the same row. For example, you might want an employee's commission to always be smaller than their salary.
+1 for this feature, would be a great addition to Prisma. I'd also like to add that being able to constrain an enum field to a subset of the enum values would be immensely valuable.
That's a great suggestion @afefer. @do4gr can you adjust the original proposal to accommodate for this feature?
any update on constraints?
I worry about the flexibility of this solution. Shouldn't validation be done at the code (application) level rather than at the database (Prisma) level, by packages like https://github.com/confuser/graphql-constraint-directive and https://github.com/vsimko/node-graphql-constraint-lambda?
If Prisma supports a `@constraint` directive, does this mean that the `@constraint` directive name will be reserved for Prisma and will conflict with the `@constraint` directive from the existing npm packages?

IMO, preserving (not stripping out) custom directives defined in `datamodel.graphql` when generating `prisma.graphql` would be enough. See #1228 for more info.
I would like to define custom directives and be able to provide my own implementation of how exactly these directives (including for validation/restrictions) should work.
Another important question is, should we even use custom directives for validation/restrictions? What about custom scalars (see #3512) for that?
I'm interested in pros and cons in using custom scalars vs. custom schema directives for validation (in the limitations and potential flexibility of each of these approaches).
See, e.g. @okgrow/graphql-scalars for a library of custom GraphQL scalar types for creating precise type-safe GraphQL schemas.
There are many places where directives (in general) can be applied (see the full list here: https://github.com/graphql/graphql-js/blob/master/src/language/directiveLocation.js). Considering this, for validation/restrictions custom directives seem more flexible (granular) than custom scalar types.
Any way to use PostgreSQL's unique together constraint?
> any way to use PostgreSQL's unique together constraint?
For a unique constraint over multiple fields, you can use a unique index, which is being added in datamodel v1.1. This is currently in closed beta; AFAIK you can contact the Prisma team via Slack to join.
It looks like this feature is actively being worked on? Can someone confirm? When is this slated for release?
@schickling is this something that's on the roadmap?
Yes, absolutely. Hopefully shortly after the initial Prisma 2 #4256 launch.
Declarative validation is awesome!
Our team is really looking forward to this! Would greatly help stabilize our schema 👍