A `createManyXs` mutation to create many nodes in a batched fashion.
Suggested API:
```graphql
mutation {
  createManyItems(data: [{
    name: "Tesla"
    price: "1000"
  }, {
    name: "Audi"
    price: "2"
  }]) {
    ids # list of ids for newly created nodes
    count
  }
}
```
Where would I look in order to implement this?
After all, most of the groundwork for this seems done, and things like `TypeCreateManyInput` already exist when you create nested resources.
I tried to find "CreateManyInput" via code search but turned up empty-handed. Not sure if I was even searching the right repo.
any updates on this feature?
Is it hard to implement? I think this would be very useful.
Any progress?
Are there plans to tackle this any time soon? Or at least some guidance from the Prisma team as to how it should be tackled, so contributors can work on this?
I'm already missing this after 3 hours of trying out Prisma.
Hey guys! This looks pretty simple to implement, since Prisma already has `updateMany`. What are the blockers?
any updates on this?
It would be very handy
One of the first things I'm using Prisma for is bulk import of CSV rows, so I'm a bit saddened by the absence of this feature.
I need this. Please hurry up
I need it tooooo :(
bump
any good workaround for this?
I feel like this is such a must-have. Shame it wasn't added way sooner
any updates on this?
does anyone have a good workaround for this?
It isn't pretty, but I'm temporarily getting around it by building a dynamic template string that prepends an arbitrary unique ID to each data object getting inserted and calling the mutation method. Using axios for the POST, it looks something like this:
```javascript
const players = [{ id: 'a', name: 'playerA' }, { id: 'b', name: 'playerB' }];

// Build one aliased createPlayer mutation per player
let playerMutations = ``;
players.forEach(player => {
  playerMutations += `
    p_${player.id}: createPlayer(
      data: {
        name: "${player.name}"
      }
    ) {
      name
    }
  `;
});

const mutation = `
  mutation {
    ${playerMutations}
  }
`;

try {
  await axios({
    url: 'http://localhost:4000/default/dev',
    method: 'post',
    data: { query: mutation }
  });
} catch (e) {
  console.error('Error in PlayerServices run:', e.message);
  return [];
}
```
@aarshaw check out my comment above
@vladimirvolek @HugoLiconV @gotexis @george1028 @rollrodrig check out my comment above. temp solution that im currently using
You can use raw access (set `rawAccess: true` in the Prisma config). This option provides a `$graphql` function with which you can _insert many values in a single request_.
_docker-compose.yml_

```yaml
PRISMA_CONFIG: |
  port: 4466
  # uncomment the next line and provide the env var PRISMA_MANAGEMENT_API_SECRET=my-secret to activate cluster security
  # managementApiSecret: my-secret
  databases:
    default:
      connector: mysql
      host: mysql
      user: root
      password: prisma
      port: 3306
      rawAccess: true # HERE
      migrations: true
```
_generated/prisma-client/index_

```typescript
export interface Prisma {
  $exists: Exists;
  $graphql: <T = any>(
    query: string,
    variables?: { [key: string]: any }
  ) => Promise<T>;
  // ...
}
```
_example_

```javascript
await prisma.$graphql(`
  mutation {
    executeRaw(
      query: "insert into \"default$default\".\"Address\" (\"address\", \"createdAt\", \"id\", \"updatedAt\") values ('00:01:cc:00:00:02', '2018-10-26T08:51:04.242Z', 'MDA6MDE6Y2M6MDA6MDA6MDIwMD', '2018-10-26T08:51:04.242Z')"
    )
  }
`);
```
more info
https://www.prisma.io/forum/t/how-i-can-create-many-records-at-the-same-time-with-the-prisma-client/4723/5
https://github.com/prisma/prisma/issues/3300#issuecomment-430696015
@vladimirvolek What about SQL injection when using raw queries? Is Prisma already handling it, or should we implement it ourselves?
@sapkra In this case you probably need to handle it yourself since Prisma is not aware of the semantics of the query you are dynamically building.
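As a rough illustration of what "handling it yourself" could look like, here is a minimal sketch. The `escapeSqlString` helper is a hypothetical name, and naive quote-doubling is not a substitute for real driver-level parameterized queries; it only shows the kind of sanitization you are responsible for when building raw SQL strings:

```javascript
// Hypothetical helper: naively escape a string value before
// interpolating it into a raw SQL statement. Doubling single quotes
// keeps the value inside its string literal; a real parameterized
// query at the driver level is the safer option.
function escapeSqlString(value) {
  return `'${String(value).replace(/'/g, "''")}'`;
}

// A value containing a quote no longer breaks out of the literal
const name = "O'Brien";
const query = `insert into "default$default"."Player" ("name") values (${escapeSqlString(name)})`;
// query: insert into "default$default"."Player" ("name") values ('O''Brien')
```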
It has been well over a year at this point, is there any dialogue on this?
You can do something like this 😉
@ahmedB7r, that works when it's only a few items. But if someone is attempting to add > 1000 items, Prisma will begin rejecting the requests after the queue reaches 1000 to protect the database. A feature like `createMany` would allow Prisma to internally queue these requests and handle them as it sees fit, with the possibility of providing updates as the queue is worked to completion.
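Until then, one way to stay under that limit yourself is to chunk the inserts client-side and await each chunk before starting the next. A minimal sketch, where the `createInChunks` name and the per-record `createItem` callback are assumptions standing in for whatever Prisma client call you make per record:

```javascript
// Sketch: split `items` into chunks and run one chunk of requests at
// a time, so no more than `chunkSize` requests are in flight at once.
async function createInChunks(items, createItem, chunkSize = 500) {
  const created = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    const chunk = items.slice(i, i + chunkSize);
    // Wait for the whole chunk to settle before queueing the next one
    created.push(...(await Promise.all(chunk.map(createItem))));
  }
  return created;
}
```

Each call resolves to whatever `createItem` returns, so `created` ends up in the same order as `items`.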
any update on this?
Also waiting on this. Will it be a feature of Prisma2?
Does this mean the team won't work on Prisma anymore? Now that Prisma2 is taking all of their attention?
Exactly. That's how I understood it too.
Maybe you could have a look at this answer: https://spectrum.chat/prisma/general/whats-the-best-way-to-bulk-insert-records-with-the-prisma-clent~ecffdc67-30a2-4bca-a506-8cdfebd8de77?m=MTU1NDcxMDQxMTE5Ng==
Thanks, @Vaneste. We can copy it here just in case the chat gets cleared:
The best way would be to send the request in batches to the Prisma server. I personally use p-map for this purpose (https://github.com/sindresorhus/p-map). Make the batch size around 500:
```javascript
import pmap from 'p-map';

const data = [{ /* ... */ }, { /* ... */ } /* ... */];

// Runs at most 500 create requests concurrently
const result = await pmap(data, mapper, { concurrency: 500 });

async function mapper(item) {
  const res = await prisma.createRecord(item);
  return res;
}
```
Not the best solution, but a good one to have until Prisma 2 comes out.