I'm working on a big project and in some places I have to delete keys by a pattern, for example all the keys that start with filterProducts.
I searched on Google and found a few ways to do it, for example the method in this article.
But I don't know which one is efficient for a big project, could you please help me?
The method you linked to is a pretty terrible way of going about it. KEYS is a no-go for production and DEL is also pretty inefficient. I would check into using SCAN and UNLINK. This way you can delete a handful of keys at a time without tying up Redis.
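For illustration, a minimal sketch of such a SCAN/UNLINK loop, assuming a promise-based client like ioredis where scan resolves to [cursor, keys]; the pattern and count are just examples:

const Redis = require('ioredis');
const redis = new Redis();

// Walk the keyspace with SCAN and remove matches with UNLINK,
// one small batch at a time, so no single command blocks Redis.
async function unlinkByPattern(pattern) {
    let cursor = '0';
    do {
        // SCAN returns [nextCursor, keys]; COUNT is only a hint for batch size
        const [nextCursor, keys] = await redis.scan(cursor, 'MATCH', pattern, 'COUNT', 100);
        if (keys.length) {
            await redis.unlink(...keys);
        }
        cursor = nextCursor;
    } while (cursor !== '0'); // a cursor of "0" means the iteration is complete
}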
All that being said, that type of script is up to you and beyond the scope of the client library. I'm going to close the issue.
@MohammedAl-Mahdawi I have a similar use case in one of my projects and use the scanStream implementation offered by ioredis. This is basically an easier way to use the scan command: https://github.com/luin/ioredis#streamify-scanning
If you use SCAN and UNLINK like @stockholmux already mentioned, you should be able to remove any number of keys without blocking Redis.
@dirkbonhomme Thank you so much for your help, I really appreciate that.
I followed your advice, used ioredis, and created the following functions. Could you please tell me whether, for big projects, this is the most efficient way to do it or not:
//key example "prefix*"
function getKeysByPattern(key) {
    return new Promise((resolve, reject) => {
        var stream = redis.scanStream({
            // only returns keys following the pattern of "key"
            match: key,
            // returns approximately 100 elements per call
            count: 100
        });
        var keys = [];
        stream.on('data', function (resultKeys) {
            // `resultKeys` is an array of strings representing key names
            for (var i = 0; i < resultKeys.length; i++) {
                keys.push(resultKeys[i]);
            }
        });
        stream.on('end', function () {
            resolve(keys);
        });
    });
}
//key example "prefix*"
function deleteKeysByPattern(key) {
    var stream = redis.scanStream({
        // only returns keys following the pattern of "key"
        match: key,
        // returns approximately 100 elements per call
        count: 100
    });
    var keys = [];
    stream.on('data', function (resultKeys) {
        // `resultKeys` is an array of strings representing key names
        for (var i = 0; i < resultKeys.length; i++) {
            keys.push(resultKeys[i]);
        }
    });
    stream.on('end', function () {
        redis.unlink(keys);
    });
}
@MohammedAl-Mahdawi looks good to me. The only possible issue is that you will load all matching keys into memory at once. If that's a problem, you could remove them in batches as soon as they come in:
stream.on('data', function (resultKeys) {
    if (resultKeys.length) {
        redis.unlink(resultKeys);
    }
});
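If the UNLINK calls could pile up faster than they finish, a possible refinement (just a sketch, building on the snippet above) is to pause the stream while each batch is removed and resume it afterwards:

stream.on('data', function (resultKeys) {
    if (resultKeys.length) {
        // stop emitting new batches until this UNLINK has completed
        stream.pause();
        redis.unlink(resultKeys).then(function () {
            stream.resume();
        });
    }
});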
@dirkbonhomme and @MohammedAl-Mahdawi This method has atomicity issues that could bite you in production.
@stockholmux Thank you for your help. So what is the right solution to this problem, please?
For anyone looking for the final result we have reached so far, and which we believe is the most efficient way, here it is:
//key example "prefix*"
function getKeysByPattern(key) {
    return new Promise((resolve, reject) => {
        var stream = redis.scanStream({
            // only returns keys following the pattern of "key"
            match: key,
            // returns approximately 100 elements per call
            count: 100
        });
        var keys = [];
        stream.on('data', function (resultKeys) {
            // `resultKeys` is an array of strings representing key names
            for (var i = 0; i < resultKeys.length; i++) {
                keys.push(resultKeys[i]);
            }
        });
        stream.on('end', function () {
            resolve(keys);
        });
    });
}

//key example "prefix*"
function deleteKeysByPattern(key) {
    var stream = redis.scanStream({
        // only returns keys following the pattern of "key"
        match: key,
        // returns approximately 100 elements per call
        count: 100
    });
    var keys = [];
    stream.on('data', function (resultKeys) {
        // `resultKeys` is an array of strings representing key names
        for (var i = 0; i < resultKeys.length; i++) {
            keys.push(resultKeys[i]);
        }
    });
    stream.on('end', function () {
        redis.unlink(keys);
    });
}

//key example "prefix*"
function batchDeletionKeysByPattern(key) {
    var stream = redis.scanStream({
        // only returns keys following the pattern of "key"
        match: key,
        // returns approximately 100 elements per call
        count: 100
    });
    stream.on('data', function (resultKeys) {
        if (resultKeys.length) {
            redis.unlink(resultKeys);
        }
    });
}
The code above is for ioredis.
If you know a better way of doing this, please don't hesitate to share.
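A minimal usage sketch for the functions above, tying this back to the original question (removing every key that starts with filterProducts); the pattern is just an example:

// remove all keys whose names start with "filterProducts", batch by batch
batchDeletionKeysByPattern('filterProducts*');

// or collect the matching key names first if you need to inspect them
getKeysByPattern('filterProducts*').then(function (keys) {
    console.log('matched keys:', keys);
});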
@MohammedAl-Mahdawi,
I am getting this error, any ideas on the same?
Codebase:
let redisDel = () => {
    var key = "employees*";
    return new Promise((resolve, reject) => {
        var stream = redis.scanStream({
            // only returns keys following the pattern of "key"
            match: key,
            // returns approximately 100 elements per call
            count: 100
        });
        stream.on('data', function (resultKeys) {
            if (resultKeys.length) {
                console.log(resultKeys);
                redis.unlink(resultKeys);
            }
        });
        stream.on('end', function () {
            resolve();
        });
    });
}
Can anyone give a perfect solution here?
@siddhkadam1881 The unlink command is only available since Redis 4 (https://redis.io/commands/unlink). You are probably running an older version. Try del instead.
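If the same code has to run against both older and newer servers, one possible workaround (a sketch, assuming ioredis; the helper name is made up) is to check the server version once and fall back to del when unlink is not available:

// Resolve the removal command supported by the connected server:
// "unlink" (non-blocking) on Redis >= 4, "del" otherwise.
async function getRemovalCommand(redis) {
    const info = await redis.info('server');
    const match = info.match(/redis_version:(\d+)/);
    const majorVersion = match ? parseInt(match[1], 10) : 0;
    return majorVersion >= 4 ? 'unlink' : 'del';
}

// usage inside the stream handler: redis[command](resultKeys)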
I want to understand how the scanning is done. In the solution by @MohammedAl-Mahdawi, if deletion happens in batches, is there a possibility that new keys added in that time, which we don't wish to delete, will be scanned and deleted? How about a Lua script?
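Regarding the scanning question: SCAN guarantees that keys present for the whole iteration are returned, but keys added while the scan is running may or may not be included, so a newly added key that matches the pattern can indeed end up deleted (it is not guaranteed either way). A Lua script makes the whole operation atomic at the cost of blocking the server while it runs; below is a minimal sketch with ioredis (KEYS plus unpack only suits small keyspaces of at most a few thousand matches):

// Atomic pattern delete via a server-side Lua script.
// Note: KEYS blocks Redis while it walks the whole keyspace, and
// Lua's unpack() only handles a few thousand arguments, so this
// suits small keyspaces where atomicity matters more than latency.
const luaDeleteByPattern = `
    local keys = redis.call('KEYS', ARGV[1])
    if #keys == 0 then
        return 0
    end
    return redis.call('UNLINK', unpack(keys))
`;

function atomicDeleteKeysByPattern(pattern) {
    // eval(script, numberOfKeys, ...args): no KEYS[] are used here,
    // the pattern is passed as ARGV[1]
    return redis.eval(luaDeleteByPattern, 0, pattern);
}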
Anyone have any ideas why this solution wouldn't work on a very simple 1 instance ElastiCache server, but does work locally? I'm able to clear keys normally, but if I try to use the stream implementation it just never returns any keys on ElastiCache.