How to work around Lua script argument limitations in Redis

Published 2019-05-29 01:19

Question:

I am setting up a PHP tagged-cache implementation that will use both Redis and APCu. Since APCu is a plain key-value store, I'm going to keep the key-tag relations in Redis and synchronize the cached values to APCu on each webserver.

My current question regards only Redis. You probably know this pattern, but to make things clear: a key can have tags associated with it, and at some later point in time you can delete cached keys by tag. There are many keys, far fewer tags, and an n-to-n relation between keys and tags.

set(key, value, tags) consists of:

SET key value
foreach tag in tags
    SADD tag key

Because there is no need to retrieve or change the tags after set, I only need to keep tag-to-keys relation.

deleteByTag(tags) is

keys = SUNION tag1 tag2 tag3...
DEL key1 key2 key3...

To make things faster, I created two simple Lua scripts that I load with SCRIPT LOAD and invoke via EVALSHA.

Lua set script:

redis.call('set', KEYS[1], KEYS[2])
for _, tag in pairs(ARGV) do
    redis.call('sadd', tag, KEYS[1])
end

called with

EVALSHA setHash 2 key value tag1 tag2 tag3...

The deleteByTag script I have problems with looks like this:

redis.call('del', unpack(redis.call('sunion', unpack(ARGV))))
redis.call('del', unpack(ARGV))

called with

EVALSHA deleteByTagHash 0 tag1 tag2 tag3...

Everything is fine except when redis.call('sunion', unpack(ARGV)) returns a lot of keys. Lua seems to have a hard limit on the number of arguments a function call can take; in my environment it is 8000.

I want to know if there is a way to clear the keys by tags while avoiding:

  • (1) a round-trip to the server, transferring the keys back and forth to the client
  • (2) a for-each over the keys; I tried this modified script and it is slower than (1)

Here is the (2) that is not working fast enough:

for _, key in pairs(redis.call('sunion', unpack(ARGV))) do
    redis.call('del', key)
end
redis.call('del', unpack(ARGV))
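
For reference, the limit can be reproduced in plain Lua 5.1 (the version embedded in Redis) without touching Redis at all: unpack must push every element onto the C stack at once, so it fails once the element count exceeds the stack-slot budget. A minimal sketch, assuming the default LUAI_MAXCSTACK of 8000:

```lua
-- Build a table comfortably larger than the default LUAI_MAXCSTACK (8000).
local t = {}
for i = 1, 9000 do t[i] = i end

-- unpack() pushes all 9000 results onto the stack at once, so it errors out;
-- pcall catches the failure instead of aborting the script.
local ok, err = pcall(function() return unpack(t) end)
-- ok is false and err mentions "too many results to unpack"
```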

Answer 1:

I'm almost sure you could increase that number (8000) by changing the LUAI_MAXCSTACK value in your environment's luaconf.h and rebuilding the Lua environment.

The default, as you've already noticed, is:

/*
@@ LUAI_MAXCSTACK limits the number of Lua stack slots that a C function
@* can use.
** CHANGE it if you need lots of (Lua) stack space for your C
** functions. This limit is arbitrary; its only purpose is to stop C
** functions to consume unlimited stack space. (must be smaller than
** -LUA_REGISTRYINDEX)
*/
#define LUAI_MAXCSTACK  8000

Only it seems a bit like an ugly hack.

What about collecting the keys in a table and iterating through it in chunks of <= 8000 keys?



Answer 2:

Had the same issue and solved it with this Lua function:

local call_in_chunks = function (command, args)
    local step = 1000
    for i = 1, #args, step do
        redis.call(command, unpack(args, i, math.min(i + step - 1, #args)))
    end
end

-- 'keys' is the table of keys to delete,
-- e.g. keys = redis.call('sunion', unpack(ARGV))
call_in_chunks('del', keys)
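
Putting this together for the question's deleteByTag: a chunk-safe variant might look like the sketch below, using the helper above. It assumes the number of tags itself stays safely below the limit, so only the SUNION result needs chunking, and it keeps the same calling convention as in the question.

```lua
-- deleteByTag, chunk-safe variant (sketch).
-- Invoked as: EVALSHA <sha> 0 tag1 tag2 tag3 ...
local call_in_chunks = function (command, args)
    local step = 1000
    for i = 1, #args, step do
        redis.call(command, unpack(args, i, math.min(i + step - 1, #args)))
    end
end

-- The union may contain far more than 8000 keys; never unpack it all at once.
local keys = redis.call('sunion', unpack(ARGV))

call_in_chunks('del', keys)  -- delete the cached keys in batches
call_in_chunks('del', ARGV)  -- then drop the tag sets themselves
```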


Tags: lua redis