In MySQL or Postgres, is there a limit to the size of an IN clause?

Posted 2019-02-13 12:39

Question:

I've got quite a few SQL statements like such:

SELECT foo FROM things WHERE user_id IN (1,2,3..n)

Is there a known limit to the number of elements that will safely fit in an IN clause like that?

Answer 1:

There is no technical limit, but there is a kind of 'good sense' limit.

Having too many elements in the IN clause usually means the query has a questionable design (imho).



Answer 2:

The 1000 limit in PostgreSQL is not a hard limit, it is an optimization limit, i.e., beyond roughly 1000 values PostgreSQL doesn't handle the list very well. Of course, I have to ask what in the world you are doing with a 1000-entry IN clause.
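For what it's worth, a common way to avoid a huge literal IN list in PostgreSQL is to pass the whole list as a single array and use = ANY. A minimal sketch (not part of the original answer), reusing the things table and user_id column from the question and assuming user_id is an integer:

-- literal array form
SELECT foo FROM things WHERE user_id = ANY (ARRAY[1, 2, 3]);

-- or bind the whole list as one array parameter
PREPARE by_ids(int[]) AS
  SELECT foo FROM things WHERE user_id = ANY ($1);
EXECUTE by_ids(ARRAY[1, 2, 3]);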



Answer 3:

For MySQL, from the manual:

The number of values in the IN list is only limited by the max_allowed_packet value.
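So for MySQL the practical ceiling is the total statement size. As a quick sketch (not part of the original answer), you can inspect that limit and, with sufficient privileges, raise it:

-- show the current limit in bytes
SHOW VARIABLES LIKE 'max_allowed_packet';
-- raise it server-wide to 64 MB (affects new connections only; requires admin privileges)
SET GLOBAL max_allowed_packet = 67108864;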



Answer 4:

I've used it in dynamic queries created for Postgres with SQLAlchemy, with over 25k parameters. They were also passed to Python functions via positional parameters (*args)... but I didn't notice a slowdown in my case. YMMV



Answer 5:

No, but be careful when using the IN statement. If you use a subquery in your IN clause, performance might be negatively impacted because SQL Server must generate the entire result set and essentially build a potentially large IF statement internally.

For example, something like SELECT * FROM MyTable WHERE MyColumn IN (SELECT myColumn FROM AnotherTable) might be somewhat slow if the subquery returns a large number of rows. It's often more efficient to use EXISTS, as in the sketch below.
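A minimal sketch of that EXISTS rewrite, reusing the hypothetical MyTable / AnotherTable names from the example above:

-- correlated EXISTS instead of IN (subquery)
SELECT *
FROM MyTable t
WHERE EXISTS (
    SELECT 1
    FROM AnotherTable a
    WHERE a.myColumn = t.MyColumn
);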



Answer 6:

I'm pretty sure Postgres has a limit of 1000... I can't find any documentation to support that, though.