How can I handle a > 22 column table with Slick using Scala 2.10?


Question:

I'm new to Scala (using 2.10) and Slick (using 2.0-M2). I see that one of the ways to get around the 22 column limit for tables in Slick is to use nested tuples. I can't figure out how to do that, despite finding this partial code on GitHub.

Current dev branch Scala (2.11-M5) supports case classes with more than 22 elements, but not tuples with arity > 22. And Slick is not yet distributed for Scala 2.11 pre-releases. How can I define a 33 column table (and have it work with all Slick's syntactic sugar)?

N.B., I'm trying to support an existing schema and can't change the table normalization.

Answer 1:

I wrote a blog post that walks through a solution: https://lihaimei.wordpress.com/2016/03/30/slick-1-fix-more-than-22-columns-case/

It includes diagrams with color coding to make the idea quicker to grasp.

To summarize: I group several columns into additional case classes, which does not change the physical columns at all. Then, when the projection maps to a custom type, the groups are combined back into the full row. It is a bit of a hack, but it is an easy way to work around Scala's limit of 22 elements per tuple.
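A hedged sketch of this grouping idea, written against Slick 2.1's lifted embedding with the H2 driver; the table name, column names, and the PartA/PartB/Row case classes are hypothetical, and a real 33-column table would simply have more (or larger) groups:

```scala
import scala.slick.driver.H2Driver.simple._

// Each small case class packages a group of physical columns.
case class PartA(c1: String, c2: String)
case class PartB(c3: String, c4: String)
case class Row(id: Int, a: PartA, b: PartB)

class BigTable(tag: Tag) extends Table[Row](tag, "big_table") {
  def id = column[Int]("id", O.PrimaryKey)
  def c1 = column[String]("c1")
  def c2 = column[String]("c2")
  def c3 = column[String]("c3")
  def c4 = column[String]("c4")

  // Each group maps through its own case class, so no single tuple
  // in any projection exceeds 22 elements.
  def partA = (c1, c2) <> (PartA.tupled, PartA.unapply)
  def partB = (c3, c4) <> (PartB.tupled, PartB.unapply)

  // The outer projection combines the id column with the mapped groups.
  def * = (id, partA, partB) <> (Row.tupled, Row.unapply)
}

// Usage: TableQuery[BigTable] then query as usual; rows come back as Row values.
```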



Answer 2:

The test code you linked to is outdated. If you don't use mappings for your tables, it's straightforward: the type of * corresponds to the return type you get when querying the table, whether it is a single tuple, an HList, or nested tuples. Since Slick 2.1 this works for all operations. (In 2.0 it was not supported for the * projection, so you had to define an alternate projection and override create_*.) See here for an HList example.
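A minimal sketch of the unmapped case using nested tuples (no special imports needed), assuming Slick 2.1 with the H2 driver and a hypothetical five-column table; a 33-column table would just split its columns across more nested tuples:

```scala
import scala.slick.driver.H2Driver.simple._

class Wide(tag: Tag)
    extends Table[((Int, String, String), (String, String))](tag, "wide") {
  def id = column[Int]("id", O.PrimaryKey)
  def c1 = column[String]("c1")
  def c2 = column[String]("c2")
  def c3 = column[String]("c3")
  def c4 = column[String]("c4")

  // The * projection is a nested tuple, so no single tuple reaches the
  // 22-element limit; queries return rows of the same nested shape.
  def * = ((id, c1, c2), (c3, c4))
}

// Usage: TableQuery[Wide] yields rows typed ((Int, String, String), (String, String)).
```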

If you want to map the * projection to a custom type, you also use the <> operator as for a single tuple but you don't get the convenience of the tupled and unapply methods that are automatically generated for case classes, so you have to write the two mapping functions (from the unmapped to the mapped type and back) manually as shown here. Note that Scala 2.11 does not improve this situation. While it allows case classes with more than 22 fields, there are no corresponding Function types for arities > 22, so you still can't use tupled and unapply.
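A hedged sketch of such a mapping with hand-written conversion functions, again assuming Slick 2.1 and hypothetical Person/Name/Contact types and column names:

```scala
import scala.slick.driver.H2Driver.simple._

case class Name(first: String, last: String)
case class Contact(email: String, phone: String)
case class Person(id: Int, name: Name, contact: Contact)

class People(tag: Tag) extends Table[Person](tag, "people") {
  def id    = column[Int]("id", O.PrimaryKey)
  def first = column[String]("first")
  def last  = column[String]("last")
  def email = column[String]("email")
  def phone = column[String]("phone")

  // Unmapped type of the projection below: nested tuples keep each
  // tuple under 22 elements.
  private type Unmapped = (Int, (String, String), (String, String))

  // tupled/unapply can't be generated for arities > 22, so the two
  // mapping functions are written by hand.
  private def construct(u: Unmapped): Person = u match {
    case (i, (f, l), (e, p)) => Person(i, Name(f, l), Contact(e, p))
  }

  private def extract(p: Person): Option[Unmapped] =
    Some((p.id, (p.name.first, p.name.last), (p.contact.email, p.contact.phone)))

  def * = (id, (first, last), (email, phone)) <> (construct, extract)
}
```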

As an alternative to writing these functions, you can define a lifted type corresponding to your mapped type as explained in the manual. This is especially useful when you have nested case classes (of <= 22 fields each) for your mapped type. You only have to define separate mappings for each case class and they will automatically compose when you use them in a * projection (or any other place in a projection or query).
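A hedged sketch of that lifted-type approach, assuming a Slick version that provides CaseClassShape (described in the manual's record-types section); the User/FullName types, table, and column names are hypothetical:

```scala
import scala.slick.driver.H2Driver.simple._
import scala.slick.lifted.CaseClassShape

object Schema {
  // Mapped types and their lifted counterparts (one Column[T] per field).
  case class FullName(first: String, last: String)
  case class LiftedFullName(first: Column[String], last: Column[String])
  implicit object FullNameShape
    extends CaseClassShape(LiftedFullName.tupled, FullName.tupled)

  case class User(id: Int, name: FullName)
  case class LiftedUser(id: Column[Int], name: LiftedFullName)
  implicit object UserShape
    extends CaseClassShape(LiftedUser.tupled, User.tupled)

  class Users(tag: Tag) extends Table[User](tag, "users") {
    def id    = column[Int]("id", O.PrimaryKey)
    def first = column[String]("first")
    def last  = column[String]("last")

    // The nested shapes compose, so the projection maps straight to User
    // without hand-written conversion functions.
    def * = LiftedUser(id, LiftedFullName(first, last))
  }
}
```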