I was wondering if there was an easy way in SQL to convert an integer to its binary representation and then store it as a varchar.
For example, 5 would be converted to "101" and stored as a varchar.
On SQL Server, you can try something like the sample below:
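A minimal sketch of one way to do this in T-SQL (not necessarily the sample the poster had in mind): a WHILE loop peels off the low-order bit and prepends it to the string.

```sql
-- Minimal sketch: build the binary string by repeatedly taking value % 2.
DECLARE @value INT = 5;
DECLARE @binary VARCHAR(32) = '';

WHILE @value > 0
BEGIN
    SET @binary = CAST(@value % 2 AS VARCHAR(1)) + @binary;  -- prepend the lowest bit
    SET @value = @value / 2;                                  -- drop the lowest bit
END

SELECT COALESCE(NULLIF(@binary, ''), '0') AS BinaryString;    -- 5 -> '101', 0 -> '0'
```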
Actually this is REALLY SIMPLE using plain old SQL. Just use bitwise ANDs. I was a bit amazed that there wasn't a simple solution posted online (that didn't involve UDFs). In my case I really wanted to check whether bits were on or off (the data is coming from .NET enums).

Accordingly, here is an example that gives you the bit values and the binary string, both separately and together (the big union is just a hacky way of producing numbers that will work across DBs). The query and the result it produces are shown below.
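A minimal sketch of that bitwise-AND idea (not the original answer's exact query; SQL Server syntax here, with the UNION ALL of literals standing in for a cross-DB numbers table), along with the output it yields for the value 5:

```sql
-- Bit-by-bit view: one row per bit position, flagging whether that bit is set in @value.
DECLARE @value INT = 5;

SELECT
    bits.bit_value,
    CASE WHEN @value & bits.bit_value > 0 THEN 1 ELSE 0 END AS bit_is_set
FROM (
    SELECT 1 AS bit_value UNION ALL SELECT 2 UNION ALL SELECT 4 UNION ALL SELECT 8
    UNION ALL SELECT 16 UNION ALL SELECT 32 UNION ALL SELECT 64 UNION ALL SELECT 128
) AS bits
ORDER BY bits.bit_value;
-- For @value = 5: bit_value 1 and 4 show bit_is_set = 1, the rest show 0.

-- Whole binary string in one go (8 bits, most significant first).
SELECT
    CAST(CASE WHEN @value & 128 > 0 THEN 1 ELSE 0 END AS VARCHAR(1))
  + CAST(CASE WHEN @value &  64 > 0 THEN 1 ELSE 0 END AS VARCHAR(1))
  + CAST(CASE WHEN @value &  32 > 0 THEN 1 ELSE 0 END AS VARCHAR(1))
  + CAST(CASE WHEN @value &  16 > 0 THEN 1 ELSE 0 END AS VARCHAR(1))
  + CAST(CASE WHEN @value &   8 > 0 THEN 1 ELSE 0 END AS VARCHAR(1))
  + CAST(CASE WHEN @value &   4 > 0 THEN 1 ELSE 0 END AS VARCHAR(1))
  + CAST(CASE WHEN @value &   2 > 0 THEN 1 ELSE 0 END AS VARCHAR(1))
  + CAST(CASE WHEN @value &   1 > 0 THEN 1 ELSE 0 END AS VARCHAR(1)) AS binary_string;
-- For @value = 5 this returns '00000101'.
```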
I believe that this method simplifies a lot of the other ideas that others have presented. It uses bitwise arithmetic along with the FOR XML trick and a CTE to generate the binary digits.

Standard SQL (tested in PostgreSQL):
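A minimal sketch in that spirit (not the original answer's query): a recursive CTE peels the low-order bit off the value and prepends it to the string, using only standard SQL constructs that PostgreSQL accepts.

```sql
-- Recursive CTE: each step divides the remaining value by 2 and prepends the bit it drops.
WITH RECURSIVE bits (remaining, binary_string) AS (
    SELECT 5, CAST('' AS VARCHAR)                                   -- value to convert
    UNION ALL
    SELECT remaining / 2,
           CAST(CAST(remaining % 2 AS VARCHAR) || binary_string AS VARCHAR)
    FROM bits
    WHERE remaining > 0
)
SELECT binary_string
FROM bits
WHERE remaining = 0;   -- for 5 this returns '101' (an input of 0 would need special-casing)
```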
Here's a bit of a change to the accepted answer from Sean, since I found it limiting that it only allows a hardcoded number of digits in the output. In my daily use, I find it more useful to either get only up to the highest 1 digit, or to specify how many digits I'm expecting back. It will automatically pad the left with 0s, so that the result lines up to 8, 16, or however many bits you want.
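The function below is only a hypothetical sketch of this variant, not Sean's accepted answer or this poster's exact modification (the name dbo.ToBinary and both parameter names are made up): pass 0 to get digits only up to the highest set bit, or a positive width to get the result left-padded with zeros to that many bits.

```sql
-- Hypothetical sketch: @digits = 0 trims to the highest set bit,
-- @digits > 0 left-pads with zeros to that fixed width.
CREATE FUNCTION dbo.ToBinary (@value INT, @digits INT)
RETURNS VARCHAR(32)
AS
BEGIN
    DECLARE @result VARCHAR(32) = '';

    WHILE @value > 0
    BEGIN
        SET @result = CAST(@value % 2 AS VARCHAR(1)) + @result;       -- prepend lowest bit
        SET @value = @value / 2;
    END

    IF @result = '' SET @result = '0';

    IF @digits > LEN(@result)                                         -- pad on the left if a width was requested
        SET @result = RIGHT(REPLICATE('0', @digits) + @result, @digits);

    RETURN @result;
END;
```

With that sketch, SELECT dbo.ToBinary(5, 0) returns '101' and SELECT dbo.ToBinary(5, 8) returns '00000101'.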