I am trying to write a bash script for testing that takes a parameter and sends it through curl to a web site. I need to URL-encode the value to make sure that special characters are processed properly. What is the best way to do this?
Here is my basic script so far:
    #!/bin/bash
    host=${1:?'bad host'}    # first argument: host (required)
    value=$2                 # second argument: value to send
    shift 2                  # drop host and value; pass the rest through to curl
    curl -v -d "param=${value}" "http://${host}/somepath" "$@"
You can emulate JavaScript's encodeURIComponent in Perl. You can set the command as a bash alias in .bash_profile and then pipe strings into encodeURIComponent; all three steps are sketched below.
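A minimal sketch (the character class is exactly the set encodeURIComponent leaves unescaped; inside the alias, each embedded single quote has to be written as '\''):

    # the Perl command:
    perl -pe 's/([^a-zA-Z0-9_.!~*()'\''-])/sprintf("%%%02X", ord($1))/ge'

    # as an alias in .bash_profile:
    alias encodeURIComponent='perl -pe '\''s/([^a-zA-Z0-9_.!~*()'\''\'\'''\''-])/sprintf("%%%02X", ord($1))/ge'\'''

    # piping into it (echo -n keeps the trailing newline from being encoded):
    echo -n "hello world" | encodeURIComponent    # -> hello%20world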
Here's a one-line conversion using Lua, similar to blueyed's answer except with all the RFC 3986 unreserved characters left unencoded (like this answer).
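A sketch of such a one-liner, reading the string from stdin (assumes a lua binary on PATH; %w plus the escaped -, ., _, ~ is the RFC 3986 unreserved set, and since Lua patterns operate on bytes, each byte of a multibyte character gets its own %XX escape):

    printf %s "hello world" | lua -e 'io.write((io.read("*a"):gsub("[^%w%-%.%_%~]", function(c) return string.format("%%%02X", c:byte()) end)))'
    # -> hello%20world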
Additionally, you may need to ensure that newlines in your string are converted from LF to CRLF, in which case you can insert a gsub("\r?\n", "\r\n") in the chain before the percent-encoding.

Here's a variant that, in the non-standard style of application/x-www-form-urlencoded, does that newline normalization, as well as encoding spaces as '+' instead of '%20' (which could probably be added to the Perl snippet using a similar technique).
The following is based on Orwellophile's answer, but solves the multibyte bug mentioned in the comments by setting LC_ALL=C (a trick from vte.sh). I've written it in the form of a function suitable for PROMPT_COMMAND, because that's how I use it.
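A sketch of what that looks like (the function name is illustrative; with LC_ALL=C the ${1:i:1} expansion yields single bytes, so multibyte characters are percent-encoded byte by byte):

    urlencode() {
        local LC_ALL=C                 # byte-wise string indexing: the multibyte fix
        local i c
        for (( i = 0; i < ${#1}; i++ )); do
            c=${1:i:1}
            case $c in
                [a-zA-Z0-9.~_-]) printf '%s' "$c" ;;   # RFC 3986 unreserved
                *) printf '%%%02X' "'$c" ;;            # "'$c" gives the byte value
            esac
        done
    }

    urlencode "Ünïcode/path"    # each UTF-8 byte becomes its own %XX escape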
Ruby, for completeness
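For instance, with the cgi module from the standard library (CGI.escape produces application/x-www-form-urlencoded output, so a space becomes '+'; ERB::Util.url_encode gives %20 instead):

    ruby -rcgi -e 'puts CGI.escape(ARGV[0])' "hello world"            # -> hello+world
    ruby -rerb -e 'puts ERB::Util.url_encode(ARGV[0])' "hello world"  # -> hello%20world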
Here is my version for the busybox ash shell on an embedded system; I originally adapted Orwellophile's variant:
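A sketch that stays within busybox ash's feature set: there is no bash-style ${var:i:1}, so the first character is peeled off with ${s%"${s#?}"} instead:

    urlencode() {
        local s="$1" c out=''
        while [ -n "$s" ]; do
            c=${s%"${s#?}"}     # first character of s
            s=${s#?}            # remainder of s
            case $c in
                [a-zA-Z0-9.~_-]) out="$out$c" ;;
                *) out="$out$(printf '%%%02X' "'$c")" ;;
            esac
        done
        printf '%s\n' "$out"
    }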
Another option is to use jq.
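For example, with the @uri filter (it percent-encodes everything outside the unreserved set; printf %s keeps a trailing newline from being encoded as %0A):

    printf %s "hello world" | jq -sRr @uri    # -> hello%20world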
-s (--slurp) reads input lines into an array, and -s -R (--slurp --raw-input) reads the input into a single string. -r (--raw-output) outputs the contents of strings instead of JSON string literals.

Or this percent-encodes all bytes:
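One way, shown here with xxd and sed rather than jq itself (every byte of the input, ASCII or not, becomes a %XX escape):

    printf %s "hello world" | xxd -p | tr -d '\n' | sed 's/../%&/g'
    # -> %68%65%6c%6c%6f%20%77%6f%72%6c%64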