This works from my local terminal:
ssh -i ~/.ec2/mykey.pem ubuntu@ec2-yada-yada.amazonaws.com ls
Of course it does. But when I try the same thing using Node.js's child_process.spawn, it complains that the key does not exist / can't be accessed.
// child process
var childProcess = require('child_process').spawn;
// spawn the slave using slaveId as the key
slaves[slaveId] = childProcess('ssh', [
  '-i /mykey.pem',
  'ubuntu@ec2-yada.amazonaws.com',
  'ls'
]);
Result:
stderr: Warning: Identity file /mykey.pem not accessible: No such file or directory.
stderr: Permission denied (publickey).
Things tried:
- Variations on the path to the key:
  - /actual/path/to/mykey.pem
  - mykey.pem (with a copy of the file in the root of the node project)
  - /mykey.pem (with a copy of the file in the root of the node project)
  - ~/.ec2/mykey.pem (where it should be)
- Running the command without the ssh part, i.e. childProcess('ls'); - works.
- chmod 644, 600, 400, etc. on mykey.pem
My only theory at this point is that there's an issue with passing a file reference in, and that I need to do something using the fs module (?). And yes, I know there are libraries out there for SSH access with node, but they use passwords, which won't cut it, and anyway my requirements don't really justify a library.
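To illustrate what I mean, here is a rough, untested sketch of that idea (the host and key path are just placeholders, and each ssh argument is passed as its own array element here):

// rough sketch only -- check the key with fs before spawning ssh
var fs = require('fs');
var path = require('path');
var spawn = require('child_process').spawn;

// ~ is not expanded by spawn, so build the absolute path explicitly
var keyPath = path.join(process.env.HOME, '.ec2', 'mykey.pem');

fs.stat(keyPath, function (err, stats) {
  if (err) {
    // the key isn't even visible from the node process
    console.error('Key not accessible from node:', err);
    return;
  }
  // note: flag and value as separate array elements
  var child = spawn('ssh', ['-i', keyPath, 'ubuntu@ec2-yada.amazonaws.com', 'ls']);
  child.stdout.on('data', function (data) { console.log('stdout: ' + data); });
  child.stderr.on('data', function (data) { console.error('stderr: ' + data); });
});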
Please tell me I'm being stupid and that this is possible.
UPDATE:
OK, so I can use the exec command like this:
var childProcess = require('child_process').exec;
slaves[slaveId] = childProcess('ssh -i mykey.pem ubuntu@ec2-yada.amazonaws.com ls', function (error, stdout, stderr) {...});
Still, I feel like I've been downgraded from creating a true slave using fork, with all its nice messaging and handy properties (my original implementation, which runs fine locally), to having a vacuum cleaner and being told to do all the work myself (now that I want to launch slaves on remote hosts).
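For reference, this is roughly the fork-with-messaging pattern I mean (a minimal sketch; slave.js and the message shape are made up):

// master.js -- fork a local slave and talk to it over the IPC channel
var fork = require('child_process').fork;

var slave = fork('./slave.js');
slave.on('message', function (msg) {
  console.log('from slave:', msg);
});
slave.send({ cmd: 'ls' });

// slave.js -- receive work from the master and report back
process.on('message', function (msg) {
  // do the work here, then send the result back up
  process.send({ done: true, cmd: msg.cmd });
});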