I have created an AWS Lambda Layer using the following command:
aws lambda publish-layer-version --layer-name TensorflowLambdaLayer --compatible-runtimes go1.x --zip-file fileb://tensorflowLayer.zip
Here is the ARN that was generated: `arn:aws:lambda:us-east-1:757767972066:layer:TensorflowLambdaLayer:1`
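In case it is relevant, this is how the layer zip can be inspected. I am assuming the shared objects have to sit under a lib/ prefix so that they end up in /opt/lib once the layer is extracted (Lambda puts /opt/lib on the runtime's LD_LIBRARY_PATH); treat the listed layout as my assumption of what got packaged, not a verified fact:

unzip -l tensorflowLayer.zip
# assumed layout:
#   lib/libtensorflow.so
#   lib/libtensorflow_framework.so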
When I try to run my Lambda function, which uses the TensorFlow library, via AWS SAM, it gets stuck on the "mounting" step:
2019-07-18 15:51:29 Mounting /tmp/tmpgz8cb80s as /var/task:ro,delegated inside runtime container
Once I terminate it with Ctrl + C, I get the following message:
^C/var/task/bin/inference: error while loading shared libraries: libtensorflow.so: cannot open shared object file: No such file or directory
Makefile:82: recipe for target 'run-inference' failed
Here is the crucial part of my template.yml:
Parameters:
  LambdaTensorflowLayerArn:
    Type: String
    Default: 'arn:aws:lambda:us-east-1:757767972066:layer:TensorflowLambdaLayer:1'
  LambdaFFMPEGLayerArn:
    Type: String
    Default: 'arn:aws:lambda:us-east-1:757767972066:layer:ffmpeg:1'

Inference:
  Type: 'AWS::Serverless::Function'
  Properties:
    Handler: bin/inference
    Runtime: go1.x
    Timeout: 300
    CodeUri: ./bin/inference.zip
    Layers:
      - Ref: LambdaFFMPEGLayerArn
      - Ref: LambdaTensorflowLayerArn
I am not sure what might be causing this.
I was having the same issue with that message.
In my case, I was trying to use TensorFlow with Go. The problem is that the official installation of libtensorflow.so (and libtensorflow_framework.so) doesn't seem to work if the package is left in /usr/local (the recommended location) or in any other path. Running ldconfig as suggested for /usr/local doesn't help either. Even the basic gcc example doesn't work (not even with -L, which surprised me) until LD_LIBRARY_PATH is set.
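Roughly, the exports that make the gcc example link and run look like this; the paths assume the default /usr/local/lib install location from the TensorFlow C library instructions, so adjust them if the package was unpacked elsewhere:

# assumes libtensorflow.so and libtensorflow_framework.so are in /usr/local/lib
export LIBRARY_PATH=$LIBRARY_PATH:/usr/local/lib
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib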
This is the content of /usr/local in my case
The other solution was to manually create the symlinks in /usr/lib. I don't know how to do that in a serverless config. I was going to write this as a comment but I still don't have enough rep.
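For what it's worth, the manual symlinks I mean are along these lines; this is only a sketch that assumes the libraries were unpacked to the default /usr/local/lib location:

# assumes the TensorFlow C libraries live in /usr/local/lib
sudo ln -s /usr/local/lib/libtensorflow.so /usr/lib/libtensorflow.so
sudo ln -s /usr/local/lib/libtensorflow_framework.so /usr/lib/libtensorflow_framework.so
sudo ldconfig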