Public URLs For Objects In Bluemix Object Storage

Posted 2019-01-15 03:59

Question:

I would like to upload a number of photos to the Bluemix Object Storage service and then display them in a web app. Right now a GET request to a photo in the object storage container requires an auth token. Is there any way I can create a public URL to the object that would not require an auth token for a GET request?

I see there is an option to create temporary URLs to objects, but I don't want the URL to be temporary; I want it to live forever. Is the only option to create a long-lived temporary URL?

Answer 1:

The correct way to do this is to modify the container ACL. You cannot currently do this via the Bluemix UI, but you can with the Swift REST API. For example, to change the container ACL so that anyone can read objects in the container, issue the following PUT request:

curl -X PUT "https://dal.objectstorage.open.softlayer.com/v1/AUTH_123/mycontainer" \
    -H "X-Auth-Token: token123" \
    -H "X-Container-Read: .r:*"


Answer 2:

I know this is an old post, but with the help of Ryan Baxter and the IBM Object Storage documentation I was able to resolve the issue. In the end, these two commands saved the day.

First, use the swift CLI to change the container's access control:

swift post container-name --read-acl ".r:*,.rlistings"

Next, using curl, set the same read ACL on the container at your access point:

curl -X POST "https://<access point>/<version>/AUTH_projectID/container-name" \
    -H "X-Auth-Token: <auth token>" \
    -H "X-Container-Read: .r:*,.rlistings"

I'm also very grateful for the help provided by Alex da Silva.
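
If you would rather make this ACL change from code instead of the CLI, the python-swiftclient package can do the same thing. A rough sketch, assuming Keystone v3 auth; the credentials, project ID, region, and container name are placeholders you would replace with your own:

from swiftclient.client import Connection

# Placeholders: substitute your own credentials, project and region
conn = Connection(
    authurl="https://identity.open.softlayer.com/v3",
    user="user123",
    key="password123",
    os_options={"project_id": "projectID", "region_name": "dallas"},
    auth_version="3",
)

# Equivalent of: swift post container-name --read-acl ".r:*,.rlistings"
conn.post_container("container-name", headers={"X-Container-Read": ".r:*,.rlistings"})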



Answer 3:

Bluemix now has an S3 endpoint capability. You can use curl or any other language. For example, here is a boto3 function that will upload an object, make it public, and add some metadata. (The function reads a JSON file in which you store the credentials, and it takes three variables that are used in the global app: currentdirpath, ImagesToS3, ImageName.)

import json
import time
import datetime

import boto3
from botocore.utils import fix_s3_host


def UploadImageDansBucket(currentdirpath, ImagesToS3, ImageName):
    # Uploads the file at ImagesToS3 + ImageName to the bucket and makes it publicly readable.
    # currentdirpath is kept from the original signature but is not used here.
    with open("credentials.json", 'r') as f:
        data = json.loads(f.read())
        bucket_target = data["aws"]["targetBucket"]
        print('Open connection to the bucket in the cloud..........')

        s3ressource = boto3.resource(
            service_name='s3',
            endpoint_url=data["aws"]["hostEndPoint"],
            aws_access_key_id=data["aws"]["idKey"],
            aws_secret_access_key=data["aws"]["secretKey"],
            use_ssl=True,
        )
        # Stop boto3 from rewriting the custom endpoint to the default AWS host
        s3ressource.meta.client.meta.events.unregister('before-sign.s3', fix_s3_host)

        # Optional test objects to check that the connection works
        s3ressource.Object(bucket_target, 'hello.txt').put(Body=b"I'm a test file")
        s3ressource.Object(bucket_target, 'bin.txt').put(Body=b"0123456789abcdef" * 10000)

        fn = "%s%s" % (ImagesToS3, ImageName)
        # s3ressource.Bucket(bucket_target).put_object(Key=fn, Body=open(fn, 'rb'))
        now = datetime.datetime.now()             # get the current date
        timestamp = time.mktime(now.timetuple())  # convert it to a timestamp
        timestampstr = str(timestamp)
        s3ressource.Bucket(bucket_target).upload_file(
            fn, ImageName,
            ExtraArgs={
                "ACL": "public-read",
                "Metadata": {"METADATA1": "a", "METADATA2": "b",
                             "METADATA3": "c", "timestamp": timestampstr},
            },
        )
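
The function can then be called with the values from the app, for example (the paths and object name below are the author's placeholders):

UploadImageDansBucket(
    currentdirpath='path/to/your/dir/current',
    ImagesToS3='/path/of/your/object/',
    ImageName='Objectname',
)

Because the upload sets "ACL": "public-read", the resulting object should be readable at <S3 endpoint>/<bucket>/Objectname without any credentials.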