Put JSON data pipeline definition using Boto3

Published 2019-07-23 15:42

Question:

I have a data pipeline definition in JSON format, and I would like to 'put' it using Boto3 in Python.

I know you can do this via the AWS CLI using put-pipeline-definition, but Boto3 (and the AWS API) use a different format, splitting the definition into pipelineObjects, parameterObjects and parameterValues.
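To make the difference concrete, here is a tiny illustrative object shown in both shapes (the object itself is made up for illustration):

```python
# Exported / CLI format: one flat dict per object. References to other
# pipeline objects are nested dicts with a "ref" key.
exported_object = {
    "id": "Default",
    "name": "Default",
    "scheduleType": "ondemand",
    "schedule": {"ref": "DefaultSchedule"},  # reference to another object
}

# Low-level API / boto3 format: "id" and "name" stay top-level, every other
# key becomes an entry in a "fields" list, with references under "refValue"
# and plain values under "stringValue".
api_object = {
    "id": "Default",
    "name": "Default",
    "fields": [
        {"key": "scheduleType", "stringValue": "ondemand"},
        {"key": "schedule", "refValue": "DefaultSchedule"},
    ],
}
```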

Do I need to write code to translate from a JSON definition to the format expected by the API/Boto3? If so, is there a library that does this?

Answer 1:

The AWS CLI has code that does this translation, so I can borrow that!



Answer 2:

You could convert from the Data Pipeline exported JSON format to the pipelineObjects format expected by boto3 using a Python function of the following form.

def convert_to_pipeline_objects(pipeline_definition_dict):
    """Convert an exported pipeline definition dict into the
    pipelineObjects list expected by boto3's put_pipeline_definition."""
    objects_list = []
    for def_object in pipeline_definition_dict['objects']:
        new_object = {
            'id': def_object['id'],
            'name': def_object['name'],
            'fields': []
        }
        for key, value in def_object.items():
            if key in ('id', 'name'):
                continue
            if isinstance(value, dict):
                # Nested dicts are references to other pipeline objects.
                new_object['fields'].append(
                    {
                        'key': key,
                        'refValue': value['ref']
                    }
                )
            else:
                # Plain values are passed as strings.
                new_object['fields'].append(
                    {
                        'key': key,
                        'stringValue': value
                    }
                )
        objects_list.append(new_object)
    return objects_list
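A quick standalone check of the conversion, using a tiny made-up definition (the conversion logic is inlined so the snippet runs on its own, and the pipeline id in the boto3 call is hypothetical):

```python
# Same conversion logic as the function above, inlined for a standalone demo.
def convert_to_pipeline_objects(pipeline_definition_dict):
    objects_list = []
    for def_object in pipeline_definition_dict["objects"]:
        fields = []
        for key, value in def_object.items():
            if key in ("id", "name"):
                continue
            if isinstance(value, dict):
                fields.append({"key": key, "refValue": value["ref"]})
            else:
                fields.append({"key": key, "stringValue": value})
        objects_list.append(
            {"id": def_object["id"], "name": def_object["name"],
             "fields": fields}
        )
    return objects_list

# Illustrative definition in the exported JSON format.
sample_definition = {
    "objects": [
        {"id": "Default", "name": "Default",
         "scheduleType": "ondemand",
         "schedule": {"ref": "DefaultSchedule"}},
    ]
}

converted = convert_to_pipeline_objects(sample_definition)

# The result plugs straight into boto3 (needs AWS credentials to run):
# import boto3
# client = boto3.client("datapipeline")
# client.put_pipeline_definition(pipelineId="df-...",  # your pipeline id
#                                pipelineObjects=converted)
```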