I have a CSV file with 2 columns and 20,000 rows that I would like to import into Google Cloud Datastore. I'm new to Google Cloud and NoSQL databases. I have tried using Dataflow, but it requires a JavaScript UDF function name. Does anyone have an example of this? I will be querying this data once it's in Datastore. Any advice or guidance on how to do this would be appreciated.
Simple in Python, but this can easily be adapted to other languages. Use the split() method to loop through the lines and comma-separated values:
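A minimal sketch of that approach (the file name data.csv is a placeholder, and splitting on commas assumes none of the values contain embedded commas; Python's built-in csv module handles quoting if they do):

```python
# Parse a two-column CSV by hand with split().
with open('data.csv') as f:  # placeholder file name
    for line in f:
        col1, col2 = line.rstrip('\n').split(',')  # assumes no quoted commas
        print(col1, col2)  # do something with each row here
```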
Using Apache Beam, you can read a CSV file using the TextIO class; see the TextIO documentation.

Next, apply a transform that parses each row of the CSV file and returns an Entity object. Depending on how you want to store each row, construct the appropriate Entity object. This page has an example of how to create an Entity object.

Lastly, write the Entity objects to Cloud Datastore; see the DatastoreIO documentation.
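Putting those three steps together, here is a minimal sketch using the Beam Python SDK, where TextIO's read is exposed as ReadFromText and the Datastore sink as WriteToDatastore from the v1new datastoreio module. The project ID, bucket path, kind name CsvRow, and property names col1/col2 are all placeholder assumptions, and the parser again assumes no embedded commas:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.io.gcp.datastore.v1new.datastoreio import WriteToDatastore
from apache_beam.io.gcp.datastore.v1new.types import Entity, Key

PROJECT = 'my-gcp-project'  # placeholder: your GCP project ID

def to_entity(line):
    """Turn one CSV line into a Datastore Entity."""
    col1, col2 = line.split(',')  # assumes no quoted/embedded commas
    # Use the first column as the key name so every entity has a
    # complete, deterministic key.
    key = Key(['CsvRow', col1], project=PROJECT)
    entity = Entity(key)
    entity.set_properties({'col1': col1, 'col2': col2})
    return entity

def run():
    options = PipelineOptions(project=PROJECT)
    with beam.Pipeline(options=options) as p:
        (p
         # Add skip_header_lines=1 if the file has a header row.
         | 'ReadCSV' >> beam.io.ReadFromText('gs://my-bucket/data.csv')
         | 'ParseToEntity' >> beam.Map(to_entity)
         | 'WriteToDatastore' >> WriteToDatastore(PROJECT))

if __name__ == '__main__':
    run()
```

Keying entities on the first column also makes the import idempotent: rerunning the pipeline overwrites the same 20,000 entities instead of creating duplicates.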