I'm writing a long-running task which fetches from MongoDB (using mgo) multiple times, writes the result to an xlsx file using this module, reads that file back with os.Open, and then stores it on my FTP server.
The Stor function consumes a lot of memory, so I think there should be a way to skip saving the file and pass my data from xlsx.Write to ftp.Stor directly. (If I could stream simultaneously it would be perfect, because then I wouldn't have to hold all of my documents in the server's memory before sending them to Stor.)
These are the prototypes of the functions:
func (f *File) Write(writer io.Writer) (err error)
xlsx
func (ftp *FTP) Stor(path string, r io.Reader) (err error)
ftp
You want to use io.Pipe. You can do:
reader, writer := io.Pipe()
errChan := make(chan error)

// Stor reads from the pipe until EOF, so it must run concurrently
// with the write.
go func() {
    errChan <- myFTP.Stor(path, reader)
}()

err := myXLS.Write(writer)
// handle err

// Close the write end so Stor sees EOF and returns; a non-nil err
// is propagated to the reader.
writer.CloseWithError(err)

err = <-errChan
// handle err
Note that xlsx.Write only receives an io.Writer, so it cannot close the pipe itself; you must close the write end yourself or Stor will block forever waiting for EOF. CloseWithError(err) also propagates a write failure to the reader, and CloseWithError(nil) behaves like a plain Close.
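Put together as a single helper, a minimal sketch (the function name is illustrative; the *File and *FTP types are taken from the prototypes in the question):

// streamXLSX writes the workbook into an FTP upload through an
// io.Pipe, so the file is never fully buffered in memory.
func streamXLSX(f *xlsx.File, client *FTP, path string) error {
    reader, writer := io.Pipe()
    errChan := make(chan error, 1) // buffered so the goroutine never leaks

    go func() {
        errChan <- client.Stor(path, reader)
    }()

    // Close the pipe with Write's result: nil unblocks Stor with EOF,
    // a non-nil error surfaces on the reader side.
    writeErr := f.Write(writer)
    writer.CloseWithError(writeErr)

    if storErr := <-errChan; storErr != nil {
        return storErr
    }
    return writeErr
}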
Alternatively, you can use a bytes.Buffer:
func uploadFileToQiniu(file *xlsx.File) (key string, err error) {
    // Generate a unique object key for the upload.
    key = fmt.Sprintf("%s.xlsx", util.SerialNumber())
    log.Debugf("file key is %s", key)

    // Write the whole workbook into an in-memory buffer.
    log.Debug("start to write file to a writer")
    buf := new(bytes.Buffer)
    err = file.Write(buf)
    if err != nil {
        log.Errorf("error caught when writing file: %v", err)
        return
    }

    size := int64(buf.Len())
    log.Debugf("file size is %d", size)

    // bytes.Buffer implements io.Reader, so it can be handed
    // straight to the uploader.
    err = Put(key, size, buf)
    if err != nil {
        log.Errorf("error caught when uploading file: %v", err)
    }
    return key, err // propagate the upload error instead of swallowing it
}
where Put has the following signature:

func Put(key string, size int64, reader io.Reader) error
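The same buffered approach applies to the FTP case from the question, at the cost of holding the whole workbook in memory before the upload starts (a sketch; myXLS, myFTP, and path are the names used above):

buf := new(bytes.Buffer)
if err := myXLS.Write(buf); err != nil {
    return err
}
// bytes.Buffer implements io.Reader, so Stor can consume it directly.
if err := myFTP.Stor(path, buf); err != nil {
    return err
}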