golang scp file using crypto/ssh

Posted 2019-07-09 03:13

Question:

I'm trying to download a remote file over SSH. The following approach works fine in a shell:

ssh hostname "tar cz /opt/local/folder" > folder.tar.gz

However, the same approach in Go produces an output artifact of a slightly different size. For example, the same folder archived with the plain shell command produces a 179 B gz file, while the Go script produces 178 B. I assume something is missing from the io.Reader, or the session is closed too early. I'd appreciate your help.

Here is an example of my script:

func executeCmd(cmd, hostname string, config *ssh.ClientConfig, path string) error {
    conn, _ := ssh.Dial("tcp", hostname+":22", config)
    session, err := conn.NewSession()
    if err != nil {
        panic("Failed to create session: " + err.Error())
    }

    r, _ := session.StdoutPipe()
    scanner := bufio.NewScanner(r)

    go func() {
        defer session.Close()

        name := fmt.Sprintf("%s/backup_folder_%v.tar.gz", path, time.Now().Unix())
        file, err := os.OpenFile(name, os.O_APPEND|os.O_WRONLY|os.O_CREATE, 0644)
        if err != nil {
            panic(err)
        }
        defer file.Close()
        for scanner.Scan() {
            fmt.Println(scanner.Bytes())
            if err := scanner.Err(); err != nil {
                fmt.Println(err)
            }

            if _, err = file.Write(scanner.Bytes()); err != nil {
                log.Fatal(err)

            }
        }
    }()

    if err := session.Run(cmd); err != nil {
        fmt.Println(err.Error())
        panic("Failed to run: " + err.Error())
    }

    return nil
}

Thanks!

Answer 1:

bufio.Scanner is for newline-delimited text. According to the documentation, the scanner removes the newline delimiters, stripping any bytes with value 10 ('\n') out of your binary stream. That is exactly the kind of off-by-a-few-bytes difference you are seeing.
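A minimal, self-contained illustration of that byte loss (not from the original question):

package main

import (
    "bufio"
    "bytes"
    "fmt"
)

func main() {
    // Five bytes of "binary" data that happen to contain a newline byte (10).
    in := []byte{1, 2, 10, 3, 4}

    scanner := bufio.NewScanner(bytes.NewReader(in))
    var out []byte
    for scanner.Scan() {
        // Bytes() returns each token with the trailing newline stripped.
        out = append(out, scanner.Bytes()...)
    }
    fmt.Println(len(in), len(out)) // prints "5 4" -- one byte lost, like 179 B vs 178 B
}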

You don't need a goroutine to do the copy, because you can use session.Start to start the process asynchronously.

You probably don't need bufio either. Use io.Copy to copy the file; it has its own internal buffer, on top of any buffering already done inside the ssh client itself. If an additional buffer is needed for performance, wrap the session output in a bufio.Reader.
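If you do add that buffer, it could look like this (a sketch; the 1 MiB size is an arbitrary example, not a recommendation):

// Optional: put a larger read buffer in front of the ssh stdout stream.
br := bufio.NewReaderSize(r, 1<<20) // 1 MiB; tune this or drop it entirely
if _, err := io.Copy(file, br); err != nil {
    return err
}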

Finally, you return an error value, so use it rather than panicking on ordinary error conditions.

conn, err := ssh.Dial("tcp", hostname+":22", config)
if err != nil {
    return err
}

session, err := conn.NewSession()
if err != nil {
    return err
}
defer session.Close()

r, err := session.StdoutPipe()
if err != nil {
    return err
}

name := fmt.Sprintf("%s/backup_folder_%v.tar.gz", path, time.Now().Unix())
file, err := os.OpenFile(name, os.O_APPEND|os.O_WRONLY|os.O_CREATE, 0644)
if err != nil {
    return err
}
defer file.Close()

if err := session.Start(cmd); err != nil {
    return err
}

if _, err := io.Copy(file, r); err != nil {
    return err
}

if err := session.Wait(); err != nil {
    return err
}

return nil
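For completeness, a sketch of how executeCmd might be called; the user, password, and host key policy below are placeholders (in particular, ssh.InsecureIgnoreHostKey skips host key verification and is for illustration only):

config := &ssh.ClientConfig{
    User: "backup",
    Auth: []ssh.AuthMethod{
        ssh.Password("secret"), // placeholder; key-based auth is usually preferable
    },
    HostKeyCallback: ssh.InsecureIgnoreHostKey(), // illustration only
}

if err := executeCmd("tar cz /opt/local/folder", "hostname", config, "/tmp"); err != nil {
    log.Fatal(err)
}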


Answer 2:

You can try doing something like this:

r, _ := session.StdoutPipe()
reader := bufio.NewReader(r)

go func() {
    defer session.Close()
    // open file etc

    // 10 is the number of bytes you'd like to copy in one write operation
    p := make([]byte, 10)
    for {
        n, err := reader.Read(p)
        if n > 0 {
            // Write whatever was read before checking err: a Read may return
            // data together with io.EOF.
            if _, werr := file.Write(p[:n]); werr != nil {
                log.Fatal(werr)
            }
        }
        if err == io.EOF {
            break
        }
        if err != nil {
            log.Fatal("err", err)
        }
    }
}()

Make sure your goroutines are synchronized properly so the output is completely written to the file, as in the sketch below.
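One way to do that, sketched here assuming the same session, reader, and file as above: run the copy in the goroutine, send its result on a channel, and receive from that channel after session.Run returns.

done := make(chan error, 1)

go func() {
    // Run the read/write loop from above here (io.Copy stands in for it) and
    // report the final error so the caller knows the copy has finished.
    _, err := io.Copy(file, reader)
    done <- err
}()

if err := session.Run(cmd); err != nil {
    log.Fatal(err)
}
if err := <-done; err != nil { // don't finish before stdout is fully drained
    log.Fatal(err)
}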



Tags: go ssh scp