Golang RESTful API load testing causing too many database connections

Posted 2019-05-10 01:06

Question:

I think I am having a serious issue managing the database connection pool in Golang. I built a RESTful API using the Gorilla web toolkit, which works great when only a few requests are sent to the server. But now I have started performing load testing using the loader.io site. I apologize for the long post, but I wanted to give you the full picture.

Before going further, here is some info on the server running the API and MySQL:

  - Dedicated hosting, Linux, 8GB RAM
  - Go version 1.1.1
  - Database connectivity using go-sql-driver
  - MySQL 5.1

Using loader.io I can send 1000 GET requests/15 seconds without problems. But when I send 1000 POST requests/15 seconds I get lots of errors, all of which are ERROR 1040: too many database connections. Many people have reported similar issues online. Note that I am only testing one specific POST request for now. For this POST request I ensured the following (which was also suggested by many others online):

  1. I made sure not to Open and Close *sql.DB in short-lived functions. Instead I created a single global variable for the connection pool, as you can see in the code below, although I am open to suggestions here because I do not like using global variables.

  2. I made sure to use db.Exec when possible and only use db.Query and db.QueryRow when results are expected.

Since the above did not solve my problem, I tried setting db.SetMaxIdleConns(1000), which solved the problem for 1000 POST requests/15 seconds, meaning no more 1040 errors. Then I increased the load to 2000 POST requests/15 seconds and started getting ERROR 1040 again. Increasing the value passed to db.SetMaxIdleConns() did not make a difference.

Here are some connection statistics from the MySQL database, obtained by running SHOW STATUS WHERE variable_name = 'Threads_connected';

For 1000 POST requests/15 seconds: observed #threads_connected ~= 100
For 2000 POST requests/15 seconds: observed #threads_connected ~= 600

I also increased the maximum connections for MySQL in my.cnf, but that did not make a difference. What do you suggest? Does the code look fine? If yes, then the connections are probably just limited.

You will find a simplified version of the code below.

var db *sql.DB

func main() {
    db = DbConnect()
    db.SetMaxIdleConns(1000)

    http.Handle("/", r)
    err := http.ListenAndServe(fmt.Sprintf("%s:%s", API_HOST, API_PORT), nil)

    if err != nil {
       fmt.Println(err)
    }
}

func DbConnect() *sql.DB {
    db, err := sql.Open("mysql", connectionString)
    if err != nil {
        fmt.Printf("Connection error: %s\n", err.Error())
        return nil
    }
    return db
}
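
// Note: sql.Open only validates its arguments; it does not actually open a
// connection. Calling db.Ping() after DbConnect() (e.g. in main) would surface
// bad credentials or an unreachable server at startup rather than on the
// first query.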

func PostBounce(w http.ResponseWriter, r *http.Request) {
    userId, err := AuthRequest(r)

    //error checking
    //read request body and use json.Unmarshal

    bounceId, err := CreateBounce(userId, b)

    //return HTTP status code here
}

func AuthRequest(r *http.Request) (id int, err error) {
    //parse header and get username and password

    query := "SELECT Id FROM Users WHERE Username=? AND Password=PASSWORD(?)"
    err = db.QueryRow(query, username, password).Scan(&id)

    //error checking and return
}

func CreateBounce(userId int, bounce NewBounce) (bounceId int64, err error) {
    //initialize some variables
    query := "INSERT INTO Bounces (.....) VALUES (?, ?, ?, ?, ?, ?, ?, ?)"
    result, err := db.Exec(query, ......)

    //error checking

    bounceId,_ = result.LastInsertId()

    //return 
}

Answer 1:

Go's database/sql doesn't prevent you from creating an infinite number of connections to the database. If there is an idle connection in the pool, it will be used; otherwise a new connection is created.

So, under load, your request handlers' sql.DB is probably finding no idle connections, and new connections are created as needed. This churns for a bit (reusing idle connections when possible and creating new ones when needed), ultimately reaching the maximum number of connections for the DB. And, unfortunately, in Go 1.1 there isn't a convenient way (e.g. SetMaxOpenConns) to limit open connections.

Upgrade to a newer version of Go. In Go 1.2+ you get SetMaxOpenConns. Also check out the MySQL docs for a starting setting, and then tune.

db.SetMaxOpenConns(100) //tune this
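
For example, dropped into the main() from the question (assuming Go 1.2+; the numbers are only illustrative and need tuning for your workload), the pool limits might look like this. Note that database/sql reduces MaxIdleConns to match a smaller MaxOpenConns, so the two are usually set together:

    db = DbConnect()

    // Cap the total number of open connections well below MySQL's max_connections.
    db.SetMaxOpenConns(100)

    // Keep a portion of those connections idle for reuse between requests.
    db.SetMaxIdleConns(50)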

If you must use Go 1.1, you'll need to ensure in your code that the *sql.DB is only used by N clients at a time.
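
One way to do that (a minimal sketch; the limit of 50 and the withDB helper are illustrative, not part of the original code) is to use a buffered channel as a counting semaphore around every database call:

    var dbSem = make(chan struct{}, 50) // at most 50 goroutines may use db at once

    // withDB acquires a slot before touching the pool and releases it when done.
    func withDB(f func(*sql.DB) error) error {
        dbSem <- struct{}{}        // acquire
        defer func() { <-dbSem }() // release
        return f(db)
    }

AuthRequest and CreateBounce would then run their queries inside withDB, so no more than 50 connections are ever in use at once.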



Answer 2:

@MattSelf's proposed solution is correct, but I ran into other issues. Here is exactly what I did to solve the problem (by the way, the server is running CentOS).

  1. Since I have a dedicated server, I increased max_connections for MySQL.

In /etc/my.cnf I added the line max_connections=10000, although that is more connections than I need.

  2. Restart MySQL: service mysql restart

  3. Changed ulimit -n, to increase the number of file descriptors that can be open.

To do that I made changes to two files:

In /etc/sysctl.conf I added the line

fs.file-max = 65536

In /etc/security/limits.conf I added the following lines:

*          soft     nproc          65535
*          hard     nproc          65535
*          soft     nofile         65535
*          hard     nofile         65535
  4. Reboot your server

  5. Upgraded Go to 1.3.3 as suggested by @MattSelf

  6. Set

      db.SetMaxOpenConns(10000)
    

Again, the number is much larger than what I need, but this proved to me that things worked.

  7. I ran a test using loader.io consisting of 5000 clients, each sending a POST request, all within 15 seconds. All went through without errors.
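
As a sanity check, connection-pool statistics can also be logged while a load test runs. This sketch uses db.Stats(), which is only available in Go releases newer than the 1.3.3 used above, and assumes the global db from the question plus the "log" and "time" imports:

    // Periodically log pool statistics during the load test.
    go func() {
        for range time.Tick(5 * time.Second) {
            s := db.Stats()
            log.Printf("open=%d in-use=%d idle=%d waits=%d",
                s.OpenConnections, s.InUse, s.Idle, s.WaitCount)
        }
    }()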


Answer 3:

Something else to note: set back_log to a higher value in your my.cnf file, something like a few hundred or 1000. This will help handle more connections per second. See High connections per second.
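
For example (the value below is only illustrative; back_log is not a dynamic variable, so MySQL needs a restart for the change to take effect), in /etc/my.cnf under the [mysqld] section:

back_log = 500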