Is a DataReader quicker than a DataSet when populating a DataTable?

Asked 2019-01-06 22:13

Which of these would be quicker:

1) Looping through a DataReader and building a DataTable by adding the rows and columns myself (see the sketch below), or

2) Creating a DataAdapter object and just calling .Fill() on a DataTable?

Does the performance advantage of a DataReader still hold when the DataTable is created dynamically?
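
For clarity, option #1 would look roughly like this (just a sketch; assume conn is an open SqlConnection and sql holds the query text):

DataTable dt = new DataTable();

using (SqlCommand cmd = new SqlCommand(sql, conn))
using (SqlDataReader rdr = cmd.ExecuteReader())
{
    // Build the columns dynamically from the reader's schema
    for (int i = 0; i < rdr.FieldCount; i++)
        dt.Columns.Add(rdr.GetName(i), rdr.GetFieldType(i));

    // Copy each record into a new row
    while (rdr.Read())
    {
        DataRow row = dt.NewRow();
        for (int i = 0; i < rdr.FieldCount; i++)
            row[i] = rdr[i];
        dt.Rows.Add(row);
    }
}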

8 answers
女痞 · 2019-01-06 22:42

As with many questions like this, the answer is: it depends.

If you don't know the structure of your data up front and are creating TableAdapters on the fly, then the dynamic DataTable would be more efficient. There is a good deal of code generation involved in creating a TableAdapter.

However, if you know the structure of your data up front, then the question becomes: how much functionality do I need?

If you need a full CRUD implementation then there are some efficiencies gained by using a TableAdapter rather than writing all that CRUD code yourself. Also, the TableAdapter implementation is OK (not great). If you need something more efficient then you may be better off using nHibernate or some other ORM.

If you don't need a full CRUD implementation (i.e., this is a read-only solution) and know your data structure up front, then you'll have to test the efficiency of a TableAdapter read-only implementation against a dynamically generated DataTable. If I were a betting man I'd put my money on the TableAdapter implementation since you bind data once and read it multiple times.

一夜七次 · 2019-01-06 22:44

A DataReader's Read is a forward-only, one-row-at-a-time approach: it reads data sequentially, so you get each record as soon as it is read while the connection stays open. That makes it the best option for memory and performance.

That said, between the two approaches you describe, I find IDataAdapter.Fill much faster than DataTable.Load. Of course that depends on the implementation. Here is a benchmark comparing the two:

public DataTable Read1<T>(string query) where T : IDbConnection, new()
{
    using (var conn = new T())
    {
        using (var cmd = conn.CreateCommand())
        {
            cmd.CommandText = query;
            cmd.Connection.ConnectionString = _connectionString;
            cmd.Connection.Open();
            var table = new DataTable();
            // DataTable.Load infers the schema and pulls the rows from the open reader
            table.Load(cmd.ExecuteReader());
            return table;
        }
    }
}

public DataTable Read2<S, T>(string query) where S : IDbConnection, new() 
                                           where T : IDbDataAdapter, IDisposable, new()
{
    using (var conn = new S())
    {
        using (var da = new T())
        {
            using (da.SelectCommand = conn.CreateCommand())
            {
                da.SelectCommand.CommandText = query;
                da.SelectCommand.Connection.ConnectionString = _connectionString;
                DataSet ds = new DataSet(); //conn is opened by dataadapter
                da.Fill(ds);
                return ds.Tables[0];
            }
        }
    }
}

The second approach always outperformed the first.

Stopwatch sw = Stopwatch.StartNew();
DataTable dt = null;
for (int i = 0; i < 100; i++)
{
    dt = Read1<MySqlConnection>(query); // ~9800ms
    dt = Read2<MySqlConnection, MySqlDataAdapter>(query); // ~2300ms

    dt = Read1<SQLiteConnection>(query); // ~4000ms
    dt = Read2<SQLiteConnection, SQLiteDataAdapter>(query); // ~2000ms

    dt = Read1<SqlCeConnection>(query); // ~5700ms
    dt = Read2<SqlCeConnection, SqlCeDataAdapter>(query); // ~5700ms

    dt = Read1<SqlConnection>(query); // ~850ms
    dt = Read2<SqlConnection, SqlDataAdapter>(query); // ~600ms

    dt = Read1<VistaDBConnection>(query); // ~3900ms
    dt = Read2<VistaDBConnection, VistaDBDataAdapter>(query); // ~3700ms
}
sw.Stop();
MessageBox.Show(sw.Elapsed.TotalMilliseconds.ToString());

Read1 is easier on the eyes, but the data adapter performs better (don't read this as one database outperforming another; the queries were all different). The size of the difference did depend on the query, though. A likely reason, going by the documentation, is that Load has to check various constraints row by row as it adds rows (it's a method on DataTable), while Fill lives on DataAdapters, which were designed for exactly that: fast creation of DataTables.

The star · 2019-01-06 22:47

I can't speak to filling a DataTable per se, but using a DataReader is the most efficient reading method.

成全新的幸福 · 2019-01-06 22:51

Your option #1 would be slower. However, there's a better way to convert a DataReader to a DataTable than adding the rows by hand:

DataTable dt = new DataTable();

using (SqlConnection conn = GetOpenSqlConnection())
using (SqlCommand cmd = new SqlCommand("SQL Query here", conn))
using (IDataReader rdr = cmd.ExecuteReader())
{
    dt.Load(rdr);
}

I can't comment on the difference between this and using .Fill().

欢心 · 2019-01-06 22:59

The DataAdapter uses a DataReader under the hood, so your experience would likely be the same.

The benefit of the DataAdapter is you cut out a lot of code that would need maintenance.
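
For comparison, a minimal sketch of the adapter route (the query and connection string here are only placeholders):

DataTable dt = new DataTable();

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlDataAdapter da = new SqlDataAdapter("SELECT * FROM Orders", conn))
{
    // Fill drives a DataReader internally, infers the schema, and opens/closes the connection for you
    da.Fill(dt);
}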

This debate is a bit of a religious issue, so definitely look around and decide what works best for your situation.

Summer. ? 凉城 · 2019-01-06 22:59

The DataReader is faster. And if you are using .NET 2.0+ you probably don't even have to use a DataTable; you can use a generic list of your objects.
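
Something like this (a rough sketch; the Order class, column names, and connection string are made up for illustration):

public class Order
{
    public int Id { get; set; }
    public string Customer { get; set; }
}

public static List<Order> ReadOrders(string connectionString)
{
    var orders = new List<Order>();

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("SELECT Id, Customer FROM Orders", conn))
    {
        conn.Open();
        using (var rdr = cmd.ExecuteReader())
        {
            // Map each record straight onto your own type instead of a DataTable
            while (rdr.Read())
            {
                orders.Add(new Order
                {
                    Id = rdr.GetInt32(0),
                    Customer = rdr.GetString(1)
                });
            }
        }
    }
    return orders;
}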
