Database Connectivity in .NET: ADO.NET basics

Connecting to databases and retrieving and manipulating data is a key part of most data-driven applications. .NET provides a robust framework called ADO.NET for working with relational data using languages like C# and VB.NET. Gaining a solid grasp of ADO.NET will enable you to interact with databases efficiently in your .NET applications.

Let’s dive in!

ADO.NET Architecture

ADO.NET refers to the data access libraries included with .NET for connecting to data sources like SQL Server and retrieving records. It provides an abstraction layer between your application code and the database driver. The key components include:

  • Data Providers: Components that bridge communication between your app and a specific data source, whether a database, file, or XML document. Database providers connect to their respective database servers and expose data source functionality to ADO.NET.

  • Data Sets: In-memory store of data retrieved from data source. Useful for data binding UI controls, batch updates, caching query results locally, and persisting to XML.

  • Data Readers: Forward-only stream of rows returned from a query. Readers provide efficient, read-only access but in a connected fashion requiring the database connection to remain open.

  • Commands: Represent SQL statements or stored procedures executed against a database. Commands encapsulate the action to perform.

  • Connections: Represent a session and communication channel to the database. Connections enable running commands and retrieving results.

  • Parameters: Strongly typed placeholders for passing arguments when executing parameterized SQL statements or stored procedures. Parameters protect against SQL injection.

These core objects collaborate to enable querying, updating, and manipulating relational databases efficiently, as the short sketch below illustrates.
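To see how these pieces fit together, here is a minimal sketch that opens a connection, runs a query through a command, and streams the results with a data reader. The server name, database name, and Employees table are placeholders for illustration.

using System;
using System.Data.SqlClient;

//Connection: the session with the database (placeholder connection string)
using (var connection = new SqlConnection("Server=localhost;Database=SampleDB;Trusted_Connection=True;"))
{
    connection.Open();

    //Command: the SQL statement to execute (hypothetical Employees table)
    using (var command = new SqlCommand("SELECT EmployeeID, FirstName FROM Employees", connection))
    //Data reader: forward-only stream over the result set
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            Console.WriteLine($"{reader.GetInt32(0)}: {reader.GetString(1)}");
        }
    }
}

Next, we’ll explore creating connections.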

Creating Database Connections

The first step when working with a database is establishing a valid connection. This requires referencing the appropriate ADO.NET data provider in your application and using a connection string:

//Reference the SQL Server data provider
using System.Data.SqlClient;

//Create a SQL connection string builder specifying the server, database, and integrated security
var builder = new SqlConnectionStringBuilder()
{
    DataSource = "localhost",
    InitialCatalog = "SampleDB",
    IntegratedSecurity = true
};

//Build the connection string
var connectionString = builder.ConnectionString;

//Instantiate a SQL connection object, passing the connection string
using (var connection = new SqlConnection(connectionString))
{
    //Open the connection
    connection.Open();

    //Define commands, readers, etc. here

    //Close the connection (the using block also disposes it)
    connection.Close();
}

Connection strings act as addresses pointing your application to the database and supplying access credentials. The using block ensures the connection is disposed of deterministically.

ADO.NET includes built-in support and data providers for SQL Server, OLE DB, ODBC, Oracle, and many other database platforms. Providers handle mapping ADO.NET functions to specific vendor APIs.

Third-party data providers are available for additional databases such as MySQL and PostgreSQL that are not included out of the box.
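As an illustration, third-party providers follow the same pattern. Here is a minimal sketch using Npgsql for PostgreSQL, assuming the Npgsql NuGet package is installed and using placeholder credentials:

using System;
using Npgsql;

//Connection string with placeholder host, database, and credentials
var connectionString = "Host=localhost;Database=SampleDB;Username=app_user;Password=secret";

using (var connection = new NpgsqlConnection(connectionString))
{
    connection.Open();

    //Commands work the same way as SqlCommand
    using (var command = new NpgsqlCommand("SELECT version()", connection))
    {
        var version = (string)command.ExecuteScalar();
        Console.WriteLine(version);
    }
}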

Creating Database Commands

After opening a valid connection, we can create Command objects that represent SQL statements or stored procedures we want to execute against the database:

//Non-parameterized query command
var command = new SqlCommand("SELECT * FROM Employees", connection);

//Parameterized command querying employees by ID
var paramCommand = new SqlCommand("SELECT * FROM Employees WHERE EmployeeID = @ID", connection);

//Populate the parameter placeholder
paramCommand.Parameters.AddWithValue("@ID", 1);

//Or a stored procedure command
var procCommand = new SqlCommand("GetEmployeeByID", connection);
procCommand.CommandType = CommandType.StoredProcedure;

When creating the Command object, we specify the SQL query or stored procedure name directly as text. Commands can be re-used across multiple connections.

For parameterized queries, command parameters act as placeholders we populate before execution. The CommandType property differentiates between plain SQL text and stored procedures.

Executing SQL Statements and Stored Procedures

After defining Command objects, we can execute them against the open connection:

//Execute the command; returns the number of rows affected
var rowsAffected = command.ExecuteNonQuery();

//For queries, execute and get back a data reader instance
var reader = command.ExecuteReader();

ExecuteNonQuery is best for INSERT, UPDATE, and DELETE commands that modify rather than retrieve data. The rows-affected count provides a simple way to verify that the expected changes occurred, as sketched below.
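Here is a quick sketch of a parameterized INSERT executed with ExecuteNonQuery; the Employees table and its columns are assumed for illustration:

//Parameterized INSERT against a hypothetical Employees table
var insertCommand = new SqlCommand(
    "INSERT INTO Employees (FirstName, LastName) VALUES (@first, @last)", connection);
insertCommand.Parameters.AddWithValue("@first", "Ada");
insertCommand.Parameters.AddWithValue("@last", "Lovelace");

var rowsInserted = insertCommand.ExecuteNonQuery();
//rowsInserted should be 1 if the statement succeeded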

ExecuteReader returns a forward-only, read-only data reader we can iterate over to access the result set rows returned by our SELECT query.

ExecuteScalar is useful when a query returns a single value, such as a count, because it returns the first column of the first row of the result set. Stored procedures can also return data through output parameters, which are read from the command's Parameters collection after execution:

//Stored procedure with an int output parameter
var command = new SqlCommand("GetEmployeeCount", connection);
command.CommandType = CommandType.StoredProcedure;

//Define the output parameter
command.Parameters.Add("@Count", SqlDbType.Int);
command.Parameters["@Count"].Direction = ParameterDirection.Output;

//Execute the stored procedure
command.ExecuteNonQuery();

//Retrieve the output value
var employeeCount = (int)command.Parameters["@Count"].Value;

Because the output parameter is declared explicitly, its value is available on the command's Parameters collection once ExecuteNonQuery completes.
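For comparison, here is a sketch of ExecuteScalar retrieving a single value directly from a query (the Employees table is assumed):

//ExecuteScalar returns the first column of the first row of the result set
var countCommand = new SqlCommand("SELECT COUNT(*) FROM Employees", connection);
var employeeCount = (int)countCommand.ExecuteScalar();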

Working with Data Readers

Data reader objects represent the forward-only, read-only result set returned by executing a query command. They provide efficient, connected access for iterating through rows:

//Execute a SQL command that returns rows
var reader = command.ExecuteReader();

//Iterate through the reader
while (reader.Read())
{
    //Access column data by index
    var id = reader.GetInt32(0);

    //Or by column name
    var name = (string)reader["FirstName"];
}

//Always close the reader when finished
reader.Close();

Data readers expose properties like FieldCount and HasRows for inspecting result set metadata. Methods like GetString and GetInt32 retrieve typed column values from the current row by ordinal, while the indexer retrieves them by column name.

Read() advances to the next row and returns false when the end is reached, so results can be processed in a loop. Close() must be called when done to release resources.
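A common alternative, sketched below, is to wrap the reader in a using block so it is closed automatically:

using (var reader = command.ExecuteReader())
{
    while (reader.Read())
    {
        var id = reader.GetInt32(0);
        var name = (string)reader["FirstName"];
    }
} //the reader is closed and disposed here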

Connection Pooling

Opening and closing connections constantly can hurt performance through added round trips. ADO.NET mitigates this through connection pooling.

The pool manager maintains a cache of active connections that can be reused across multiple data access requests by transparently grabbing and returning connections. This avoids costly re-opening of connections repeatedly.

Configure pooling parameters in your connection string appropriately for your workload patterns. The pool also works across multiple threads requesting connections.
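For example, here is a sketch of pooling-related keywords on a SQL Server connection string; the values are illustrative and should be tuned for your workload:

//Pooling is on by default; these keywords tune its behavior
var builder = new SqlConnectionStringBuilder()
{
    DataSource = "localhost",
    InitialCatalog = "SampleDB",
    IntegratedSecurity = true,
    Pooling = true,
    MinPoolSize = 5,
    MaxPoolSize = 100
};

var connectionString = builder.ConnectionString;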

Working with Database Transactions

Most data operations require multiple inserts, updates or deletes as a single logical unit of work. Unit of work refers to a set of data modifications that succeed or fail together – maintaining data integrity.

Transactions provide this mechanism grouping multiple SQL statements so they are treated atomically rather than individually. If any part fails, the entire transaction is rolled back leaving the database unmodified.

ADO.NET enables programmatic transaction management through the Connection object:

// Start a new transaction
connection.Open();

var transaction = connection.BeginTransaction();

try
{
    // Enlist the command in the transaction, then run the SQL statements
    command.Transaction = transaction;
    command.ExecuteNonQuery();

    // Commit if everything succeeds
    transaction.Commit();
}
catch (Exception ex)
{
    // Roll back if anything fails
    transaction.Rollback();

    // Handle the error
}

The benefit over auto-commit mode is that we can batch operations knowing that if any one fails, none will be persisted. Great for complex business workflows!

Parameterizing Commands to Prevent Injection

Directly injecting untrusted input into SQL statements makes apps vulnerable to injection attacks compromising security:

//DANGER! Don't concatenate user input
var sql = "SELECT * FROM Products WHERE Category = '" + input + "'";

User input could terminate the string prematurely or alter the query logic with injected SQL. Treat all input as untrusted.

Instead, use parameterized queries to avoid injection:

//SAFE - use parameters for input values
var command = new SqlCommand("SELECT * FROM Products WHERE Category = @category", connection);

//Add the parameter
command.Parameters.AddWithValue("@category", input);

Parameters enforce type safety and ensure input is treated as a data value rather than as part of the SQL text. This neutralizes injection attacks even for malicious input.

Retrieving Database Schema Information

ADO.NET provides metadata collections on connections for retrieving information about database structure programmatically:

//Query database schema information
var tables = connection.GetSchema("Tables");

//Inspect the metadata
foreach (System.Data.DataRow table in tables.Rows)
{
    // Table name, schema, etc.
    string name = table["TABLE_NAME"].ToString();
}

Schema datasets describe objects like tables, views, procedures, and columns, along with properties. We can generate mappings, docs, graphs, ER diagrams and more dynamically from this metadata.

Enabling Asynchronous Operations

Synchronous data access, where commands block the current thread while executing, can limit scalability in high-throughput applications because blocked threads sit idle waiting on I/O. ADO.NET addresses this by providing asynchronous versions of methods such as ExecuteReaderAsync:

// Async execution releases the current thread while waiting on I/O
var reader = await command.ExecuteReaderAsync();

// The UI remains responsive and other logic can run while the query executes
while (await reader.ReadAsync())
{
    // Process the row
}

await reader.CloseAsync();

Asynchronous operations avoid blocking the calling thread, so user interfaces stay responsive and parallel work can progress while waiting on I/O.

Optimizing Data Access Performance

There are various ways to optimize data access code for maximum efficiency:

  • Batching – Batch SQL statements into a single round trip instead of sending individual commands (see the sketch after this list)

  • Asynchronous Operations – Avoid blocking threads, enable parallelism

  • Parameterize Queries – Reuse parameterized commands instead of runtime SQL construction

  • Data Readers – Stream data as needed rather than materializing all results upfront with datasets

  • Connection Pooling – Reuse open connections instead of repeatedly opening and closing them

  • Column Filtering – Request only the columns you need instead of entire rows with uncontrolled SELECT * queries

  • Indexes – Create indexes on frequently filtered columns to avoid full table scans

  • Stored Procedures – Encapsulate data logic on server instead of client round trips

  • MARS – Enable multiple active result sets from same connection when logic requires it

There are no silver bullets, so profiling queries using database tools is crucial for optimization.
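As a sketch of the batching idea above, several statements can be sent in one round trip by combining them into a single command; the table and column names here are placeholders:

//Two statements combined into one command text execute in a single round trip
var batch = new SqlCommand(
    "UPDATE Employees SET IsActive = 0 WHERE TerminationDate < @cutoff; " +
    "DELETE FROM LoginTokens WHERE Expiry < @cutoff;", connection);
batch.Parameters.AddWithValue("@cutoff", DateTime.UtcNow.AddYears(-1));

//Total rows affected across both statements
var rowsAffected = batch.ExecuteNonQuery();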

Conclusion

ADO.NET underpins data access in .NET. Core objects like connections, commands and readers enable querying, updating and materializing result sets efficiently. Numerous performance optimizations exist around pooling, parameterization, MARS and more.

For simpler abstractions, micro-ORMs like Dapper provide lighter-weight options as well. Robust data access capabilities form the bedrock of most line-of-business applications!

This concludes our journey into ADO.NET. You should now feel comfortable working with databases programmatically from your .NET apps.
