Wednesday, September 25, 2013

Exporting a local SQL Server table and its data to SQL Azure

For a recent project, I needed to export a table from my local SQL Server to a SQL Azure database.

Exporting an entire database

Now, if you need to export the entire database (as opposed to a single table), it is an easy matter using SQL Server Management Studio (SSMS): right-click on the DB name -> Tasks -> Deploy Database to SQL Azure.

But the problem with trying to export just a single table (and all of its data) into SQL Azure is that the latter does not permit statements like INSERT INTO ... SELECT FROM across servers (or even across databases), because a connection is limited to a single database.
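
For example, a statement along these lines (the table and database names are hypothetical, purely for illustration) is rejected by SQL Azure because it reaches into a second database within the same query:

    -- Not allowed on SQL Azure: the SELECT references another database by name.
    INSERT INTO dbo.Customers (CustomerId, Name)
    SELECT CustomerId, Name
    FROM MyLocalDb.dbo.Customers;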

Therefore, it is not even possible to do the two-step process of first deploying your database from the local server to the Azure server, and then inserting the table data from the resulting Azure database into the desired one within Azure... because that too would involve a SELECT across databases.

It is also not possible to create a BACPAC for just the desired table, and then import the table from that BACPAC into an existing Azure database. BACPAC data can be imported ("deployed") only into a new database in SQL Azure.

Exporting a single table: The robust solution

One possible approach is to write a simple client app (e.g. a C# project) that reads the data from the local database and then pushes it to the remote SQL Azure db. This is a robust approach, but it is not quick or easy if all you need is to push a relatively small table (up to a few tens of thousands of rows) to Azure.

The quick and easy solution for relatively small tables

Here's a solution that accomplishes the job simply and conveniently. It involves generating a script from within SSMS; the script contains CREATE TABLE and INSERT statements with the data.

Here's how to create the script, which includes INSERTs for the data content:

  1. In SSMS, right-click on your local database and go to Tasks -> Generate Scripts.
  2. In the Choose Objects tab, click on Select specific database objects, and then expand the Tables treeview and select the table(s) you want to export.
  3. In the Set Scripting Options tab, click the Advanced button to get to the Options window, and in that window, change 'Types of data to script' to: Schema and data (default is Schema only).
  4. (Take this opportunity to look through the other options such as whether you want the Indexes to be scripted as well - lots to learn here!)
  5. Save the script to a file.
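
For reference, the generated script contains something roughly like the following (a hypothetical two-column Customers table is shown here just to illustrate the shape of the output - your table and row count will obviously differ):

    -- Rough shape of the script that SSMS generates (hypothetical table and data).
    CREATE TABLE [dbo].[Customers](
        [CustomerId] [int] NOT NULL,
        [Name] [nvarchar](100) NULL
    )
    GO
    INSERT [dbo].[Customers] ([CustomerId], [Name]) VALUES (1, N'Alice')
    INSERT [dbo].[Customers] ([CustomerId], [Name]) VALUES (2, N'Bob')
    -- ...one INSERT statement per row of data follows.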

And then run that script and get the data into SQL Azure:


  1. Click on your SQL Azure db name in SSMS (where you want to transfer the data), and then open the script file which you had saved in the previous set of steps. Make sure that the window tab is showing that you're connected to your Azure server and not to your local server.
  2. Look through the top part of the script - there is likely to be a USE statement referring to your source database name. Comment it out, because USE is not supported on SQL Azure and we don't need it.
  3. Make sure that the table being created has a CLUSTERED PRIMARY KEY. If not, add one (adding an IDENTITY column first, if a suitable column does not exist). This is important - Azure will not accept data into the table if it doesn't have a clustered PK. (A sketch of both of these edits appears after these steps.)
  4. Double-check that the connection for this window is correct: SQL Azure server, correct target database.
  5. Run the script.
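
As a rough sketch of steps 2 and 3 (again using a hypothetical Customers table), the top of the edited script might end up looking like this:

    -- USE [MyLocalDb]   -- commented out: USE is not supported on SQL Azure.
    -- GO
    CREATE TABLE [dbo].[Customers](
        [Id] [int] IDENTITY(1,1) NOT NULL,   -- added because the table had no key of its own
        [CustomerId] [int] NOT NULL,
        [Name] [nvarchar](100) NULL,
        CONSTRAINT [PK_Customers] PRIMARY KEY CLUSTERED ([Id])
    )
    GO
    -- ...the generated INSERT statements follow unchanged.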

In my test, the script inserted almost 1,000 rows of data per minute. The total was about 40K rows, so it took nearly 45 minutes to transfer. The connection was also lost once, about 15 minutes into the process, so I had to remove the INSERT statements that had already run (nearly 15K of them) and run the rest of the script again.

In the end, this process saved a lot of time compared to writing client-side code to read the local data and write it remotely, and it was very much worth it for the purpose.

Happy coding!
