Moving Data from Teradata to SQL Server
Update – December 2016
I wrote this article back in 2013, based on my experiences at the time working between SQL Server & Teradata. With the benefit of time (and learning more!), there are probably better ways I could have achieved what I've outlined below, most notably using Integration Services. That said, it was a good learning experience, and hopefully it still holds some value for those unable or unwilling to use SSIS.
I work with both Teradata and SQL Server. Generally speaking, there isn’t much need to move data between the two – they serve different purposes – and when I do, a simple Select statement via a linked server works as well as I need it to. However, I recently needed to move a larger amount of data, where this approach would not be suitable.
It took a little elbow grease and persuasion to get everything to work together, but eventually it did, and I have a single Batch File I can execute / schedule which does the whole job.
I would point out that many of the approaches used in what follows are pretty new to me, so I don't pretend to be an expert – and there may well be other, better ways of doing parts of it, but it works, and it's fast!
Part 1 – Getting the data out of Teradata
Although there are many ways you can extract data from Teradata, if you want to bulk extract, and you want it to be quick, there’s really only one option – Fast Export. When compared to the other Teradata Utilities, such as BTEQ & FASTLOAD, I do find Fast Export a bit trickier to work with – but it’s the only realistic option.
The knack to getting FastExport to output in a format you can then easily import into another system is to CAST your entire row as a single string, with a [TAB] delimiter between each field. Moreover, you must cast as a fixed-width string (ie, CHAR not VARCHAR), otherwise you get an additional, unreadable column at the beginning of your output file (this is the length indicator Teradata writes in front of variable-length output). Your export script will need to look something like this…
.LOGTABLE DataBaseName.Export_FE_Log_d;
.LOGON server/uid,pw;
.BEGIN EXPORT SESSIONS 20;
.EXPORT OUTFILE "F:\Uploads\OutPut.txt"
    FORMAT TEXT
    MODE RECORD;
SELECT CAST(TRIM(
        CAST(ZEROIFNULL(Year_Week_Number) AS INTEGER)
||' '|| CAST(OD_Group_Description AS VARCHAR(30))
~
||' '|| CAST(ZEROIFNULL(Depot_Stock_Cost) AS DECIMAL(15,2))
||' '|| CAST(ZEROIFNULL(Depot_Stock_Retail) AS DECIMAL(15,2))
        ) AS CHAR(800)) AS NN
FROM DataBaseName.TableName;
.END EXPORT;
.LOGOFF;
You can then save this as a text file, (eg script.txt) and call this from the command prompt
CD F:\Uploads
fexp < script.txt > result.txt
(The result.txt file will be created, and gives you details of any errors etc.)
You should now be looking at a tab-delimited flat file. If you want to change to pipe or comma delimiters, change what you have between the concatenation operators, eg
||','|| or ||'|'||
To give some context, FastExport can do in under 3 minutes what took BTEQ 30 minutes to achieve.
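Before moving on to the load, it's worth a quick sanity check of the export file: because every row was cast to a single delimited string, each line should split into the same number of tab-separated fields. Here's a minimal sketch of such a check (the function name is my own; the path and field count you pass in would come from your own export):

```python
# Hypothetical sanity check for a FastExport flat file: every line
# should contain the same number of tab-separated fields.

def check_export(path, expected_fields):
    """Return a list of (line_number, field_count) for malformed lines."""
    bad = []
    with open(path, encoding="ascii", errors="replace") as f:
        for lineno, line in enumerate(f, start=1):
            fields = line.rstrip("\r\n").split("\t")
            if len(fields) != expected_fields:
                bad.append((lineno, len(fields)))
    return bad  # an empty list means the file looks consistent
```

You might call it as check_export(r"F:\Uploads\OutPut.txt", 4), adjusting the count to match your SELECT list, before kicking off the import.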
Part 2 – Prepping your table in SQL Server.
I will assume you already have your table created in SQL Server, which means we're ready to load it. However, in most cases (though not all, of course) we would want to wipe the table before loading it. The loading script can't delete first; it can only append. But there's another command-line utility which allows us to perform this step.
The program is called SQLCMD, a fully-fledged utility for managing SQL Server from the command line. We don't want all of that; we just want to run a single command and move on – but that's OK, because it can be run in that mode as well.
The syntax to use is as follows (there are dozens of parameters available – I'm only showing the ones I needed):
SQLCMD -S SERVER_ADDRESS /Q "DELETE FROM Database.Owner.TableName"
So, you launch SQLCMD; -S followed by your server address tells it what to connect to, and /Q tells it you want to just run a single command and quit again. Then give it your DELETE statement. I'm using Windows Authentication, but if you're not, you'll also need to provide your credentials in this script, using the -U and -P flags, eg
SQLCMD -S SERVER_ADDRESS -U uid -P pwd
Part 3 – Importing into SQL Server.
SQL Server provides a powerful command-line utility for importing data called Bulk Copy – or 'bcp'. What (in my view) makes it powerful is that you can provide what is referred to as a Format File. This is a standalone file which you supply to bcp along with your import data; it tells bcp what format your data is in, and which fields it should be mapped to. It can be a little unnerving at first, but it's mostly very straightforward. Even better, there is a command you can run for SQL Server to create your Format File for you – though we'll need to edit it. I would point out at this stage that there are two types of Format File – *.fmt files, and the newer type, *.xml files. Where you're using new systems, you should probably use XML format files, but I was in a bit of a hurry, so used the older .fmt file – but may well change this in the future.
To create a Format File, run the following command:
bcp Database.Owner.TableName format nul -f MyFormatFile.fmt -n -T
This will create a file called MyFormatFile.fmt, which is the starting point for our own Format File. Open it up in Notepad. First you'll see a version number, and a figure showing the total number of columns – this should match the number of columns in your import data, so change it if it doesn't. Then, you'll see each field from your table listed, showing:
- Field Number – will be sequential, and only needs changing if the order of your Fast Export file differs from your SQL Server Table. It should be the order they appear in the FE file.
- Data Type – Remember we had to cast as CHAR? Well, that means we need to specify in this file that ALL fields are 'SQLCHAR' – so go through and change these if required
- Prefix Length – I leave all of mine as Zero
- Data Type Length – should match the field you’re importing
- Field / Row Terminator – This is your delimiter. If you used [TAB], this should read '\t'; if it was comma, ',', etc. Except for your last field – this will, in most cases, want to read '\r\n' to denote the end of the row
- Table Column Mapping Number – this is the sequential column number in your destination table you want to map it to. Mostly, this will be the same as your Field Number, but if your columns are in different orders, you will want to change this. Or, if you want to skip a column, set this field as 0
- Column Name (as per SQL Server Table) – Name of the column in your destination table
- Column Collation – for true Character fields, you may want / need to set this, such as to SQL_Latin1_General_CP1_CI_AS
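Putting those pieces together, an edited non-XML format file for a three-column cut of the FastExport example above might look something like this. Treat it as an illustration only: the version number, field lengths and column names are assumptions, and yours will depend on your SQL Server version and table.

```
9.0
3
1   SQLCHAR   0   12    "\t"     1   Year_Week_Number       ""
2   SQLCHAR   0   30    "\t"     2   OD_Group_Description   SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   800   "\r\n"   3   Depot_Stock_Retail     ""
```

Note the generous length on the last field – because the whole row was cast to CHAR(800), the final field carries the trailing padding up to the '\r\n'.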
Then save the updated file. We can then trigger the upload using the bcp utility again. (I’ve assumed your import data file and format file are in the same place)
BCP Database.Owner.TableName in "F:\Uploads\OutPut.txt" -f "F:\Uploads\MyFormatFile.fmt" -T
(The -T flag denotes that you wish to use Windows Authentication – use the -U & -P flags if you need to provide a username & password)
This will import your data, and you should get a visual reference showing the progress.
Part 4 – Tie it all together
The beauty of using these command utilities is that we can tie them into a single .bat file. In my folder, I have 5 files:
- Script.txt (my Fast Export Script)
- Result.txt (the log / results file produced by Fast Export)
- Output.txt (the exported data from Fast Export)
- MyFormatFile.fmt (the bcp Format File)
- ExportAndLoad.bat (my batch file)
My finished Batch file looks like this…
cd "F:\Uploads"
fexp < script.txt > result.txt
SQLCMD -S MYSERVER /Q "DELETE FROM Database.dbo.Table"
BCP Database.dbo.Table in "F:\Uploads\OutPut.txt" -f "F:\Uploads\MyFormatFile.fmt" -T
And there you have it. Running that Batch File will do the export, the wipe, and the loading into SQL Server.
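One thing the plain batch file won't do is stop when a step fails – if the export breaks, it will still happily wipe and (fail to) reload your table. If that worries you, the same three steps could be driven from a small script that halts at the first failing command. This is only a sketch of the idea, not part of the original solution; the stage names and commands are placeholders for the ones above:

```python
# Hypothetical wrapper: run each stage in order, stopping at the
# first command that exits with a non-zero return code.
import subprocess

def run_stages(stages):
    """stages is a list of (name, command) pairs. Returns the name of
    the first stage that fails, or None if everything succeeded."""
    for name, cmd in stages:
        result = subprocess.run(cmd, shell=True)
        if result.returncode != 0:
            return name  # stop here; later stages never run
    return None
```

Used with the steps above, the stages list would be something like [("export", "fexp < script.txt > result.txt"), ("wipe", 'SQLCMD -S MYSERVER /Q "DELETE FROM Database.dbo.Table"'), ("load", "BCP ...")].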
You could take it one step further, and attach this to a task schedule, but I want to trigger mine manually (at least for now!), so this is as far as I go for the time being!
Hope that helps someone – it's all 'out there' on the web, but I found myself visiting dozens of sites and talking to many other people to piece it all together, so here you have it all in one place.