Background: I’m making a SQL connection that takes a .csv file and imports it into a SQL Server database table.
The problem I’m running into is the query syntax: there is no single unique identifier for a row in the .csv file I’m importing. It takes 3 fields combined to make a row unique/distinct.
Rough example of the .csv file’s data; the first three columns, taken together, make a row unique:
Order_Id  Product_Id  Date   Other (etc...)
1         1a          1/9    q
1         2a          1/9    q
1         2a          1/10   e
2         1a          1/9    e
2         2a          1/10   e
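To illustrate what I mean by unique, it’s the same idea as declaring those three columns together as a composite primary key. I’m not actually doing this anywhere; the snippet below is just to show the concept, using the column names from my .csv header:

using System.Data;

// Just to illustrate the composite-key idea -- not part of my actual import code.
// The column names here match my .csv header (Order_Id, Product_Id, Date).
DataTable sales = new DataTable();
sales.Columns.Add("Order_Id");
sales.Columns.Add("Product_Id");
sales.Columns.Add("Date");

// Declaring all three columns as the primary key expresses the uniqueness I'm describing:
// no two rows may share the same Order_Id + Product_Id + Date combination.
sales.PrimaryKey = new DataColumn[]
{
    sales.Columns["Order_Id"],
    sales.Columns["Product_Id"],
    sales.Columns["Date"]
};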
This is my query syntax, simplified, in Visual Studio (in reality I’m importing 25 or so columns from the .csv file, so to keep things straight/simple I made the column names exactly the same in both the .csv file and the SQL Server table). The basic syntax looks like this:
private void SaveImportDataToDatabase(DataTable importData)
{
    using (SqlConnection conn = new SqlConnection("Server=localhost;Database=my_Database;Trusted_Connection=True;"))
    {
        conn.Open();

        foreach (DataRow importRow in importData.Rows)
        {
            SqlCommand cmd = new SqlCommand(
                "IF EXISTS(SELECT DISTINCT Order_id, Product_Id, Date FROM Sales " +
                "WHERE Order_id = @Order_id AND Product_Id = @Product_Id AND Date = @Date) " +
                "UPDATE SQL_Sales SET Order_id = @Order_id WHERE Order_id = @Order_id " +
                "ELSE INSERT INTO SQL_Sales (order_id, Product_Id, Date) " +
                "VALUES (@order_id, @Product_Id, @Date);", conn);

            cmd.Parameters.AddWithValue("@Order_id", importRow["Order_id"]);
            cmd.Parameters.AddWithValue("@Product_Id", importRow["Product_Id"]);
            cmd.Parameters.AddWithValue("@Date", importRow["Date"]);

            cmd.ExecuteNonQuery();
        }
    }
}
Once imported, I see a few problems in the SQL Server table:
- The order_id field will be null
- It only imports a small fraction of the data, about 50 of 2000 records
- If I re-import the data after a change in the .csv file, say a single new row, I get 100 of 2000 records
I’m not sure if what I’m trying to do is possible or worth it. Should I be breaking this down a lot more instead of doing it all in one query? I’m not necessarily new to coding, but I don’t code very often, I’m rusty, and this is my first C# project, so lend me some patience if possible.
Just wanted to add more code in response to @casey crookston; it’s possible that problems 2 and 3 are related to my loop:
private void btnImport_Click(object sender, EventArgs e)
{
    Cursor = Cursors.WaitCursor;

    DataTable importData = GetDataFromFile();
    if (importData == null)
        return;

    SaveImportDataToDatabase(importData);

    MessageBox.Show("Import Successful");
    txtFileName.Text = string.Empty;
    Cursor = Cursors.Default;
}

private DataTable GetDataFromFile()
{
    DataTable importedData = new DataTable();

    try
    {
        using (StreamReader sr = new StreamReader(txtFileName.Text))
        {
            string header = sr.ReadLine();
            if (string.IsNullOrEmpty(header))
            {
                MessageBox.Show("No File Data");
                return null;
            }

            string[] headerColumns = header.Split(',');
            foreach (string headerColumn in headerColumns)
            {
                importedData.Columns.Add(headerColumn);
            }

            while (!sr.EndOfStream)
            {
                string line = sr.ReadLine();
                if (string.IsNullOrEmpty(line))
                    continue;

                string[] fields = line.Split(',');
                DataRow importedRow = importedData.NewRow();

                for (int i = 1; i < fields.Count(); i++)
                {
                    importedRow[i] = fields[i];
                }

                importedData.Rows.Add(importedRow);
            }
        }
    }
    catch (Exception e)
    {
        Console.WriteLine("The file could not be read:");
        Console.WriteLine(e.Message);
    }

    return importedData;
}
Answer
This looks like a good spot to use SQL Server’s MERGE syntax:
merge sales s
using (values (@order_id, @product_id, @date, @other_1, @other_2))
      as p(order_id, product_id, date, other_1, other_2)
on (s.product_id = p.product_id and s.order_id = p.order_id and s.date = p.date)
when matched then
    update set s.other_1 = p.other_1, s.other_2 = p.other_2
when not matched by target then
    insert (order_id, product_id, date, other_1, other_2)
    values (p.order_id, p.product_id, p.date, p.other_1, p.other_2);
This uses the first three columns as the primary key; when a tuple already exists, the columns other_1 and other_2 are updated with the values that would otherwise have been inserted.
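If it helps to see it in context, here is roughly how that statement could slot into your existing SaveImportDataToDatabase loop. This is only a sketch: I’m assuming the target table is named Sales and using two placeholder columns, Other_1 and Other_2, to stand in for the remaining columns you import, so adjust the names to your real schema.

private void SaveImportDataToDatabase(DataTable importData)
{
    using (SqlConnection conn = new SqlConnection("Server=localhost;Database=my_Database;Trusted_Connection=True;"))
    {
        conn.Open();

        foreach (DataRow importRow in importData.Rows)
        {
            // One MERGE per csv row: update the row when the three-column key already
            // exists, insert it otherwise. Table and "Other_*" column names are placeholders.
            using (SqlCommand cmd = new SqlCommand(
                @"merge Sales s
                  using (values (@order_id, @product_id, @date, @other_1, @other_2))
                        as p(order_id, product_id, date, other_1, other_2)
                  on (s.order_id = p.order_id and s.product_id = p.product_id and s.date = p.date)
                  when matched then
                      update set s.other_1 = p.other_1, s.other_2 = p.other_2
                  when not matched by target then
                      insert (order_id, product_id, date, other_1, other_2)
                      values (p.order_id, p.product_id, p.date, p.other_1, p.other_2);", conn))
            {
                cmd.Parameters.AddWithValue("@order_id", importRow["Order_Id"]);
                cmd.Parameters.AddWithValue("@product_id", importRow["Product_Id"]);
                cmd.Parameters.AddWithValue("@date", importRow["Date"]);
                cmd.Parameters.AddWithValue("@other_1", importRow["Other_1"]);
                cmd.Parameters.AddWithValue("@other_2", importRow["Other_2"]);
                cmd.ExecuteNonQuery();
            }
        }
    }
}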