Insert nested lists into SQL Server from Python

I have a list that contains many lists in Python.

my_list = [['city', 'state'], ['tampa', 'florida'], ['miami','florida']]

The nested list at index 0 contains the column headers, and the rest of the nested lists contain the corresponding values. How would I insert this into SQL Server using pyodbc or sqlalchemy? I have been using pandas' pd.to_sql and want to make this a pure-Python process. Any help would be greatly appreciated.

The expected output table would look like:

city  | state
------+--------
tampa | florida
miami | florida
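
For reference, what I have been doing so far with pandas looks roughly like this (the connection URL and table name are placeholders):

import pandas as pd
from sqlalchemy import create_engine

my_list = [['city', 'state'], ['tampa', 'florida'], ['miami', 'florida']]

# First nested list is the header row, the rest are data rows
df = pd.DataFrame(my_list[1:], columns=my_list[0])

engine = create_engine(my_sqlalchemy_url)  # e.g. a "mssql+pyodbc://..." URL (placeholder)
df.to_sql('mytable', engine, if_exists='append', index=False)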


Answer

Since the column names come from your list, you have to build the query string yourself to insert the values: column names and table names can't be parameterised with placeholders (?), only values can.

import pyodbc

conn = pyodbc.connect(my_connection_string)  # my_connection_string is a placeholder
cursor = conn.cursor()

my_list = [['city', 'state'], ['tampa', 'florida'], ['miami', 'florida']]

columns = ','.join(my_list[0])              # comma-separated column names
values = ','.join(['?'] * len(my_list[0]))  # one ? placeholder per column
query = "INSERT INTO mytable({0}) VALUES ({1})".format(columns, values)

# Loop through the rest of the list, inserting one row at a time
for row in my_list[1:]:
    cursor.execute(query, row)
conn.commit()  # save the changes
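
Since the column names end up spliced directly into the SQL text, you may also want to quote them as SQL Server identifiers so an odd character in the header row can't break (or alter) the statement. A minimal sketch, where quote_ident is a small helper I've made up rather than anything from pyodbc:

def quote_ident(name):
    # Wrap an identifier in square brackets for SQL Server,
    # doubling any closing bracket inside the name
    return '[' + name.replace(']', ']]') + ']'

columns = ','.join(quote_ident(c) for c in my_list[0])
query = "INSERT INTO mytable({0}) VALUES ({1})".format(columns, values)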

Update:

If you have a large number of records to insert, you can do it in one go with executemany. Change the code like this:

columns = ','.join(my_list[0])              # comma-separated column names
values = ','.join(['?'] * len(my_list[0]))  # one ? placeholder per column
query = "INSERT INTO mytable({0}) VALUES ({1})".format(columns, values)

# Bulk insert: pass all data rows to executemany at once
cursor.executemany(query, my_list[1:])
conn.commit()  # save the changes
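
If you are on a reasonably recent pyodbc (4.0.19 or later) with the Microsoft ODBC driver for SQL Server, turning on fast_executemany before the call usually speeds large bulk inserts up considerably:

cursor.fast_executemany = True  # send rows in batches instead of one round trip per row
cursor.executemany(query, my_list[1:])
conn.commit()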