I wrote some code that GETs data from an API and inserts it into a database. But I've run into a problem: the API holds about 20 million records, and at the current rate inserting everything into my database will take 43 days 🙂
I think the problem comes from the fact that I insert the records one by one.
That's why I'm looking for a way to execute all my queries at once.
The problem is that I get my data as a list[dict], like this:
<class 'list'>
{
    'siren': 'siren_int',
    'nic': 'nic int',
    'dateCreationEtablissement': '2000-09-26',
    'uniteLegale': {
        'denominationUniteLegale': None,
        'nomUniteLegale': 'last name',
        'prenomUsuelUniteLegale': 'firstname'
    },
    'adresseEtablissement': {
        'numeroVoieEtablissement': None,
        'libelleVoieEtablissement': 'street name',
        'codePostalEtablissement': 'postal code',
        'libelleCommuneEtablissement': 'commune name'
    }
}
Currently my code looks like this:
def insertGlobalBdd(self, rows: list[dict]):
    cursor = self.bdd.cursor()
    for element in rows:
        # One INSERT per record: each row costs a full round trip to the server.
        cursor.execute(
            """INSERT INTO dbo.Entreprise
               (siren, nomUniteLegale, prenomUsuelUniteLegale,
                denominationUniteLegale, dateCreationEtablissement)
               VALUES (?, ?, ?, ?, ?)""",
            (
                element["siren"],
                self.TestFormat(element["uniteLegale"]["nomUniteLegale"]),
                self.TestFormat(element["uniteLegale"]["prenomUsuelUniteLegale"]),
                self.TestFormat(element["uniteLegale"]["denominationUniteLegale"]),
                element["dateCreationEtablissement"],
            ),
        )
    self.bdd.commit()
Would you have any idea how to do this?
Answer
You can do this with executemany().
Example:
task_all = [
    (1, "Jahanzabe", 30),
    (2, "Steve", 20),
    (3, "Haider", 30),
    # ...all others...
]
cur.executemany("INSERT INTO Table_name (id, name, no) VALUES (?, ?, ?)", task_all)
con.commit()
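Adapted to the question's schema, here is a minimal sketch of the same idea. It assumes a DB-API driver that uses ? (qmark) placeholders, such as pyodbc; the batch_size value is illustrative, and TestFormat is the helper from the question:

def insertGlobalBdd(self, rows: list[dict], batch_size: int = 10_000):
    cursor = self.bdd.cursor()
    # Flatten each nested API record into one parameter tuple up front.
    params = [
        (
            element["siren"],
            self.TestFormat(element["uniteLegale"]["nomUniteLegale"]),
            self.TestFormat(element["uniteLegale"]["prenomUsuelUniteLegale"]),
            self.TestFormat(element["uniteLegale"]["denominationUniteLegale"]),
            element["dateCreationEtablissement"],
        )
        for element in rows
    ]
    sql = """INSERT INTO dbo.Entreprise
             (siren, nomUniteLegale, prenomUsuelUniteLegale,
              denominationUniteLegale, dateCreationEtablissement)
             VALUES (?, ?, ?, ?, ?)"""
    # Send the rows in batches and commit per batch, so one transaction
    # never has to hold millions of pending inserts.
    for start in range(0, len(params), batch_size):
        cursor.executemany(sql, params[start:start + batch_size])
        self.bdd.commit()

If the driver is pyodbc talking to SQL Server, also try setting cursor.fast_executemany = True before calling executemany: without it pyodbc still sends one statement per row under the hood, while with it the rows go over as a bulk parameter array, which is usually where the big speedup comes from.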