Insert Many Rows From an Extracted Text File

Hi all,

My situation is this: I have a text file from a bank, I parse it to match my DocType fields, and I insert the rows using the Frappe ORM. The problem is that it's slow, even though the file is not large, only 100 records.

    for row in records:
        data = row.split(";")

        doc = frappe.new_doc('Payment Transaction')
        doc.field1 = data[0]
        doc.field2 = data[1]
        doc.field3 = data[2]
        doc.field4 = data[3]
        doc.insert(ignore_permissions=True)

This took about 8 seconds to finish.

What puzzles me is that when I make the doc a child table of another DocType and insert the records that way, it finishes in a reasonable time.

Is there a solution like collecting the records in a list of dicts and inserting them all in one go at the end of the process?

example:

    record_to_insert = []
    for row in records:
        data = row.split(";")

        doc = frappe.new_doc('Payment Transaction')
        doc.field1 = data[0]
        doc.field2 = data[1]
        doc.field3 = data[2]
        doc.field4 = data[3]
        record_to_insert.append(doc)

    # something like this -- does such an API exist?
    frappe.save(record_to_insert)

Hope to get some help from you guys. Thank you.

Got any solution?
I am also facing the same issue.

Hi @vivek22793, the solution is to insert and commit each transaction. Or, if you are dealing with a big list, use raw SQL via frappe.db.sql.
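
For the raw SQL route, here is a minimal sketch, assuming the DocType is backed by the usual `tabPayment Transaction` table and the field1..field4 column names used above (adjust to your real schema). Keep in mind that raw inserts bypass validations, naming series, and document hooks, so only take this shortcut if you trust the input file:

    import frappe
    from frappe.utils import now

    def bulk_insert_payment_transactions(records):
        # One tuple of column values per line in the bank file.
        ts = now()
        rows = []
        for line in records:
            data = line.split(";")
            rows.append((
                frappe.generate_hash(length=10),  # name (primary key)
                ts,                               # creation
                ts,                               # modified
                frappe.session.user,              # owner
                data[0], data[1], data[2], data[3],
            ))

        # Build one multi-row INSERT instead of 100 ORM inserts.
        # Other standard columns (modified_by, docstatus, ...) are
        # left to their table defaults here.
        placeholders = ", ".join(["(%s, %s, %s, %s, %s, %s, %s, %s)"] * len(rows))
        values = [v for row in rows for v in row]
        frappe.db.sql(
            """INSERT INTO `tabPayment Transaction`
               (name, creation, modified, owner,
                field1, field2, field3, field4)
               VALUES {}""".format(placeholders),
            values,
        )
        frappe.db.commit()

Depending on your Frappe version there may also be a frappe.db.bulk_insert helper that wraps the same pattern; check the framework source before relying on it.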