
Can't import some data via CSV which I can create manually: Data too long for column 'name' at row 1

Hello,
when I try to import a Customer via CSV import with a name length of 75 characters, I get the following error:

Traceback (most recent call last):
  File "/home/frappe/frappe-bench/apps/frappe/frappe/core/doctype/data_import/importer.py", line 126, in import_data
    doc = self.process_doc(doc)
  File "/home/frappe/frappe-bench/apps/frappe/frappe/core/doctype/data_import/importer.py", line 194, in process_doc
    return self.insert_record(doc)
  File "/home/frappe/frappe-bench/apps/frappe/frappe/core/doctype/data_import/importer.py", line 213, in insert_record
    new_doc.insert()
  File "/home/frappe/frappe-bench/apps/frappe/frappe/model/document.py", line 260, in insert
    self.run_post_save_methods()
  File "/home/frappe/frappe-bench/apps/frappe/frappe/model/document.py", line 927, in run_post_save_methods
    self.run_method("on_update")
  File "/home/frappe/frappe-bench/apps/frappe/frappe/model/document.py", line 797, in run_method
    out = Document.hook(fn)(self, *args, **kwargs)
  File "/home/frappe/frappe-bench/apps/frappe/frappe/model/document.py", line 1073, in composer
    return composed(self, method, *args, **kwargs)
  File "/home/frappe/frappe-bench/apps/frappe/frappe/model/document.py", line 1056, in runner
    add_to_return_value(self, fn(self, *args, **kwargs))
  File "/home/frappe/frappe-bench/apps/frappe/frappe/model/document.py", line 791, in <lambda>
    fn = lambda self, *args, **kwargs: getattr(self, method)(*args, **kwargs)
  File "/home/frappe/frappe-bench/apps/erpnext/erpnext/selling/doctype/customer/customer.py", line 84, in on_update
    self.create_primary_contact()
  File "/home/frappe/frappe-bench/apps/erpnext/erpnext/selling/doctype/customer/customer.py", line 104, in create_primary_contact
    contact = make_contact(self)
  File "/home/frappe/frappe-bench/apps/erpnext/erpnext/selling/doctype/customer/customer.py", line 448, in make_contact
    contact.insert()
  File "/home/frappe/frappe-bench/apps/frappe/frappe/model/document.py", line 245, in insert
    self.db_insert()
  File "/home/frappe/frappe-bench/apps/frappe/frappe/model/base_document.py", line 325, in db_insert
    ), list(d.values()))
  File "/home/frappe/frappe-bench/apps/frappe/frappe/database/database.py", line 156, in sql
    self._cursor.execute(query, values)
  File "/home/frappe/frappe-bench/env/lib/python3.7/site-packages/pymysql/cursors.py", line 170, in execute
    result = self._query(query)
  File "/home/frappe/frappe-bench/env/lib/python3.7/site-packages/pymysql/cursors.py", line 328, in _query
    conn.query(q)
  File "/home/frappe/frappe-bench/env/lib/python3.7/site-packages/pymysql/connections.py", line 517, in query
    self._affected_rows = self._read_query_result(unbuffered=unbuffered)
  File "/home/frappe/frappe-bench/env/lib/python3.7/site-packages/pymysql/connections.py", line 732, in _read_query_result
    result.read()
  File "/home/frappe/frappe-bench/env/lib/python3.7/site-packages/pymysql/connections.py", line 1075, in read
    first_packet = self.connection._read_packet()
  File "/home/frappe/frappe-bench/env/lib/python3.7/site-packages/pymysql/connections.py", line 684, in _read_packet
    packet.check_error()
  File "/home/frappe/frappe-bench/env/lib/python3.7/site-packages/pymysql/protocol.py", line 220, in check_error
    err.raise_mysql_exception(self._data)
  File "/home/frappe/frappe-bench/env/lib/python3.7/site-packages/pymysql/err.py", line 109, in raise_mysql_exception
    raise errorclass(errno, errval)
pymysql.err.DataError: (1406, "Data too long for column 'name' at row 1")

If I create this customer manually, it works without any problems.
Any ideas?

What normally goes wrong here is that the “name” field in the CSV file contains single or double quotes, or possibly even commas, in the text of the name. This can cause the program that parses the fields out of the CSV file to generate errors, either by finding extra columns or by concatenating columns in a particular row.

Look for anything in the first row of data that might have characters other than numbers, letters and spaces. This is a common cause for data errors in an import function.
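A quick way to do that check is to let a CSV parser read the file and flag any fields containing characters beyond letters, digits, and spaces. This is just a sketch with Python's standard `csv` module; the sample data here is hypothetical, not the original import file:

```python
import csv
import io
import re

# Hypothetical sample standing in for the import file.
raw = '''customer_name,primary_address
"My ""Sweet"" Customer, Inc.",Some Street 1
'''

# Flag any field in the first data row containing characters other
# than letters, digits and spaces -- the usual import troublemakers.
suspicious = re.compile(r"[^A-Za-z0-9 ]")
reader = csv.reader(io.StringIO(raw))
header = next(reader)
first_row = next(reader)
flagged = [field for field in first_row if suspicious.search(field)]
print(flagged)  # fields worth double-checking before importing
```

Running this over each row (not just the first) makes it easy to spot exactly which record trips the importer.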

Hope this helps…

BKM

Oh wait, which CSV dialect are you parsing? Anything that would create new columns should be escaped; I can open my CSV files with commas in field values in LibreOffice Calc without any problems under default settings.

They are quoted by my CSV export library. So the raw data would be:

customer_name,primary_address
"My,Sweet,Customer","My,Sweet,Customer-Abrechnung"

If your data can be reformatted easily to JSON you might consider using the API instead.

I found I was expending way too much effort manually submitting dozens of CSV files. The API permits me to submit them all at once by running a single shell script calling multiple curl commands.
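As a sketch of that approach: Frappe exposes documents over REST at `/api/resource/<DocType>`, authenticated with an API key/secret token in the `Authorization` header. The field values and credentials below are placeholders; adjust the host and doctype fields to your site:

```python
import json

# Hypothetical customer record; field names follow the Customer doctype.
customer = {
    "customer_name": "My,Sweet,Customer",
    "customer_type": "Company",
}

payload = json.dumps(customer)

# One curl invocation per record; a shell script can loop over many.
# API_KEY / API_SECRET are placeholders for real credentials.
curl_cmd = (
    "curl -X POST https://example.com/api/resource/Customer "
    "-H 'Authorization: token API_KEY:API_SECRET' "
    "-H 'Content-Type: application/json' "
    f"-d '{payload}'"
)
print(curl_cmd)
```

Because the body is JSON, commas and quotes inside field values need no special treatment beyond standard JSON escaping.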


Oh no… I had only considered the CSV route, since that seemed to be the default, or at least better tested.
Except: all my “intermediate” (old system -> intermediate -> ERPNext) data is easily JSON anyway (I even cache it as JSON to save SQL bandwidth).

Yeah, a lot of the available software handles quotes and commas much better than ERPNext does. I am just passing along my experience of having them fail before.

I have had entire import sessions blow up because a name field had an apostrophe in it. I have also found that exporting all fields as text wrapped in quotes can blow up if there is a stray apostrophe or quote symbol.
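If you control the export side, letting a proper CSV writer do the quoting (rather than wrapping fields in quotes by hand) sidesteps most of this: the writer doubles embedded quotes per the usual RFC 4180 convention. A small sketch with Python's `csv` module and made-up names:

```python
import csv
import io

# Names containing the characters that typically break hand-rolled quoting.
rows = [
    ["O'Brien & Sons"],
    ['ACME "Quoted" GmbH'],
]

buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_ALL)
writer.writerow(["customer_name"])
writer.writerows(rows)
print(buf.getvalue())  # embedded quotes come out doubled: ""Quoted""
```

Reading that output back with `csv.reader` restores the original values, stray apostrophes and all.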

The parsing engine that Frappe/ERPNext uses is probably not the greatest. I tripped over these things a great deal when working through different clients' data imports. Things you would expect to work just get interpreted incorrectly.

Not a perfect process, but if you know to look for these issues, it makes them easier to track down.

BKM


OK, I get the same error when using the API…

Can you show your JSON?

I find this site very useful when trying to track down the cause of defective JSON: