postgresql - fastest way of inserting data into a table


I have a Postgres database and have been inserting data into a table. Because of issues with the internet connection, some of the data couldn't be written. The file I am trying to write to the database is large (about 330,712,484 rows; even the wc -l command takes a while to complete).

Now, the column row_id (integer) is the primary key, and it is indexed. Since some of the rows were not inserted into the table, I want to insert just these specific rows. (I estimate about 1.8% of the data isn't in the table.) As a first step, I tried to check which of the primary keys are already inside the database like so:

    import csv
    import psycopg2

    conn = psycopg2.connect(connector)
    cur = conn.cursor()

    with open(filename) as f:
        header = f.readline().strip()
        header = list(csv.reader([header]))[0]
        print(header)
        for i, l in enumerate(f):
            if i > 10:
                break
            print(l.strip())
            row_id = l.split(',')[0]
            query = 'select * from raw_data.chartevents where row_id={}'.format(row_id)
            cur.execute(query)
            print(cur.fetchall())

    cur.close()
    conn.close()

Even for the first few rows of data, checking whether the primary key exists takes a large amount of time.
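One reason the per-key lookup above is slow is that every iteration makes a full client-server round trip. A hedged sketch of a set-based alternative: fetch all existing keys once, then test membership locally. The table and column names come from the question; whether roughly 330 million integers fit in memory as a Python set is an assumption about the machine (it can take several GB).

```python
def existing_row_ids(cur):
    """Pull every existing primary key in a single query.

    `cur` is an open psycopg2 cursor; table/column names are from
    the question.
    """
    cur.execute("SELECT row_id FROM raw_data.chartevents")
    return {row[0] for row in cur}


def missing_lines(lines, known_ids):
    """Yield CSV lines whose leading row_id is not yet in the table."""
    for line in lines:
        row_id = int(line.split(",", 1)[0])
        if row_id not in known_ids:
            yield line
```

The remaining lines can then be inserted in batches instead of being checked one by one.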

What is the fastest way of doing this?

The fastest way to insert data into PostgreSQL is the COPY protocol, which is implemented in psycopg2. COPY does not let you check whether the target id already exists, though. The best option is to COPY the file's contents into a temporary table and then INSERT or UPDATE from it, as in the batch update article I wrote on my http://tapoueh.org blog a while ago.

With a recent enough version of PostgreSQL you may use

    insert ... select * from copy_target_table
        on conflict (pkey_name) do nothing
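Putting the answer's two steps together, here is a minimal psycopg2 sketch under stated assumptions: the staging-table name, connection string, and CSV layout (header row, columns matching the target) are hypothetical; the target table raw_data.chartevents and the row_id primary key come from the question.

```python
# SQL for the merge step: rows whose row_id already exists are skipped.
UPSERT_SQL = """
    INSERT INTO raw_data.chartevents
    SELECT * FROM chartevents_staging
    ON CONFLICT (row_id) DO NOTHING
"""


def load_missing_rows(conn, csv_file):
    """COPY the whole CSV into a temp staging table, then merge it."""
    with conn.cursor() as cur:
        # Staging table shaped like the target (name is an assumption).
        cur.execute(
            "CREATE TEMP TABLE chartevents_staging "
            "(LIKE raw_data.chartevents INCLUDING DEFAULTS)"
        )
        # One streamed COPY instead of millions of individual statements.
        cur.copy_expert(
            "COPY chartevents_staging FROM STDIN WITH (FORMAT csv, HEADER true)",
            csv_file,
        )
        cur.execute(UPSERT_SQL)
    conn.commit()


if __name__ == "__main__":
    import psycopg2  # connection parameters below are hypothetical
    with psycopg2.connect("dbname=mydb") as conn, open("chartevents.csv") as f:
        load_missing_rows(conn, f)
```

Because ON CONFLICT needs a unique index on the conflict target, this relies on row_id being the primary key, which the question confirms.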
