
Hello :)

I need some help. I have tried two ways of importing a CSV file into a table. Basically I am doing this for automation: I have a CSV file that has new data appended to it every day.

I used two scripts:

 

The first uses the tool pgfutter:

pgfutter --db "Test" --port "5432" --user "postgres" --pw "1111" csv DATA.csv

The only problem there is that if the CSV file has Cyrillic content, it is not displayed correctly. But there is no issue when transferring duplicated rows: they are replaced by the new ones.

 

With the second script the result is the opposite: rows that already exist in the table are not replaced; the duplicates are inserted next to them:

(
echo DELETE from testcsv; COPY testcsv FROM 'D:\data.csv' DELIMITER ',' CSV;
) | "C:\Program Files\PostgreSQL\9.6\bin\psql" -h localhost -p 5432 -U postgres -d Test -w

I have seen on the net that this can be accomplished with temporary tables or an upsert. Can you tell me what the best solution is and give me an example, please?
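
From what I have read, the staging-table plus upsert approach would look roughly like the sketch below. I am assuming here that testcsv has a unique key column (I call it id only as a placeholder) and two other columns col1 and col2 (also placeholders), and that ON CONFLICT is available because the server is 9.6:

BEGIN;

-- load the whole file into a temporary table that is dropped at commit
CREATE TEMP TABLE staging (LIKE testcsv INCLUDING ALL) ON COMMIT DROP;
COPY staging FROM 'D:\data.csv' WITH (FORMAT csv, DELIMITER ',');

-- upsert from the staging table: new keys are inserted,
-- existing keys have their row overwritten with the new values
INSERT INTO testcsv
SELECT * FROM staging
ON CONFLICT (id) DO UPDATE
    SET col1 = EXCLUDED.col1,
        col2 = EXCLUDED.col2;

COMMIT;

One thing I am not sure about: if the same key can appear twice inside one CSV file, the staging table would probably have to be deduplicated first (for example with SELECT DISTINCT ON (id)), otherwise the upsert complains about affecting the same row twice. Is this roughly the right direction?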

 
