I have a CSV file and want to import part of it into my MySQL database. Based on this question, I tried this command in the MySQL command line:
load data local infile 'mycsv.csv’ into table mytable fields terminated by ',' enclosed by '"' lines terminated by 'n' (entity_ticker,harvested_at, entity_sector,entity_competitors,story_type,story_source,event,entity_sentiment,event_sentiment,event_relevance,entity_relevance,entity_industry)
The columns in parentheses are the ones I want to put into my table. Their order matches my table, but not the CSV file.
But MySQL didn't do anything, as shown here:
mysql> load data local infile '2013first1000.csv’ into table news_dataset fields terminated by ',' enclosed by '"' lines terminated by 'n' (entity_ticker,harvested_at, entity_sector,entity_competitors,story_type,story_source,event,entity_sentiment,event_sentiment,event_relevance,entity_relevance,entity_industry)
    ">
    ">
    ">
    ">
These are the first two rows of my file, opened as plain text:
signal_id,story_id,story_group_id,new_story_group,story_group_count,source_id,author_id,story_type,story_source,templated_story_score,story_traffic,story_group_traffic_sum,story_group_exposure,entity_sentiment,event_sentiment,story_sentiment,story_group_sentiment_avg,story_group_sentiment_stdev,entity_name,entity_ticker,entity_exchange,entity_relevance,entity_country,entity_indices,entity_industry,entity_region,entity_sector,entity_competitors,entity_type,entity_composite_figi,entity_exch_code,entity_figi,entity_market_sector,entity_security_description,entity_security_type,entity_share_class_figi,entity_unique_id,entity_unique_id_fut_opt,entity_author_republish_score,entity_author_timeliness_score,entity_source_republish_score,entity_source_timeliness_score,event,event_group,event_relevance,event_author_republish_score,event_author_timeliness_score,event_source_republish_score,event_source_timeliness_score,event_impact_pct_change_avg,event_impact_pct_change_stdev,event_impact_pos,event_impact_neg,event_impact_gt_mu_add_sigma,event_impact_lt_mu_sub_sigma,event_impact_gt_mu_pos_add_sigma_pos,event_impact_lt_mu_neg_sub_sigma_neg,event_impact_gt_mu_pos_add_2sigma_pos,event_impact_lt_mu_neg_sub_2sigma_neg,event_impact_gt_1pct_pos,event_impact_lt_1pct_neg,overall_source_timeliness_score,overall_source_republish_score,overall_author_republish_score,overall_author_timeliness_score,harvested_at
5a431613914455535dd2f7e3,59a8c0cb631131661f7cca72,629355b4-5ef4-45ab-872a-9fd1809ca49d,f,552,hW9kPL6pMO0suREcwfIObD7SjwY=,A/mu8VqEc7agwge3hbjXgdFfDeA=,news,broadcastengineering.com,,11548,308325207322,high,31.2,33.35,31.2,36.9,38.9,Sony Corp Ord,SNE,NYSE,100,N/A,[],Consumer Electronics/Appliances,N/A,Consumer Non-Durables,"[""MSFT"",""IBM"",""INTC"",""4331"",""NOKIA"",""HPQ"",""DIS"",""MSI"",""AAPL"",""PHG""]",US_EQUITY,BBG000BT7ZK6,UN,BBG000BT81N7,Equity,SNE,ADR,BBG001S5W6H8,EQ0010136500009000,,0,70.2,0.9566,40.4852,Employment Actions – General,Employment Actions,50.8,,,0,32.5118,,,,,,,,,,,,,53.7511,0.53,0,64.2908,2013-08-01T00:00:03.000Z
Could you tell me how to do this?
Answer
1) 'The order is same to my table but not same to csv file.' If the column order were the same as the CSV, you would not need a column list at all and could simply use:
load data local infile 'mycsv.csv' into table mytable fields terminated by ',' enclosed by '"' lines terminated by '\n'
2) If you still want to skip some of the CSV columns, from the MySQL docs:
You can also discard an input value by assigning it to a user variable and not assigning the variable to a table column:
LOAD DATA INFILE 'file.txt' INTO TABLE t1 (column1, @dummy, column2, @dummy, column3);
3) To skip the first (header) row, use IGNORE 1 LINES.
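Putting the three points together for this particular file, here is a sketch of what the full command could look like. It assumes your table is news_dataset with the twelve columns you listed, maps the 66 header fields shown above in their CSV order, and routes every unwanted field to @dummy; check that the placeholders line up with your actual header before running it. Note that the line terminator is written as '\n' (with a backslash), not a bare 'n'.

LOAD DATA LOCAL INFILE '2013first1000.csv'
INTO TABLE news_dataset
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
-- one entry per CSV field, in the CSV's order; @dummy discards the value
(@dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy,       -- fields 1-7
 story_type, story_source,                                     -- 8-9
 @dummy, @dummy, @dummy, @dummy,                               -- 10-13
 entity_sentiment, event_sentiment,                            -- 14-15
 @dummy, @dummy, @dummy, @dummy,                               -- 16-19
 entity_ticker, @dummy, entity_relevance, @dummy, @dummy,      -- 20-24
 entity_industry, @dummy, entity_sector, entity_competitors,   -- 25-28
 @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy,       -- 29-35
 @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy,       -- 36-42
 event, @dummy, event_relevance,                               -- 43-45
 @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy,       -- 46-52
 @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy,       -- 53-59
 @dummy, @dummy, @dummy, @dummy, @dummy, @dummy,               -- 60-65
 harvested_at);                                                -- 66

Because LOAD DATA matches the named entries to table columns by name, it does not matter that the CSV order differs from your table's column order; only the @dummy/column positions must follow the file.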