How to append a new row to an existing CSV file?
For writing a CSV file, Python's csv module provides two classes: writer and DictWriter. Here we will discuss two ways to perform this task: appending a list as a new row to an existing CSV file with writer, and appending a dictionary as a new row with DictWriter.
First, let’s have a look at our existing CSV file contents.
event.csv
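The screenshot of the original event.csv contents did not survive extraction. As a hedged sketch, a comparable file can be created with the column names used later in this article (ID, NAME, RANK, ARTICLE, COUNTRY); the sample rows below are illustrative, not the article's original data:

```python
import csv

header = ['ID', 'NAME', 'RANK', 'ARTICLE', 'COUNTRY']

# Illustrative rows; the article's original event.csv data is not shown here.
rows = [
    [1, 'Alice', 101, 4, 'India'],
    [2, 'Bob', 202, 3, 'USA'],
]

# newline='' prevents blank lines between rows on Windows.
with open('event.csv', 'w', newline='') as f:
    w = csv.writer(f)
    w.writerow(header)
    w.writerows(rows)
```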
Append a new row to the existing CSV file using writer
Let's see how to use the writer class to append a list as a new row to an existing CSV file.
- Open your existing CSV file in append mode to get a file object.
- Pass this file object to csv.writer() to get a writer object.
- Pass the list as an argument to the writer object's writerow() function; it adds the list as a new row in the CSV file.
- Close the file object (a with statement does this automatically).
Python3
from csv import writer

# The new row to append, matching the file's column order.
new_row = [6, 'William', 5532, 1, 'UAE']

# Open in append mode; newline='' avoids blank lines on Windows.
with open('event.csv', 'a', newline='') as f_object:
    writer_object = writer(f_object)
    writer_object.writerow(new_row)
    # The with statement closes the file automatically.
Output:
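The output screenshot is not reproduced here. As a self-contained check, the sketch below first creates a minimal event.csv containing only a header row (an illustrative setup, not the article's original file), appends the same list, and reads the file back to confirm the appended list is the last row:

```python
import csv

# Create a minimal event.csv with just a header row (illustrative setup).
with open('event.csv', 'w', newline='') as f:
    csv.writer(f).writerow(['ID', 'NAME', 'RANK', 'ARTICLE', 'COUNTRY'])

# Append the same list used above.
with open('event.csv', 'a', newline='') as f:
    csv.writer(f).writerow([6, 'William', 5532, 1, 'UAE'])

# Read the file back; csv.reader returns every field as a string.
with open('event.csv', newline='') as f:
    rows = list(csv.reader(f))
```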
Append a new row to the existing CSV file using DictWriter
Let's see how to use the DictWriter class to append a dictionary as a new row to an existing CSV file.
- Open your CSV file in append mode to get a file object.
- Pass the file object and a list of column names to DictWriter() to get a DictWriter object.
- Pass the dictionary as an argument to DictWriter's writerow() function; it adds a new row to the CSV file.
- Close the file object (a with statement does this automatically).
Python3
from csv import DictWriter

# Column names must match the header of the existing CSV file.
field_names = ['ID', 'NAME', 'RANK', 'ARTICLE', 'COUNTRY']

# The row to append; avoid naming this 'dict', which shadows the built-in.
new_row = {'ID': 6, 'NAME': 'William', 'RANK': 5532,
           'ARTICLE': 1, 'COUNTRY': 'UAE'}

# Open in append mode; newline='' avoids blank lines on Windows.
with open('event.csv', 'a', newline='') as f_object:
    dictwriter_object = DictWriter(f_object, fieldnames=field_names)
    dictwriter_object.writerow(new_row)
    # The with statement closes the file automatically.
Output:
CSV file after Appending Dictionary
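When the target file does not exist yet, DictWriter can also write the header row first via writeheader() before appending data rows. A self-contained sketch (the filename event_new.csv is hypothetical), which also reads the row back with DictReader to verify it:

```python
from csv import DictWriter, DictReader

field_names = ['ID', 'NAME', 'RANK', 'ARTICLE', 'COUNTRY']
row = {'ID': 6, 'NAME': 'William', 'RANK': 5532,
       'ARTICLE': 1, 'COUNTRY': 'UAE'}

# Write the header once, then the dictionary as a data row.
with open('event_new.csv', 'w', newline='') as f:
    w = DictWriter(f, fieldnames=field_names)
    w.writeheader()
    w.writerow(row)

# Read it back: DictReader maps each row to the header names,
# with every value returned as a string.
with open('event_new.csv', newline='') as f:
    back = list(DictReader(f))
```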
Last Updated: 13 Oct, 2022