Get File Object used by a CSV Reader/Writer Object

Is there any way to access the file object used by a CSV writer/reader object after it has been instantiated? I opened up the csv module, and it appears its contents are built in (implemented in C). I also tried setting the file object as an attribute, but I get the following error:

AttributeError: '_csv.writer' object has no attribute 'fileobj'
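Here is a minimal reproduction of the lookup that fails (the file name is just a placeholder):

import csv

f = open('test.csv', 'w', newline='')
w = csv.writer(f)
w.fileobj   # AttributeError: '_csv.writer' object has no attribute 'fileobj'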


Asked by: Grace616 | Posted: 27-01-2022






Answer 1

csv.writer is a "builtin" function: it is implemented in compiled C code rather than Python, so the internal state of the objects it returns can't be accessed from Python code.

That said, I'm not sure why you would need to inspect the csv.writer object to recover the file object. You supply that object yourself when creating the writer:

w = csv.writer(fileobj, dialect, ...)

So if you need to access that object later, just save it in another variable.
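A minimal sketch of that approach (the file name is just illustrative): keep your own reference to the file object alongside the writer.

import csv

f = open('output.csv', 'w', newline='')
w = csv.writer(f)

w.writerow(['a', 'b', 'c'])
f.flush()   # the underlying file object is still directly accessible
f.close()

If the two always travel together, bundling them in a small tuple, dict, or dataclass keeps the pairing explicit.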

Answered by: Aida647 | Posted: 28-02-2022



Answer 2

From what I can tell, there is no straightforward way to get the file object back out once you hand it to a csv object. My approach would probably be to subclass the csv writers and readers you're using so they can carry that data around with them. Of course, this assumes you can directly access the types the factory functions produce (among other things).
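The C types returned by csv.reader and csv.writer may not be subclassable in practice, so a thin delegating wrapper is a safer sketch of the same idea (the class name is made up for illustration):

import csv

class WriterWithFile:
    """Carries a csv writer together with the file object it writes to."""
    def __init__(self, fileobj, dialect='excel', **fmtparams):
        self.fileobj = fileobj
        self._writer = csv.writer(fileobj, dialect, **fmtparams)

    def __getattr__(self, name):
        # Delegate writerow(), writerows(), dialect, etc. to the real writer
        return getattr(self._writer, name)

with open('output.csv', 'w', newline='') as f:
    w = WriterWithFile(f)
    w.writerow(['x', 'y'])
    print(w.fileobj is f)   # True

The same pattern works for csv.reader.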

Answered by: Maria312 | Posted: 28-02-2022



Similar questions

mysql - for loop in for loop, python, csv reader/writer

I am new in python, but after 3 days of reading and finding solution without success i am lost. I've got mysql table with data (id, user_id...). I connect to db, read user_id and save data into array "user". Then i open csv file with a lot of rows and columns (user_id, name, mail, telephone, address...). In the next step i compare if user_id from db matches with user_id in csv file. If answer is yes then i write t...


c++ - sqlite: check reader/writer lock

I have 2 processes which both access an sqlite3 database. While reading is not a problem in sqlite, only one process can write to the database. According to the faq: http://www.sqlite.org/faq.html#q5 sqlite uses reader/writer locks. How do i check if the database is locked for writing by another process, both from python and c++? [edit]...


python background parallel file reader/writer

I have what I think is a fairly simple problem, but I can't seem to get it to work as I'd hoped. I have ~200 ~200 MB files, enough that I can't load them all into memory at once. Each file needs to be processed with each other file once (so ~20000 operations of the processing function). I need to loop through files multiple times, in something like: for i in xrange(len(file_list)-1): #go through each fil...


Python CSV reader/writer handling quotes: How can I wrap row fields in quotes? (Getting triple quotes as output)

I have a problem with the csv reader and writer in python. Whenever I try to take one CSV file and par down the number of columns from roughly 37 to 6, this is the kind of output I am getting. Example of one row: 0,"JOHNSON, JOHN J.",JOHN J. JOHNSON,TECH879,INSPECTION TECHNICIAN,MOTOR VEHICLE INSPECTION UNIT Notice the second field is quoted. This is what I see when I view it in a text editor. I want all...


python - Using Pandas vs. CSV reader/writer to process and save large CSV file

I'm fairly new to python and pandas but trying to get better with it for parsing and processing large data files. I'm currently working on a project that requires me to parse a few dozen large CSV CAN files at the time. The files have 9 columns of interest (1 ID and 7 data fields), have about 1-2 million rows, and are encoded in hex. A sample bit of data looks like this: id Flags DLC Data0 ...


python - How to create asyncio stream reader/writer for stdin/stdout?

I need to write two programs which will be run as a parent process and its child. The parent process spawns the child and then they communicate via pair of pipes connected to child's stdin and stdout. The communication is peer-to-peer, that's why I need asyncio. A simple read/reply loop won't do. I have written the parent. No problem because asyncio provides everything I needed in create_subproce...


python - Most memory efficient reader/writer in pandas

When I want to call pandas.read..function, what is the most memory efficient reader/writer, which would save the most memory on my machine? I want to read a large datafile and wonder if I can pick a format upfront to save the data with some other tool to save some memory consumption in python. Is it: pd.read_csv, pd.read_hdf, ..?





