pyarrow.csv.CSVWriter

class pyarrow.csv.CSVWriter(sink, Schema schema, WriteOptions write_options=None, *, MemoryPool memory_pool=None)

Bases: _CRecordBatchWriter

Writer to create a CSV file.

Parameters:
sink : str, path, pyarrow.OutputStream or file-like object

The location where to write the CSV data.

schema : pyarrow.Schema

The schema of the data to be written.

write_options : pyarrow.csv.WriteOptions

Options to configure writing the CSV data.

memory_pool : MemoryPool, optional

Pool for temporary allocations.

__init__(*args, **kwargs)

Methods

__init__(*args, **kwargs)

close(self)

Close stream and write end-of-stream 0 marker.

write(self, table_or_batch)

Write RecordBatch or Table to stream.

write_batch(self, RecordBatch batch[, ...])

Write RecordBatch to stream.

write_table(self, Table table[, max_chunksize])

Write Table to stream in (contiguous) RecordBatch objects.

Attributes

stats

Current IPC write statistics.

close(self)

Close stream and write end-of-stream 0 marker.

stats

Current IPC write statistics.

write(self, table_or_batch)

Write RecordBatch or Table to stream.

Parameters:
table_or_batch : {RecordBatch, Table}

write_batch(self, RecordBatch batch, custom_metadata=None)

Write RecordBatch to stream.

Parameters:
batch : RecordBatch
custom_metadata : mapping or KeyValueMetadata

Keys and values must be string-like / coercible to bytes

write_table(self, Table table, max_chunksize=None)

Write Table to stream in (contiguous) RecordBatch objects.

Parameters:
tableTable
max_chunksize : int, default None

Maximum number of rows for RecordBatch chunks. Individual chunks may be smaller depending on the chunk layout of individual columns.