Compressing your data files - Amazon Redshift
Documentation > Amazon Redshift > Database Developer Guide


Compressing your data files

When you have large load files, we recommend that you compress them using gzip, lzop, bzip2, or Zstandard, and split the data into multiple smaller files.
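The split-and-compress step might look like the following shell sketch. The file name venue.txt, the four-way split, and the part-name prefix are illustrative assumptions; GNU coreutils split is assumed for the -n l/4 option, which splits by line without breaking records.

```shell
# Create a small sample pipe-delimited file (an illustrative stand-in
# for a large load file).
printf 'a|1\nb|2\nc|3\nd|4\n' > venue.txt

# Split into 4 pieces without splitting any line across files
# (GNU coreutils split).
split -n l/4 venue.txt venue.txt.part_

# Compress each piece; this produces venue.txt.part_aa.gz, ..._ab.gz, etc.
# COPY can then load all parts that share the common key prefix.
gzip venue.txt.part_*
```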

Specify the GZIP, LZOP, BZIP2, or ZSTD option with the COPY command. This example loads the TIME table from a pipe-delimited lzop file.

copy time
from 's3://amzn-s3-demo-bucket/data/timerows.lzo'
iam_role 'arn:aws:iam::0123456789012:role/MyRedshiftRole'
lzop
delimiter '|';

There are instances when you don't have to split uncompressed data files. For more information about splitting your data and examples of using COPY to load data, see Loading data from Amazon S3.
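After splitting and compressing a load file, a single COPY can load all of the parts at once by giving the common Amazon S3 key prefix. The following is a sketch under the assumption that gzip-compressed parts named venue.txt.part_aa.gz, venue.txt.part_ab.gz, and so on were uploaded under the data/ prefix; the bucket name and IAM role ARN follow the example above.

```
copy venue
from 's3://amzn-s3-demo-bucket/data/venue.txt.part_'
iam_role 'arn:aws:iam::0123456789012:role/MyRedshiftRole'
gzip
delimiter '|';
```

Because the FROM clause names a key prefix rather than a single object, COPY loads every matching object in parallel, which is the main benefit of splitting large files before loading.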

