Fastest full featured PostgreSQL client for Node.js

nodeplusplus/postgres

 
 


Postgres.js

  • 🚀 Fastest full featured PostgreSQL client for Node.js
  • 🚯 1250 LOC - 0 dependencies
  • 🏷 ES6 Tagged Template Strings at the core
  • 🏄‍♀️ Simple surface API

Getting started


Good UX with Postgres.js


Install

```sh
$ npm install postgres
```

Use

```js
const postgres = require('postgres')

const sql = postgres({ ...options }) // will default to the same as psql

await sql`
  select name, age from users
`
// > [{ name: 'Murray', age: 68 }, { name: 'Walter', age: 78 }]
```

Connection options - postgres([url], [options])

You can use either a postgres:// URL connection string or the options object to define your database connection properties.

```js
const sql = postgres('postgres://username:password@host:port/database', {
  host        : '',    // Postgres ip address or domain name
  port        : 5432,  // Postgres server port
  path        : '',    // unix socket path (usually '/tmp')
  database    : '',    // Name of database to connect to
  username    : '',    // Username of database user
  password    : '',    // Password of database user
  ssl         : false, // True, or options for tls.connect
  max         : 10,    // Max number of connections
  timeout     : 0,     // Idle connection timeout in seconds
  types       : [],    // Array of custom types, see more below
  onnotice    : fn,    // Defaults to console.log
  onparameter : fn,    // (key, value) when server param change
  debug       : fn,    // Is called with (connection, query, parameters)
  transform   : {
    column    : fn,    // Transforms incoming column names
    value     : fn,    // Transforms incoming row values
    row       : fn     // Transforms entire rows
  },
  connection  : {
    application_name : 'postgres.js', // Default application_name
    ...                               // Other connection parameters
  }
})
```

More info for ssl can be found in the Node.js docs for tls connect options.
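As an example of the transform.column option, a helper can map snake_case column names to camelCase on every result. This is a minimal sketch; toCamel is a hypothetical helper, not part of the library:

```javascript
// Hypothetical helper: convert snake_case column names to camelCase.
const toCamel = name => name.replace(/_([a-z])/g, (_, c) => c.toUpperCase())

// Passed via the connection options sketched above:
// const sql = postgres({ transform: { column: toCamel } })

toCamel('created_at') // 'createdAt'
```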

Query - sql` ` -> Promise

A query will always return a Promise which resolves to either an array [...] or null depending on the type of query. Destructuring is great to immediately access the first element.

```js
const [new_user] = await sql`
  insert into users (
    name, age
  ) values (
    'Murray', 68
  )
  returning *
`
// new_user = { user_id: 1, name: 'Murray', age: 68 }
```

Query parameters

Parameters are automatically inferred and handled by Postgres so that SQL injection isn't possible. No special handling is necessary; simply use JS tagged template literals as usual.

```js
let search = 'Mur'

const users = await sql`
  select
    name,
    age
  from users
  where
    name like ${ search + '%' }
`
// users = [{ name: 'Murray', age: 68 }]
```

Stream - sql` `.stream(fn) -> Promise

If you want to handle rows returned by a query one by one, you can use .stream, which returns a promise that resolves once there are no more rows.

```js
await sql`
  select created_at, name from events
`.stream(row => {
  // row = { created_at: '2019-11-22T14:22:00Z', name: 'connected' }
})
// No more rows
```

Listen and notify

When you call listen, a dedicated connection will automatically be made to ensure that you receive notifications in realtime. This connection will be used for any further calls to listen.

```js
sql.listen('news', payload => {
  const json = JSON.parse(payload)
  console.log(json.this) // logs 'is'
})
```

Notify can be done as usual in SQL, or by using the sql.notify method.

```js
sql.notify('news', JSON.stringify({ no: 'this', is: 'news' }))
```

Dynamic query helpers - sql() inside tagged template

Postgres.js has a safe, ergonomic way to aid you in writing queries. This makes it easier to write dynamic inserts, selects, updates and where queries.

Insert

```js
const user = {
  name: 'Murray',
  age: 68
}

sql`
  insert into users ${ sql(user) }
`
```

Is translated into a safe query like this:

```sql
insert into users (name, age) values ($1, $2)
```

Multiple inserts in one query

If you need to insert multiple rows at the same time, it's also much faster to do it with a single insert. Simply pass an array of objects to sql().

```js
const users = [
  { name: 'Murray', age: 68, garbage: 'ignore' },
  { name: 'Walter', age: 78 }
]

sql`
  insert into users ${ sql(users, 'name', 'age') }
`
```

Arrays - sql.array(Array)

Postgres has a native array type which is similar to js arrays, but Postgres only allows the same type and shape for nested items. This method automatically infers the item type and translates js arrays into Postgres arrays.

```js
const types = sql`
  insert into types (
    integers,
    strings,
    dates,
    buffers,
    multi
  ) values (
    ${ sql.array([1, 2, 3, 4, 5]) },
    ${ sql.array(['Hello', 'Postgres']) },
    ${ sql.array([new Date(), new Date(), new Date()]) },
    ${ sql.array([Buffer.from('Hello'), Buffer.from('Postgres')]) },
    ${ sql.array([[[1, 2], [3, 4]], [[5, 6], [7, 8]]]) }
  )
`
```

JSON - sql.json(object)

```js
const body = { hello: 'postgres' }

const [{ body: json }] = await sql`
  insert into json (
    body
  ) values (
    ${ sql.json(body) }
  )
  returning body
`
// json = { hello: 'postgres' }
```

File query - sql.file(path, [args], [options]) -> Promise

Use an .sql file for a query. The contents will be cached in memory so that the file is only read once.

```js
const path = require('path')

sql.file(path.join(__dirname, 'query.sql'), [], {
  cache: true // Default true - disable for single shot queries or memory reasons
})
```

Transactions

BEGIN / COMMIT - sql.begin(fn) -> Promise

Calling begin with a function will return a Promise which resolves with the returned value from the function. The function provides a single argument which is sql with a context of the newly created transaction. BEGIN is automatically called, and if the Promise fails ROLLBACK will be called. If it succeeds, COMMIT will be called.

```js
const [user, account] = await sql.begin(async sql => {
  const [user] = await sql`
    insert into users (
      name
    ) values (
      'Alice'
    )
    returning *
  `

  const [account] = await sql`
    insert into accounts (
      user_id
    ) values (
      ${ user.user_id }
    )
    returning *
  `

  return [user, account]
})
```

SAVEPOINT - sql.savepoint([name], fn) -> Promise

```js
sql.begin(async sql => {
  const [user] = await sql`
    insert into users (
      name
    ) values (
      'Alice'
    )
    returning *
  `

  const [account] = (await sql.savepoint(sql => sql`
    insert into accounts (
      user_id
    ) values (
      ${ user.user_id }
    )
    returning *
  `).catch(err => {
    // Account could not be created. ROLLBACK SAVEPOINT is called because we caught the rejection.
  })) || []

  return [user, account]
})
.then(([user, account]) => {
  // great success - COMMIT succeeded
})
.catch(() => {
  // not so good - ROLLBACK was called
})
```

Do note that you can often achieve the same result using WITH queries (Common Table Expressions) instead of using transactions.
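For instance, assuming the users and accounts tables from the transaction example above, the two inserts could be combined into one statement; a sketch:

```sql
with new_user as (
  insert into users (name) values ('Alice') returning user_id
)
insert into accounts (user_id)
select user_id from new_user
returning *
```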

Types

You can add ergonomic support for custom types, or simply pass an object with a { type, value } signature that contains the Postgres oid for the type and the correctly serialized value.

Adding Query helpers is the recommended approach which can be done like this:

```js
const sql = postgres({
  types: {
    rect: {
      to        : 1337,
      from      : [1337],
      serialize : ({ x, y, width, height }) => [x, y, width, height],
      parse     : ([x, y, width, height]) => ({ x, y, width, height })
    }
  }
})

const [custom] = await sql`
  insert into rectangles (
    name,
    rect
  ) values (
    'wat',
    ${ sql.rect({ x: 13, y: 37, width: 42, height: 80 }) }
  )
  returning *
`
// custom = { name: 'wat', rect: { x: 13, y: 37, width: 42, height: 80 } }
```

Teardown / Cleanup

To ensure proper teardown and cleanup on server restarts, use sql.end({ timeout: null }) before process.exit().

Callingsql.end() will reject new queries and return a Promise which resolves when all queries are finished and the underlying connections are closed. If a timeout is provided any pending queries will be rejected once the timeout is reached and the connections will be destroyed.

Sample shutdown using Prexit:

```js
import prexit from 'prexit'

prexit(async () => {
  await sql.end({ timeout: 5 })
  await new Promise(r => server.close(r))
})
```

Unsafe queries - sql.unsafe(query, [args], [options]) -> Promise

If you know what you're doing, you can use unsafe to pass any string you'd like to Postgres.

```js
sql.unsafe(danger + `
  select * from users where id = $1
`, [user_id])
```

Errors

Errors are all thrown to related queries and never globally. Errors coming from Postgres itself are always in the native Postgres format, and the same goes for any Node.js errors, e.g. coming from the underlying connection.

There are also the following errors specifically for this library.

MESSAGE_NOT_SUPPORTED

X (X) is not supported

Whenever a message is received from Postgres which is not supported by this library. Feel free to file an issue if you think something is missing.

MAX_PARAMETERS_EXCEEDED

Max number of parameters (65534) exceeded

The Postgres protocol doesn't allow more than 65534 (16-bit) parameters. If you run into this issue there are various workarounds, such as using sql([...]) to escape values instead of passing them as parameters.
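Another workaround is to split a bulk insert into several statements, each using at most floor(65534 / columns) rows. A sketch of the arithmetic; chunkRows is a hypothetical helper, not a library API:

```javascript
// Hypothetical helper: split rows so each insert stays under the
// 65534-parameter protocol limit (each row consumes `columns` parameters).
const MAX_PARAMETERS = 65534

function chunkRows (rows, columns) {
  const perChunk = Math.floor(MAX_PARAMETERS / columns)
  const chunks = []
  for (let i = 0; i < rows.length; i += perChunk)
    chunks.push(rows.slice(i, i + perChunk))
  return chunks
}

// 100000 rows of 3 columns would need 300000 parameters in one statement;
// chunked, each insert uses at most 21844 * 3 = 65532 parameters.
const chunks = chunkRows(new Array(100000).fill({}), 3)
```

Each chunk can then be passed to sql(chunk, ...columns) in its own insert.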

SASL_SIGNATURE_MISMATCH

Message type X not supported

When using SASL authentication, the server responds with a signature at the end of the authentication flow which needs to match the one on the client. This is to avoid man-in-the-middle attacks. If you receive this error the connection was canceled because the server did not reply with the expected signature.

NOT_TAGGED_CALL

Query not called as a tagged template literal

Making queries has to be done using the sql function as a tagged template. This is to ensure parameters are serialized and passed to Postgres as query parameters with correct types and to avoid SQL injection.
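To see why the tagged call matters: a tagged template hands the function the literal strings and the interpolated values as separate arguments, so values never get concatenated into the SQL text. A minimal sketch of the mechanism, not the library's actual implementation:

```javascript
// Minimal sketch: a tag receives literals and values separately,
// so values can be bound as parameters instead of spliced into the text.
function tag (strings, ...values) {
  const text = strings.reduce((query, s, i) => query + '$' + i + s)
  return { text, parameters: values }
}

const id = 42
const q = tag`select * from users where id = ${id}`
// q.text = 'select * from users where id = $1'
// q.parameters = [42]
```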

AUTH_TYPE_NOT_IMPLEMENTED

Auth type X not implemented

Postgres supports many different authentication types. This one is not supported.

CONNECTION_CLOSED

write CONNECTION_CLOSED host:port

This error is thrown if the connection was closed without an error. This should not happen during normal operation, so please create an issue if this was unexpected.
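Because this error carries a distinct code on the rejected query, a caller can choose to retry. A hedged sketch of such a wrapper; retry is hypothetical application code, not part of the library:

```javascript
// Hypothetical wrapper: retry a query a few times when it fails with a
// retryable code such as CONNECTION_CLOSED.
async function retry (fn, { retries = 3, codes = ['CONNECTION_CLOSED'] } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn()
    } catch (err) {
      if (attempt >= retries || !codes.includes(err.code))
        throw err
    }
  }
}

// Usage sketch: await retry(() => sql`select 1`)
```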

CONNECTION_ENDED

write CONNECTION_ENDED host:port

This error is thrown if the user has called sql.end() and performed a query afterwards.

CONNECTION_DESTROYED

write CONNECTION_DESTROYED host:port

This error is thrown for any queries that were pending when the timeout to sql.end({ timeout: X }) was reached.

Thank you

A really big thank you to @JAForbes who introduced me to Postgres and still holds my hand navigating all the great opportunities we have.

Thanks to @ACXgit for initial tests and dogfooding.
