Description
A modern, feature-rich and highly tunable Node.js client library for Apache Cassandra and DataStax Enterprise using exclusively Cassandra's binary protocol and Cassandra Query Language v3.
DataStax Node.js Driver for Apache Cassandra alternatives and similar modules
Based on the "Database" category.
Alternatively, view DataStax Node.js Driver for Apache Cassandra alternatives based on common mentions on social networks and blogs.
- SheetJS js-xlsx: SheetJS Community Edition -- Spreadsheet Data Toolkit
- Sequelize: An easy-to-use and promise-based multi SQL dialects ORM tool for Node.js | Postgres, MySQL, MariaDB, SQLite, MSSQL, Snowflake & DB2
- TypeORM: ORM for TypeScript and JavaScript (ES7, ES6, ES5). Supports MySQL, PostgreSQL, MariaDB, SQLite, MS SQL Server, Oracle, SAP Hana and WebSQL databases. Works in Node.js, browser, Ionic, Cordova and Electron platforms.
- Mongoose: MongoDB object modeling designed to work in an asynchronous environment
- Prisma: Next-generation ORM for Node.js & TypeScript | PostgreSQL, MySQL, MariaDB, SQL Server, SQLite, MongoDB and CockroachDB (Preview)
- MySQL: A pure Node.js JavaScript client implementing the MySQL protocol
- Lowdb: Simple to use local JSON database. Powered by plain JavaScript (supports Node, Electron and the browser)
- RxDB: 🔄 A client-side, offline-first, reactive database for JavaScript applications
- Knex: A query builder for PostgreSQL, MySQL, CockroachDB, SQL Server, SQLite3 and Oracle, designed to be flexible, portable, and fun to use
- NeDB: The JavaScript database, for Node.js, nw.js, Electron and the browser
- MongoDB: The official MongoDB Node.js driver
- PostgreSQL: PostgreSQL client for Node.js
- Redis: 🚀 A robust, performance-focused, and full-featured Redis client for Node.js
- Objection.js: An SQL-friendly ORM for Node.js
- Bookshelf: A simple Node.js ORM for PostgreSQL, MySQL and SQLite3, built on top of Knex.js
- Waterline: An adapter-based ORM for Node.js with support for MySQL, MongoDB, PostgreSQL, MSSQL (SQL Server), and more
- MikroORM: TypeScript ORM for Node.js based on the Data Mapper, Unit of Work and Identity Map patterns. Supports MongoDB, MySQL, MariaDB, PostgreSQL and SQLite databases.
- LevelUP: A wrapper for abstract-leveldown compliant stores, for Node.js and browsers
- pg-promise: PostgreSQL interface for Node.js
- slonik: A PostgreSQL client with strict types, detailed logging and assertions
- node-mssql: Microsoft SQL Server client for Node.js
- Massive: PostgreSQL data access tool
- Keyv: Simple key-value storage with support for multiple backends
- Mongorito: 🍹 MongoDB ODM for Node.js apps, based on Redux
- Couchbase: Couchbase Node.js client library (official)
- couchdb-nano: Nano, the official Apache CouchDB library for Node.js
- pg-mem: An in-memory Postgres DB instance for your unit tests
- Iridium: A high-performance MongoDB ORM for Node.js
- OpenRecord: Make ORMs great again!
- Mongo Seeding: The ultimate solution for populating your MongoDB database
- Bluzelle Decentralized DB: A decentralized NoSQL database
- Aerospike: Node.js client for the Aerospike database
- @databases: TypeScript clients for databases that prevent SQL injection
- Finale: Create flexible REST endpoints and controllers from Sequelize models in your Express app
- firenze: Adapter-based JavaScript ORM for Node.js and the browser
- Sqlmancer: Conjure SQL from GraphQL queries 🧙🔮✨
- uuid-mongodb: 📇 Generates and parses MongoDB BSON UUIDs
- RediBox Core: Modular Redis connection and pub/sub subscription manager for Node. Easily extendable; built for performance, powered by ioredis.
- database-js: Common database interface for Node
- Node Postgres Extras: Node.js PostgreSQL database performance insights: locks, index usage, buffer cache hit ratios, vacuum stats and more
- @sugoi/orm: TypeScript-based SugoiJS ORM module; a simple solution for object handling with a predefined lifecycle
- @sugoi/mongodb: SugoiJS MongoDB module, based on the ORM module
README
DataStax Node.js Driver for Apache Cassandra®
A modern, feature-rich and highly tunable Node.js client library for Apache Cassandra and DSE using exclusively Cassandra's binary protocol and Cassandra Query Language.
Installation
```shell
npm install cassandra-driver
```
Features
- Simple, Prepared, and Batch statements
- Asynchronous IO, parallel execution, request pipelining
- Connection pooling
- Auto node discovery
- Automatic reconnection
- Configurable load balancing and retry policies
- Works with any cluster size
- Built-in object mapper
- Both promise and callback-based API
- Row streaming and pipes
- Built-in TypeScript support
Documentation
Getting Help
You can use the project mailing list or create a ticket on the Jira issue tracker.
Basic usage
```javascript
const cassandra = require('cassandra-driver');
const client = new cassandra.Client({
  contactPoints: ['h1', 'h2'],
  localDataCenter: 'datacenter1',
  keyspace: 'ks1'
});

const query = 'SELECT name, email FROM users WHERE key = ?';

client.execute(query, [ 'someone' ])
  .then(result => console.log('User with email %s', result.rows[0].email));
```
The driver supports both promises and callbacks for the asynchronous methods, so you can choose the approach that suits your needs. Note that, to keep the code examples in this documentation concise, we use the promise-based API of the driver along with the `await` keyword.
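To illustrate how a single method can serve both styles, here is a hypothetical sketch of the dual promise/callback pattern. It is not the driver's implementation; `execute` and its canned result are stand-ins for illustration only:

```javascript
// Hypothetical sketch of a dual promise/callback API.
// If the caller passes a callback, it is invoked with (err, result);
// otherwise a promise is returned.
function execute(query, params, callback) {
  // Stand-in for the real work; resolves with a result-like object
  const work = Promise.resolve({ rows: [ { email: 'someone@example.com' } ] });
  if (typeof callback === 'function') {
    // Callback style: errors go to the first argument
    work.then(result => callback(null, result), err => callback(err));
    return;
  }
  // Promise style: the caller can use .then() or await
  return work;
}
```

Either calling convention reaches the same underlying operation, which is why the two APIs in the driver behave identically apart from how results are delivered.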
If you are using DataStax Astra you can configure your client by setting the secure bundle and the user credentials:
```javascript
const client = new cassandra.Client({
  cloud: { secureConnectBundle: 'path/to/secure-connect-DATABASE_NAME.zip' },
  credentials: { username: 'user_name', password: '[email protected]' }
});
```
Prepare your queries
Using prepared statements provides multiple benefits.
Prepared statements are parsed and prepared on the Cassandra nodes and are ready for future execution. Also, when preparing, the driver retrieves information about the parameter types which allows an accurate mapping between a JavaScript type and a Cassandra type.
The driver will prepare the query once on each host and execute the statement with the bound parameters.
```javascript
// Use query markers (?) and parameters
const query = 'UPDATE users SET birth = ? WHERE key = ?';
const params = [ new Date(1942, 10, 1), 'jimi-hendrix' ];

// Set the prepare flag in the query options
await client.execute(query, params, { prepare: true });
console.log('Row updated on the cluster');
```
Row streaming and pipes
When using the `#eachRow()` and `#stream()` methods, the driver parses each row as soon as it is received, yielding rows without buffering them.
```javascript
// Reducing a large result
let minTemperature = Infinity;
client.eachRow(
  'SELECT time, val FROM temperature WHERE station_id = ?',
  ['abc'],
  (n, row) => {
    // The callback is invoked per row, as soon as each one is received
    minTemperature = Math.min(row.val, minTemperature);
  },
  err => {
    // This function is invoked when all rows were consumed or an error occurred
  }
);
```
The `#stream()` method works in the same way, but instead of a callback it returns a Readable Streams2 object in `objectMode` that emits instances of `Row`. It can be piped downstream and provides automatic pause/resume logic (it buffers when not read).
```javascript
client.stream('SELECT time, val FROM temperature WHERE station_id = ?', [ 'abc' ])
  .on('readable', function () {
    // 'readable' is emitted as soon as a row is received and parsed
    let row;
    while ((row = this.read()) !== null) {
      console.log('time %s and value %s', row.time, row.val);
    }
  })
  .on('end', function () {
    // Stream ended; there aren't any more rows
  })
  .on('error', function (err) {
    // Something went wrong: err is a response error from Cassandra
  });
```
User defined types
User defined types (UDT) are represented as JavaScript objects.
For example, consider the following UDT and table:
```sql
CREATE TYPE address (
  street text,
  city text,
  state text,
  zip int,
  phones set<text>
);

CREATE TABLE users (
  name text PRIMARY KEY,
  email text,
  address frozen<address>
);
```
You can retrieve the user address details as a regular JavaScript object.
```javascript
const query = 'SELECT name, address FROM users WHERE key = ?';
const result = await client.execute(query, [ key ], { prepare: true });
const row = result.first();

const address = row.address;
console.log('User lives in %s, %s - %s', address.street, address.city, address.state);
```
Read more information about using UDTs with the Node.js Driver.
Paging
All driver methods use a default fetchSize
of 5000 rows, retrieving only first page of results up to a
maximum of 5000 rows to shield an application against accidentally retrieving large result sets in a single response.
stream()
method automatically fetches the following page once the current one was read. You can also use eachRow()
method to retrieve the following pages by using autoPage
flag. See [paging documentation for more
information][doc-paging].
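The page-by-page flow can be sketched without the driver at all. In this self-contained example, `fetchPage` is a hypothetical stand-in for an execute call that returns one page of rows plus an opaque token for the next page; it is not a driver API:

```javascript
// Sketch of paging: keep requesting pages until the server reports
// there is no next-page token left.
async function readAllRows(fetchPage) {
  const rows = [];
  let pageState = null;
  do {
    // fetchPage(pageState) resolves to { rows, nextPageState }
    const page = await fetchPage(pageState);
    rows.push(...page.rows);
    pageState = page.nextPageState; // null/undefined when no more pages
  } while (pageState);
  return rows;
}
```

The driver's `stream()` and `eachRow()` with `autoPage` internalize exactly this loop, so application code only sees a continuous sequence of rows.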
Batch multiple statements
You can execute multiple statements in a batch to update/insert several rows atomically even in different column families.
```javascript
const queries = [
  {
    query: 'UPDATE user_profiles SET email = ? WHERE key = ?',
    params: [ emailAddress, 'hendrix' ]
  },
  {
    query: 'INSERT INTO user_track (key, text, date) VALUES (?, ?, ?)',
    params: [ 'hendrix', 'Changed email', new Date() ]
  }
];

await client.batch(queries, { prepare: true });
console.log('Data updated on cluster');
```
Object Mapper
The driver provides a built-in object mapper that lets you interact with your data like you would interact with a set of documents.
Retrieving objects from the database:
```javascript
const videos = await videoMapper.find({ userId });
for (let video of videos) {
  console.log(video.name);
}
```
Updating an object from the database:
```javascript
await videoMapper.update({ id, userId, name, addedDate, description });
```
You can read more information about getting started with the Mapper in our documentation.
Data types
Only a few data types are defined in the ECMAScript specification, which is usually a problem when you are trying to deal with data types that come from other systems in JavaScript.
The driver supports all the CQL data types in Apache Cassandra (3.0 and below) even for types with no built-in JavaScript representation, like decimal, varint and bigint. Check the documentation on working with numerical values, uuids and collections.
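To see why dedicated representations are needed, here is a minimal, self-contained illustration of the underlying limitation. It uses only plain JavaScript (native `BigInt`), not the driver's own wrapper types:

```javascript
// JavaScript's Number is an IEEE-754 double: integers beyond
// Number.MAX_SAFE_INTEGER (2^53 - 1) silently lose precision,
// while a CQL bigint spans the full signed 64-bit range.
const big = 9007199254740993n;   // 2^53 + 1, representable as a BigInt
const lossy = Number(big);       // rounds to 9007199254740992
const exact = big + 1n;          // BigInt arithmetic stays exact
console.log(lossy === 9007199254740992); // true: precision was lost
```

This is why values such as bigint, varint and decimal cannot be safely round-tripped through plain Numbers, and why the driver provides specific types for them.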
Logging
Instances of `Client()` are `EventEmitter`s and emit `log` events:

```javascript
client.on('log', (level, loggerName, message, furtherInfo) => {
  console.log(`${level} - ${loggerName}: ${message}`);
});
```
The `level` being passed to the listener can be `verbose`, `info`, `warning` or `error`. Visit the logging documentation for more information.
Compatibility
- Apache Cassandra versions 2.1 and above.
- DataStax Enterprise versions 4.8 and above.
- Node.js versions 8 and above.
Note: DataStax products do not support big-endian systems.
Credits
This driver is based on the original work of Jorge Bay on node-cassandra-cql and adds a series of advanced features that are common across all other DataStax drivers for Apache Cassandra.
The development effort to provide an up-to-date, high-performance, fully featured Node.js driver for Apache Cassandra will continue on this project, while node-cassandra-cql will be discontinued.
License
© DataStax, Inc.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.