Description
Tiny but powerful goodies for Continuation-Passing-Style (CPS) functions with functional composability backed by category theory foundations.
Functions are the most basic and powerful concept. A whole program can be written as a function, taking input data and producing output. However, viewing a function's return value as its only output is often too limiting. For instance, all asynchronous Node API methods deliver their output data via callbacks rather than via return values. This pattern is, of course, the well-known Continuation-Passing Style (CPS).
The famous article by John Backus, "Can Programming Be Liberated from the von Neumann Style? A Functional Style and Its Algebra of Programs", advocated "reducing the code obesity" by building generic hierarchical ways of composing entire programs. The present proposal attempts to provide one way in which such composability can be achieved.
See https://github.com/dmitriz/cpsfy/blob/master/DOCUMENTATION.md for details.
cpsfy alternatives and similar modules
Based on the "Control Flow" category.
Neo-Async: intended as a drop-in replacement for Async; it almost fully covers its functionality and runs faster.
observable-to-promise: Awesome Observable related stuff. An Observable is a collection that arrives over time.
js-csp: CSP channels for JavaScript (like Clojurescript's core.async, or Go). This is an upstream fork.
async-chainable: an extension to Async adding better handling of mixed series/parallel tasks via object chaining.
Watchlight: a lightweight, comprehensive, reactive framework for business logic and for when things change.
Simple-Series-Parallel: a minimalist utility module for running async functions in series or parallel.
README
[ASCII-art "cpsfy" logo]
(Generated with http://patorjk.com/software/taag)
cpsfy
Tiny but powerful goodies for Continuation-Passing-Style (CPS) functions with functional composability backed by category theory foundations.
npm install cpsfy
(Or pnpm install cpsfy to save disk space.)
No-dependency policy. For maximum security, this package is intended to be kept minimal and transparent, with no dependencies, ever.
Quick demo
We want to read the content of name.txt into string str and remove spaces from both ends of str. If the resulting str is nonempty, we read the content of the file with that name into string content, otherwise we do nothing. Finally, we split the content string into an array of lines. If there are any errors along the way, we want to handle them at the very end in a separate function, without any change to our main code.
// function returning a CPS function with 2 callbacks
const readFileCps = file => (onRes, onErr) =>
  require('fs').readFile(file, 'utf8', (err, content) => {
    // 'utf8' so that content arrives as a string
    err ? onErr(err) : onRes(content)
  })
// CPS wraps a CPS function to provide the API methods
const getLines = CPS(readFileCps('name.txt'))
// map applies function to the file content
.map(file => file.trim())
.filter(file => file.length > 0)
// chain applies function that returns CPS function
.chain(file => readFileCps(file))
.map(text => text.split('\n'))
// => CPS function with 2 callbacks
// To use, simply pass callbacks in the same order
getLines(
lines => console.log(lines), // onRes callback
err => console.error(err) // onErr callback
)
Note how we handle errors at the end without affecting the main logic!
But can't I do it with promises?
OK, let us have another example where you can't, shall we? At least not as easily. And maybe... not really another. :wink:
Reading from static files is easy but boring. Data is rarely static. What if we have to react to data changing in real time? Like our file names arriving as a data stream? Let us use the popular ws websocket library:
const WebSocket = require('ws')
// general purpose CPS function listening to websocket
const wsMessageListenerCps = url => cb =>
new WebSocket(url).on('message', cb)
And here is the crux: wsMessageListenerCps(url) is just another CPS function! So we can simply drop it in place of readFileCps('name.txt') into exactly the same code and be done with it:
const getLinesFromWS = CPS(wsMessageListenerCps(someUrl))
.map(file => file.trim())
.filter(file => file.length > 0)
.chain(file => readFileCps(file))
.map(text => text.split('\n'))
And if you paid attention, the new CPS function has only one callback, while the old one had two! Yet we have used exactly the same code! How so? Because we haven't done anything to the other callbacks. The only difference is in how the final function is called: with one callback instead of two. As wsMessageListenerCps(url) accepts one callback, so does getLinesFromWS when we call it:
getLinesFromWS(lines => console.log(lines))
That will print all lines of all files whose names we receive from our websocket. And if we feel overwhelmed and only want to see lines containing, say, "breakfast", nothing could be easier:
// just add a filter
const breakfastLines = CPS(getLinesFromWS)
.filter(line => /[Bb]reakfast/.test(line))
// call it exactly the same way
breakfastLines(lines => console.log(lines))
And from now on we'll never miss a breakfast.:smile:
CPS function
Any function
const cpsFn = (cb1, cb2, ...) => { ... }
that expects to be called with several (possibly zero) functions (callbacks) as arguments. The number of callbacks may vary each time cpsFn is called. Once called and running, cpsFn may call any of its callbacks any (possibly zero) number of times with any number m of arguments (x1, ..., xm), where m may also vary from call to call. The m-tuple (vector) (x1, ..., xm) is regarded as the output of cpsFn from the nth callback cbn:
// (x1, ..., xm) becomes the output from the nth callback whenever
cbn(x1, ..., xm) // is called, for n = 1, 2, ...
In other words, a CPS function receives any number of callbacks that it may call in any order any number of times at any moments immediately or in the future with any number of arguments.
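For instance, here is a minimal sketch of a CPS function with two callbacks (the names tickerCps, onTick and onDone are made up for illustration): it emits a tick count into the first callback every second and, after five ticks, a final message with two arguments into the second.
// hypothetical CPS function with 2 callbacks, called multiple times
const tickerCps = (onTick, onDone) => {
  let count = 0
  const timer = setInterval(() => {
    count += 1
    onTick(count) // output from the 1st callback, emitted repeatedly
    if (count === 5) {
      clearInterval(timer)
      onDone('done', count) // single output with 2 arguments from the 2nd callback
    }
  }, 1000)
}
// call it with plain callbacks, like any CPS function
tickerCps(
  n => console.log('tick', n),
  (msg, n) => console.log(msg, 'after', n, 'ticks')
)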
API in brief
const { map, chain, filter, scan, ap, CPS, pipeline }
= require('cpsfy')
Each of the map, chain, filter, scan operators can be used in 3 ways:
// 'map' as curried function
map(f)(cpsFn)
// 'map' method provided by the 'CPS' wrapper
CPS(cpsFn).map(f)
// 'cpsFn' is piped into 'map(f)' via 'pipeline' operator
pipeline(cpsFn)(map(f))
The wrapped CPS function CPS(cpsFn) has all operators available as methods, while remaining a plain CPS function, i.e. it can be called with the same callbacks:
CPS(cpsFn)(f1, f2, ...) // is equivalent to
cpsFn(f1, f2, ...)
pipeline(...arguments)(...functions)
Pass any number of arguments to a sequence of functions, one after another, similar to the UNIX pipe
(x1, ..., xn) | f1 | f2 | ... | fm
Allows writing functional composition in the intuitive linear way.
Examples of pipeline
pipeline(1, 2)(
(x, y) => x + y,
sum => sum * 2,
doubleSum => -doubleSum
) //=> -6
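For comparison, the same computation written as plain nested function calls (nothing cpsfy-specific, just to show what the pipeline replaces):
const add = (x, y) => x + y
const double = sum => sum * 2
const negate = n => -n
negate(double(add(1, 2))) //=> -6, same result, but read inside-out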
chaining
The CPS factory provides the same methods for simple chaining:
CPS(cpsFn).map(f).chain(g).filter(h)
However, the need to wrap a plain function might feel like overhead. The pipeline operator allows writing the same code in functional style without the need to wrap:
// pass cpsFn directly as argument
pipeline(cpsFn)(
map(f),
chain(g),
filter(h)
)
But the most important advantage of the pipeline style is that you can drop arbitrary functions in there without any need to patch object prototypes. For instance, using the above example, we can start our pipe with a plain path, or even insert some intermediate function to compute the correct url for us:
pipeline(path)( // begin with a plain path
  path => 'https://' + path, // compute url on the spot
  url => { console.log(url); return url }, // log it for debugging
  wsMessageListenerCps, // returns a CPS function
  map(f), // use CPS operators as usual
  chain(g),
  filter(h)
)
map(...functions)(cpsFunction)
// these are equivalent
map(f1, f2, ...)(cpsFn)
CPS(cpsFn).map(f1, f2, ...)
pipeline(cpsFn)(map(f1, f2, ...))
For each n, apply fn to each output from the nth callback of cpsFn.
Result of applying map
New CPS function that calls its nth callback cbn as
cbn(fn(x1, x2, ...))
whenever cpsFn calls its nth callback with output (x1, x2, ...).
Example of map
Using readFileCps as above.
// read file and convert all letters to uppercase
const getCaps = map(str => str.toUpperCase())(
readFileCps('message.txt')
)
// or
const getCaps = CPS(readFileCps('message.txt'))
.map(str => str.toUpperCase())
// or
const getCaps = pipeline(readFileCps('message.txt'))(
map(str => str.toUpperCase())
)
// getCaps is a CPS function, call it with any callbacks
getCaps(
  data => console.log(data), // onRes callback
  err => console.error(err) // onErr callback
) // => file content is capitalized and printed
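Since map takes one function per callback, we could also transform the error output at the same time. A sketch under the same readFileCps setup (the error-to-string mapping here is purely illustrative):
// f1 transforms outputs of the 1st callback, f2 those of the 2nd
const getCapsOrReason = CPS(readFileCps('message.txt'))
  .map(
    str => str.toUpperCase(), // applied to the file content
    err => 'could not read: ' + err.message // applied to errors
  )
getCapsOrReason(
  data => console.log(data), // capitalized content
  reason => console.error(reason) // transformed error message
)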
chain(...functions)(cpsFunction)
// these are equivalent
chain(f1, f2, ...)(cpsFn)
CPS(cpsFn).chain(f1, f2, ...)
pipeline(cpsFn)(chain(f1, f2, ...))
where each fn is a curried function:
// fn is expected to return a CPS function
const fn = (x1, x2, ...) => (cb1, cb2, ...) => { ... }
The chain operator applies each fn to each output from the nth callback of cpsFn; however, the CPS output of fn is passed ahead instead of its return value.
Result of applying chain
New CPS function newCpsFn that calls fn(x1, x2, ...) whenever cpsFn passes output (x1, x2, ...) into its nth callback, and collects all outputs from all callbacks of all the fns. Then, for each fixed m, the outputs from the mth callbacks of all the fns are collected and passed into the mth callback cbm of newCpsFn:
cbm(y1, y2, ...) // is called whenever
cbmFn(y1, y2, ...) // is called where
// cbmFn is the mth callback of fn
Example of chain
Using readFileCps as above.
// write version of readFileCps
const writeFileCps = (file, content) => (onRes, onErr) =>
require('fs').writeFile(file, content, (err, message) => {
err ? onErr(err) : onRes(message)
})
const copy = chain(
// function that returns CPS function
text => writeFileCps('target.txt', text)
)(
readFileCps('source.txt') // CPS function
)
// or as method
const copy = CPS(readFileCps('source.txt'))
.chain(text => writeFileCps('target.txt', text))
// or with pipeline operator
const copy = pipeline(readFileCps('source.txt'))(
chain(text => writeFileCps('target.txt', text))
)
// copy is a CPS function, call it with any callbacks
copy(
  res => console.log('copied'), // onRes callback
  err => console.error(err) // onErr callback
) // => file content is copied, or the error is printed
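And since chain, like map, takes one function per callback, we could sketch a fallback: when reading source.txt fails, the error (output of the 2nd callback) is chained into another CPS function that reads a backup file. The file name backup.txt is made up for illustration; the callback-wise merging follows the description above.
// f1 chains outputs of the 1st (success) callback, f2 those of the 2nd (error) callback
const copyWithFallback = CPS(readFileCps('source.txt'))
  .chain(
    text => writeFileCps('target.txt', text), // on success, copy as before
    err => readFileCps('backup.txt') // on error, read a backup file instead
  )
// outputs of both branches are merged callback-wise
copyWithFallback(
  res => console.log('done:', res), // write result or backup content
  err => console.error(err) // errors from either branch
)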
filter(...predicates)(cpsFunction)
// these are equivalent
filter(pred1, pred2, ...)(cpsFn)
CPS(cpsFn).filter(pred1, pred2, ...)
pipeline(cpsFn)(filter(pred1, pred2, ...))
where each predn is the nth predicate function used to filter the output from the nth callback of cpsFn.
Result of applying filter
New CPS function that calls its nth callback cbn(x1, x2, ...) whenever (x1, x2, ...) is an output from the nth callback of cpsFn and
predn(x1, x2, ...) == true
Example of filter
Using readFileCps and writeFileCps as above.
// only copy text if it is not empty
const copyNotEmpty = CPS(readFileCps('source.txt'))
.filter(text => text.length > 0)
.chain(text => writeFileCps('target.txt', text))
// copyNotEmpty is a CPS function, call it with any callbacks
copyNotEmpty(
  res => console.log('copied'), // onRes callback
  err => console.error(err) // onErr callback
)
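As filter also takes one predicate per callback, we could, as a sketch, additionally drop uninteresting errors; filtering out ENOENT ("file not found") errors here is purely illustrative:
// pred1 filters the success outputs, pred2 filters the errors
const copyNotEmptyIgnoreMissing = CPS(readFileCps('source.txt'))
  .filter(
    text => text.length > 0, // keep only nonempty content
    err => err.code !== 'ENOENT' // silently drop "file not found" errors
  )
  .chain(text => writeFileCps('target.txt', text))
copyNotEmptyIgnoreMissing(
  res => console.log('copied'),
  err => console.error(err) // only errors other than ENOENT arrive here
)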
scan(...reducers)(init)(cpsFunction)
Similar to reduce, except that every partial accumulated value is passed into the callback whenever there is a new output.
// these are equivalent
scan(red1, red2, ...)(init)(cpsFn)
CPS(cpsFn).scan(red1, red2, ...)(init)
pipeline(cpsFn)(scan(red1, red2, ...)(init))
where each redn is a reducer and init is the initial accumulated value.
// compute new accumulator value from the old one
// and the tuple of current values (y1, y2, ...)
const redn = (acc, y1, y2, ...) => ...
Result of applying scan
New CPS function whose output from the first callback is the accumulated value. For each output (y1, y2, ...) from the nth callback, the nth reducer redn is used to compute the new accumulated value
redn(acc, y1, y2, ...)
where acc starts with init, similar to reduce.
Example of scan
// CPS function with 2 callbacks, clicking on one
// of the buttons sends '1' into respective callback
const getVotes = (onUpvote, onDownvote) => {
upvoteButton.addEventListener('click',
ev => onUpvote(1)
)
downvoteButton.addEventListener('click',
ev => onDownvote(1)
)
}
// count numbers of up- and downvotes and
// pass into respective callbacks
const countVotes = CPS(getVotes)
.scan(
([up, down], upvote) => [up + upvote, down],
([up, down], downvote) => [up, down + downvote]
)([0,0])
// countVotes is CPS function that we can call
// with any callback
countVotes(
votes => console.log('Total votes: ', votes),
)
ap(...cpsFunctions)(cpsFunction)
See [running CPS functions in parallel](DOCUMENTATION.md#running-cps-functions-in-parallel). Inspired by the Applicative Functor interface, see e.g. https://funkia.github.io/jabz/#ap
lift(...functions)(cpsFunction)
(TODO)
See [lifting functions of multiple arguments](DOCUMENTATION.md#lifting-functions-of-multiple-parameters)
The "sister" of ap
, apply functions with multiple arguments to
outputs of CPS functions running in parallel, derived from ap
,
see e.g. https://funkia.github.io/jabz/#lift
merge(...cpsFunctions)
(TODO)
See [CPS.merge](DOCUMENTATION.md#cpsmerge-todo).
Merge outputs from multiple CPS functions, separately in each callback.
E.g. separately merge results and errors from multiple promises
running in parallel.
More details?
This README.md is kept minimal to reduce the package size. For a more human introduction, motivation, use cases and other details, please see [DOCUMENTATION](DOCUMENTATION.md).
License
MIT © Dmitri Zaitsev