Class GridFSBucketReadStream

A readable stream that enables you to read buffers from GridFS.

Do not instantiate this class directly; obtain an instance from GridFSBucket.openDownloadStream() or GridFSBucket.openDownloadStreamByName() instead.
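
For example, a minimal sketch of obtaining and consuming a download stream. It assumes a locally reachable MongoDB deployment and a previously uploaded file named 'report.txt'; both are purely illustrative.

const { MongoClient, GridFSBucket } = require('mongodb');

async function main() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const bucket = new GridFSBucket(client.db('test'));

  // openDownloadStreamByName() (and openDownloadStream(id)) return a
  // GridFSBucketReadStream, which is consumed like any other Readable.
  const downloadStream = bucket.openDownloadStreamByName('report.txt');
  downloadStream.on('data', (chunk) => process.stdout.write(chunk));
  downloadStream.on('error', (err) => console.error(err));
  downloadStream.on('end', () => client.close());
}

main().catch(console.error);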

Hierarchy

  • Readable
    • GridFSBucketReadStream

Properties

closed: boolean

Is true after 'close' has been emitted.

Since

v18.0.0

destroyed: boolean

Is true after readable.destroy() has been called.

Since

v8.0.0

errored: null | Error

Returns error if the stream has been destroyed with an error.

Since

v18.0.0

readable: boolean

Is true if it is safe to call readable.read(), which means the stream has not been destroyed or emitted 'error' or 'end'.

Since

v11.4.0

readableAborted: boolean

Returns whether the stream was destroyed or errored before emitting 'end'.

Since

v16.8.0

readableDidRead: boolean

Returns whether 'data' has been emitted.

Since

v16.7.0, v14.18.0

readableEncoding: null | BufferEncoding

Getter for the property encoding of a given Readable stream. The encoding property can be set using the readable.setEncoding() method.

Since

v12.7.0

readableEnded: boolean

Becomes true when 'end' event is emitted.

Since

v12.9.0

readableFlowing: null | boolean

This property reflects the current state of a Readable stream as described in the Three states section.

Since

v9.4.0

readableHighWaterMark: number

Returns the value of highWaterMark passed when creating this Readable.

Since

v9.3.0

readableLength: number

This property contains the number of bytes (or objects) in the queue ready to be read. The value provides introspection data regarding the status of the highWaterMark.

Since

v9.4.0

readableObjectMode: boolean

Getter for the property objectMode of a given Readable stream.

Since

v12.3.0

FILE: "file" = ...

Fires when the stream has loaded the file document corresponding to the provided id.

Event
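
A minimal sketch of listening for this event. It assumes an existing GridFSBucket in bucket and a stored file's ObjectId in id (both hypothetical here), and that the listener receives the files-collection document.

const downloadStream = bucket.openDownloadStream(id);
downloadStream.on('file', (doc) => {
  // doc is assumed to be the document from the bucket's files collection
  console.log(`length: ${doc.length}, chunkSize: ${doc.chunkSize}`);
});
downloadStream.pipe(process.stdout);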

captureRejectionSymbol: typeof captureRejectionSymbol

Value: Symbol.for('nodejs.rejection')

See how to write a custom rejection handler.

Since

v13.4.0, v12.16.0

captureRejections: boolean

Value: boolean

Change the default captureRejections option on all new EventEmitter objects.

Since

v13.4.0, v12.16.0

defaultMaxListeners: number

By default, a maximum of 10 listeners can be registered for any single event. This limit can be changed for individual EventEmitter instances using the emitter.setMaxListeners(n) method. To change the default for all EventEmitter instances, the events.defaultMaxListeners property can be used. If this value is not a positive number, a RangeError is thrown.

Take caution when setting the events.defaultMaxListeners because the change affects all EventEmitter instances, including those created before the change is made. However, calling emitter.setMaxListeners(n) still has precedence over events.defaultMaxListeners.

This is not a hard limit. The EventEmitter instance will allow more listeners to be added but will output a trace warning to stderr indicating that a "possible EventEmitter memory leak" has been detected. For any single EventEmitter, the emitter.getMaxListeners() and emitter.setMaxListeners() methods can be used to temporarily avoid this warning:

import { EventEmitter } from 'node:events';
const emitter = new EventEmitter();
emitter.setMaxListeners(emitter.getMaxListeners() + 1);
emitter.once('event', () => {
  // do stuff
  emitter.setMaxListeners(Math.max(emitter.getMaxListeners() - 1, 0));
});

The --trace-warnings command-line flag can be used to display the stack trace for such warnings.

The emitted warning can be inspected with process.on('warning') and will have the additional emitter, type, and count properties, referring to the event emitter instance, the event's name and the number of attached listeners, respectively. Its name property is set to 'MaxListenersExceededWarning'.

Since

v0.11.2

errorMonitor: typeof errorMonitor

This symbol shall be used to install a listener for only monitoring 'error' events. Listeners installed using this symbol are called before the regular 'error' listeners are called.

Installing a listener using this symbol does not change the behavior once an 'error' event is emitted. Therefore, the process will still crash if no regular 'error' listener is installed.

Since

v13.6.0, v12.17.0

Methods

  • Calls readable.destroy() with an AbortError and returns a promise that fulfills when the stream is finished.

    Returns Promise<void>

    Since

    v20.4.0

  • Returns AsyncIterableIterator<any>

  • Parameters

    • error: Error
    • event: string
    • Rest ...args: any[]

    Returns void

  • Parameters

    • callback: ((error?) => void)
        • (error?): void
        • Parameters

          • Optional error: null | Error

          Returns void

    Returns void

  • Parameters

    • error: null | Error
    • callback: ((error?) => void)
        • (error?): void
        • Parameters

          • Optional error: null | Error

          Returns void

    Returns void

  • Marks this stream as aborted (will never push another data event) and kills the underlying cursor. Will emit the 'end' event, and then the 'close' event once the cursor is successfully killed.

    Returns Promise<void>
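
    A minimal sketch of cutting a download short (bucket and id are assumed to exist, as in the earlier examples):

    const downloadStream = bucket.openDownloadStream(id);
    downloadStream.once('data', (chunk) => {
      console.log(`got ${chunk.length} bytes, aborting the rest`);
      // abort() kills the underlying cursor; 'end' and then 'close' follow.
      downloadStream.abort().catch(console.error);
    });
    downloadStream.on('close', () => console.log('stream closed'));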

  • Event emitter. The defined events on this stream include:

    1. close
    2. data
    3. end
    4. error
    5. pause
    6. readable
    7. resume

    Parameters

    • event: "close"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "data"
    • listener: ((chunk) => void)
        • (chunk): void
        • Parameters

          • chunk: any

          Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "end"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "error"
    • listener: ((err) => void)
        • (err): void
        • Parameters

          • err: Error

          Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "pause"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "readable"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "resume"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: string | symbol
    • listener: ((...args) => void)
        • (...args): void
        • Parameters

          • Rest ...args: any[]

          Returns void

    Returns GridFSBucketReadStream

  • This method returns a new stream with chunks of the underlying stream paired with a counter in the form [index, chunk]. The first index value is 0 and it increases by 1 for each chunk produced.

    Parameters

    • Optional options: Pick<ArrayOptions, "signal">

    Returns Readable

    a stream of indexed pairs.

    Since

    v17.5.0

  • Type Parameters

    • T extends ReadableStream<T>

    Parameters

    • stream: ComposeFnParam | T | Iterable<T> | AsyncIterable<T>
    • Optional options: {
          signal: AbortSignal;
      }
      • signal: AbortSignal

    Returns T

  • Destroy the stream. Optionally emit an 'error' event, and emit a 'close' event (unless emitClose is set to false). After this call, the readable stream will release any internal resources and subsequent calls to push() will be ignored.

    Once destroy() has been called any further calls will be a no-op and no further errors except from _destroy() may be emitted as 'error'.

    Implementors should not override this method, but instead implement readable._destroy().

    Parameters

    • Optional error: Error

      Error which will be passed as payload in 'error' event

    Returns GridFSBucketReadStream

    Since

    v8.0.0

  • This method returns a new stream with the first limit chunks dropped from the start.

    Parameters

    • limit: number

      the number of chunks to drop from the readable.

    • Optional options: Pick<ArrayOptions, "signal">

    Returns Readable

    a stream with limit chunks dropped from the start.

    Since

    v17.5.0
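
    For instance, a sketch that counts the bytes remaining after the first two chunks of a download (bucket and id are assumed to exist; illustrative only):

    async function bytesAfterFirstTwoChunks(bucket, id) {
      let bytes = 0;
      // drop(2) yields a new Readable without the first two chunks.
      for await (const chunk of bucket.openDownloadStream(id).drop(2)) {
        bytes += chunk.length;
      }
      return bytes;
    }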

  • Parameters

    • event: "close"

    Returns boolean

  • Parameters

    • event: "data"
    • chunk: any

    Returns boolean

  • Parameters

    • event: "end"

    Returns boolean

  • Parameters

    • event: "error"
    • err: Error

    Returns boolean

  • Parameters

    • event: "pause"

    Returns boolean

  • Parameters

    • event: "readable"

    Returns boolean

  • Parameters

    • event: "resume"

    Returns boolean

  • Parameters

    • event: string | symbol
    • Rest ...args: any[]

    Returns boolean

  • Returns an array listing the events for which the emitter has registered listeners. The values in the array are strings or Symbols.

    import { EventEmitter } from 'node:events';

    const myEE = new EventEmitter();
    myEE.on('foo', () => {});
    myEE.on('bar', () => {});

    const sym = Symbol('symbol');
    myEE.on(sym, () => {});

    console.log(myEE.eventNames());
    // Prints: [ 'foo', 'bar', Symbol(symbol) ]

    Returns (string | symbol)[]

    Since

    v6.0.0

  • This method is similar to Array.prototype.every and calls fn on each chunk in the stream to check whether every awaited return value is truthy. As soon as an fn call's awaited return value for a chunk is falsy, the stream is destroyed and the promise is fulfilled with false. If all of the fn calls on the chunks return a truthy value, the promise is fulfilled with true.

    Parameters

    • fn: ((data, options?) => boolean | Promise<boolean>)

      a function to call on each chunk of the stream. Async or not.

        • (data, options?): boolean | Promise<boolean>
        • Parameters

          • data: any
          • Optional options: Pick<ArrayOptions, "signal">

          Returns boolean | Promise<boolean>

    • Optional options: ArrayOptions

    Returns Promise<boolean>

    a promise evaluating to true if fn returned a truthy value for every one of the chunks.

    Since

    v17.5.0

  • This method allows filtering the stream. For each chunk in the stream the fn function will be called and if it returns a truthy value, the chunk will be passed to the result stream. If the fn function returns a promise - that promise will be awaited.

    Parameters

    • fn: ((data, options?) => boolean | Promise<boolean>)

      a function to filter chunks from the stream. Async or not.

        • (data, options?): boolean | Promise<boolean>
        • Parameters

          • data: any
          • Optional options: Pick<ArrayOptions, "signal">

          Returns boolean | Promise<boolean>

    • Optional options: ArrayOptions

    Returns Readable

    a stream filtered with the predicate fn.

    Since

    v17.4.0, v16.14.0
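
    A small, contrived sketch that forwards only non-empty chunks (downloadStream is assumed to be an existing GridFSBucketReadStream):

    const nonEmpty = downloadStream.filter((chunk) => chunk.length > 0);
    nonEmpty.pipe(process.stdout);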

  • This method is similar to Array.prototype.find and calls fn on each chunk in the stream to find a chunk with a truthy value for fn. Once an fn call's awaited return value is truthy, the stream is destroyed and the promise is fulfilled with the value for which fn returned a truthy value. If all of the fn calls on the chunks return a falsy value, the promise is fulfilled with undefined.

    Type Parameters

    • T

    Parameters

    • fn: ((data, options?) => data is T)

      a function to call on each chunk of the stream. Async or not.

        • (data, options?): data is T
        • Parameters

          • data: any
          • Optional options: Pick<ArrayOptions, "signal">

          Returns data is T

    • Optional options: ArrayOptions

    Returns Promise<undefined | T>

    a promise evaluating to the first chunk for which fn evaluated with a truthy value, or undefined if no element was found.

    Since

    v17.5.0

  • Parameters

    • fn: ((data, options?) => boolean | Promise<boolean>)
        • (data, options?): boolean | Promise<boolean>
        • Parameters

          • data: any
          • Optional options: Pick<ArrayOptions, "signal">

          Returns boolean | Promise<boolean>

    • Optional options: ArrayOptions

    Returns Promise<any>

  • This method returns a new stream by applying the given callback to each chunk of the stream and then flattening the result.

    It is possible to return a stream or another iterable or async iterable from fn and the result streams will be merged (flattened) into the returned stream.

    Parameters

    • fn: ((data, options?) => any)

      a function to map over every chunk in the stream. May be async. May be a stream or generator.

        • (data, options?): any
        • Parameters

          • data: any
          • Optional options: Pick<ArrayOptions, "signal">

          Returns any

    • Optional options: ArrayOptions

    Returns Readable

    a stream flat-mapped with the function fn.

    Since

    v17.5.0

  • This method allows iterating a stream. For each chunk in the stream the fn function will be called. If the fn function returns a promise - that promise will be awaited.

    This method is different from for await...of loops in that it can optionally process chunks concurrently. In addition, a forEach iteration can only be stopped by having passed a signal option and aborting the related AbortController while for await...of can be stopped with break or return. In either case the stream will be destroyed.

    This method is different from listening to the 'data' event in that it uses the readable event in the underlying machinery and can limit the number of concurrent fn calls.

    Parameters

    • fn: ((data, options?) => void | Promise<void>)

      a function to call on each chunk of the stream. Async or not.

        • (data, options?): void | Promise<void>
        • Parameters

          • data: any
          • Optional options: Pick<ArrayOptions, "signal">

          Returns void | Promise<void>

    • Optional options: ArrayOptions

    Returns Promise<void>

    a promise for when the stream has finished.

    Since

    v17.5.0
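
    For example, a sketch that hashes a stored file chunk by chunk (bucket and id are assumed to exist; the default sequential iteration keeps the hash updates in order):

    const { createHash } = require('node:crypto');

    async function md5OfFile(bucket, id) {
      const hash = createHash('md5');
      await bucket.openDownloadStream(id).forEach((chunk) => hash.update(chunk));
      return hash.digest('hex');
    }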

  • Returns the current max listener value for the EventEmitter which is either set by emitter.setMaxListeners(n) or defaults to defaultMaxListeners.

    Returns number

    Since

    v1.0.0

  • The readable.isPaused() method returns the current operating state of the Readable. This is used primarily by the mechanism that underlies the readable.pipe() method. In most typical cases, there will be no reason to use this method directly.

    const readable = new stream.Readable();

    readable.isPaused(); // === false
    readable.pause();
    readable.isPaused(); // === true
    readable.resume();
    readable.isPaused(); // === false

    Returns boolean

    Since

    v0.11.14

  • The iterator created by this method gives users the option to cancel the destruction of the stream if the for await...of loop is exited by return, break, or throw, or if the iterator should destroy the stream if the stream emitted an error during iteration.

    Parameters

    • Optional options: {
          destroyOnReturn?: boolean;
      }
      • Optional destroyOnReturn?: boolean

        When set to false, calling return on the async iterator, or exiting a for await...of iteration using a break, return, or throw will not destroy the stream. Default: true.

    Returns AsyncIterableIterator<any>

    Since

    v16.3.0
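
    A minimal sketch that peeks at the first chunk without destroying the stream (downloadStream is assumed to be an existing GridFSBucketReadStream):

    async function peekFirstChunk(downloadStream) {
      for await (const chunk of downloadStream.iterator({ destroyOnReturn: false })) {
        console.log(`first chunk is ${chunk.length} bytes`);
        break; // the stream survives because destroyOnReturn is false
      }
    }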

  • Returns the number of listeners listening for the event named eventName. If listener is provided, it will return how many times the listener is found in the list of the listeners of the event.

    Parameters

    • eventName: string | symbol

      The name of the event being listened for

    • Optional listener: Function

      The event handler function

    Returns number

    Since

    v3.2.0

  • Returns a copy of the array of listeners for the event named eventName.

    server.on('connection', (stream) => {
      console.log('someone connected!');
    });
    console.log(util.inspect(server.listeners('connection')));
    // Prints: [ [Function] ]

    Parameters

    • eventName: string | symbol

    Returns Function[]

    Since

    v0.1.26

  • This method allows mapping over the stream. The fn function will be called for every chunk in the stream. If the fn function returns a promise - that promise will be awaited before being passed to the result stream.

    Parameters

    • fn: ((data, options?) => any)

      a function to map over every chunk in the stream. Async or not.

        • (data, options?): any
        • Parameters

          • data: any
          • Optional options: Pick<ArrayOptions, "signal">

          Returns any

    • Optional options: ArrayOptions

    Returns Readable

    a stream mapped with the function fn.

    Since

    v17.4.0, v16.14.0
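
    For instance, a sketch that re-encodes each chunk as a hex string on the way through (downloadStream is assumed to be a GridFSBucketReadStream):

    const hexChunks = downloadStream.map((chunk) => chunk.toString('hex'));
    hexChunks.on('data', (hex) => console.log(hex.slice(0, 32), '...'));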

  • Alias for emitter.removeListener().

    Parameters

    • eventName: string | symbol
    • listener: ((...args) => void)
        • (...args): void
        • Parameters

          • Rest ...args: any[]

          Returns void

    Returns GridFSBucketReadStream

    Since

    v10.0.0

  • Parameters

    • event: "close"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "data"
    • listener: ((chunk) => void)
        • (chunk): void
        • Parameters

          • chunk: any

          Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "end"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "error"
    • listener: ((err) => void)
        • (err): void
        • Parameters

          • err: Error

          Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "pause"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "readable"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "resume"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: string | symbol
    • listener: ((...args) => void)
        • (...args): void
        • Parameters

          • Rest ...args: any[]

          Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "close"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "data"
    • listener: ((chunk) => void)
        • (chunk): void
        • Parameters

          • chunk: any

          Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "end"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "error"
    • listener: ((err) => void)
        • (err): void
        • Parameters

          • err: Error

          Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "pause"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "readable"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "resume"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: string | symbol
    • listener: ((...args) => void)
        • (...args): void
        • Parameters

          • Rest ...args: any[]

          Returns void

    Returns GridFSBucketReadStream

  • The readable.pause() method will cause a stream in flowing mode to stop emitting 'data' events, switching out of flowing mode. Any data that becomes available will remain in the internal buffer.

    const readable = getReadableStreamSomehow();
    readable.on('data', (chunk) => {
      console.log(`Received ${chunk.length} bytes of data.`);
      readable.pause();
      console.log('There will be no additional data for 1 second.');
      setTimeout(() => {
        console.log('Now data will start flowing again.');
        readable.resume();
      }, 1000);
    });

    The readable.pause() method has no effect if there is a 'readable' event listener.

    Returns GridFSBucketReadStream

    Since

    v0.9.4

  • Type Parameters

    • T extends WritableStream<T>

    Parameters

    • destination: T
    • Optional options: {
          end?: boolean;
      }
      • Optional end?: boolean

    Returns T
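
    A common use is streaming a stored file to disk; a minimal sketch (assuming an existing bucket and a previously uploaded 'report.txt', both illustrative):

    const fs = require('node:fs');

    bucket
      .openDownloadStreamByName('report.txt')
      .pipe(fs.createWriteStream('./report-copy.txt'))
      .on('finish', () => console.log('download complete'));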

  • Parameters

    • event: "close"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "data"
    • listener: ((chunk) => void)
        • (chunk): void
        • Parameters

          • chunk: any

          Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "end"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "error"
    • listener: ((err) => void)
        • (err): void
        • Parameters

          • err: Error

          Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "pause"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "readable"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "resume"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: string | symbol
    • listener: ((...args) => void)
        • (...args): void
        • Parameters

          • Rest ...args: any[]

          Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "close"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "data"
    • listener: ((chunk) => void)
        • (chunk): void
        • Parameters

          • chunk: any

          Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "end"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "error"
    • listener: ((err) => void)
        • (err): void
        • Parameters

          • err: Error

          Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "pause"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "readable"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "resume"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: string | symbol
    • listener: ((...args) => void)
        • (...args): void
        • Parameters

          • Rest ...args: any[]

          Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • chunk: any
    • Optional encoding: BufferEncoding

    Returns boolean

  • Returns a copy of the array of listeners for the event named eventName, including any wrappers (such as those created by .once()).

    import { EventEmitter } from 'node:events';
    const emitter = new EventEmitter();
    emitter.once('log', () => console.log('log once'));

    // Returns a new Array with a function `onceWrapper` which has a property
    // `listener` which contains the original listener bound above
    const listeners = emitter.rawListeners('log');
    const logFnWrapper = listeners[0];

    // Logs "log once" to the console and does not unbind the `once` event
    logFnWrapper.listener();

    // Logs "log once" to the console and removes the listener
    logFnWrapper();

    emitter.on('log', () => console.log('log persistently'));
    // Will return a new Array with a single function bound by `.on()` above
    const newListeners = emitter.rawListeners('log');

    // Logs "log persistently" twice
    newListeners[0]();
    emitter.emit('log');

    Parameters

    • eventName: string | symbol

    Returns Function[]

    Since

    v9.4.0

  • The readable.read() method reads data out of the internal buffer and returns it. If no data is available to be read, null is returned. By default, the data is returned as a Buffer object unless an encoding has been specified using the readable.setEncoding() method or the stream is operating in object mode.

    The optional size argument specifies a specific number of bytes to read. If size bytes are not available to be read, null will be returned unless the stream has ended, in which case all of the data remaining in the internal buffer will be returned.

    If the size argument is not specified, all of the data contained in the internal buffer will be returned.

    The size argument must be less than or equal to 1 GiB.

    The readable.read() method should only be called on Readable streams operating in paused mode. In flowing mode, readable.read() is called automatically until the internal buffer is fully drained.

    const readable = getReadableStreamSomehow();

    // 'readable' may be triggered multiple times as data is buffered in
    readable.on('readable', () => {
      let chunk;
      console.log('Stream is readable (new data received in buffer)');
      // Use a loop to make sure we read all currently available data
      while (null !== (chunk = readable.read())) {
        console.log(`Read ${chunk.length} bytes of data...`);
      }
    });

    // 'end' will be triggered once when there is no more data available
    readable.on('end', () => {
      console.log('Reached end of stream.');
    });

    Each call to readable.read() returns a chunk of data, or null. The chunks are not concatenated. A while loop is necessary to consume all data currently in the buffer. When reading a large file .read() may return null, having consumed all buffered content so far, but there is still more data to come not yet buffered. In this case a new 'readable' event will be emitted when there is more data in the buffer. Finally the 'end' event will be emitted when there is no more data to come.

    Therefore to read a file's whole contents from a readable, it is necessary to collect chunks across multiple 'readable' events:

    const chunks = [];

    readable.on('readable', () => {
      let chunk;
      while (null !== (chunk = readable.read())) {
        chunks.push(chunk);
      }
    });

    readable.on('end', () => {
      const content = chunks.join('');
    });

    A Readable stream in object mode will always return a single item from a call to readable.read(size), regardless of the value of the size argument.

    If the readable.read() method returns a chunk of data, a 'data' event will also be emitted.

    Calling read after the 'end' event has been emitted will return null. No runtime error will be raised.

    Parameters

    • Optional size: number

      Optional argument to specify how much data to read.

    Returns any

    Since

    v0.9.4

  • This method calls fn on each chunk of the stream in order, passing it the result from the calculation on the previous element. It returns a promise for the final value of the reduction.

    If no initial value is supplied the first chunk of the stream is used as the initial value. If the stream is empty, the promise is rejected with a TypeError with the ERR_INVALID_ARGS code property.

    The reducer function iterates the stream element-by-element which means that there is no concurrency parameter or parallelism. To perform a reduce concurrently, you can extract the async function to the readable.map method.

    Type Parameters

    • T = any

    Parameters

    • fn: ((previous, data, options?) => T)

      a reducer function to call over every chunk in the stream. Async or not.

        • (previous, data, options?): T
        • Parameters

          • previous: any
          • data: any
          • Optional options: Pick<ArrayOptions, "signal">

          Returns T

    • Optional initial: undefined

      the initial value to use in the reduction.

    • Optional options: Pick<ArrayOptions, "signal">

    Returns Promise<T>

    a promise for the final value of the reduction.

    Since

    v17.5.0

  • Type Parameters

    • T = any

    Parameters

    • fn: ((previous, data, options?) => T)
        • (previous, data, options?): T
        • Parameters

          • previous: T
          • data: any
          • Optional options: Pick<ArrayOptions, "signal">

          Returns T

    • initial: T
    • Optional options: Pick<ArrayOptions, "signal">

    Returns Promise<T>
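
    For example, a sketch that sums chunk lengths to compute the downloaded size (bucket and id are assumed to exist; in practice the file document's length field already records this):

    async function downloadedBytes(bucket, id) {
      return bucket
        .openDownloadStream(id)
        .reduce((total, chunk) => total + chunk.length, 0);
    }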

  • Removes all listeners, or those of the specified eventName.

    It is bad practice to remove listeners added elsewhere in the code, particularly when the EventEmitter instance was created by some other component or module (e.g. sockets or file streams).

    Returns a reference to the EventEmitter, so that calls can be chained.

    Parameters

    • Optional event: string | symbol

    Returns GridFSBucketReadStream

    Since

    v0.1.26

  • Parameters

    • event: "close"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "data"
    • listener: ((chunk) => void)
        • (chunk): void
        • Parameters

          • chunk: any

          Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "end"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "error"
    • listener: ((err) => void)
        • (err): void
        • Parameters

          • err: Error

          Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "pause"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "readable"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: "resume"
    • listener: (() => void)
        • (): void
        • Returns void

    Returns GridFSBucketReadStream

  • Parameters

    • event: string | symbol
    • listener: ((...args) => void)
        • (...args): void
        • Parameters

          • Rest ...args: any[]

          Returns void

    Returns GridFSBucketReadStream

  • The readable.resume() method causes an explicitly paused Readable stream to resume emitting 'data' events, switching the stream into flowing mode.

    The readable.resume() method can be used to fully consume the data from a stream without actually processing any of that data:

    getReadableStreamSomehow()
      .resume()
      .on('end', () => {
        console.log('Reached the end, but did not read anything.');
      });

    The readable.resume() method has no effect if there is a 'readable' event listener.

    Returns GridFSBucketReadStream

    Since

    v0.9.4

  • The readable.setEncoding() method sets the character encoding for data read from the Readable stream.

    By default, no encoding is assigned and stream data will be returned as Buffer objects. Setting an encoding causes the stream data to be returned as strings of the specified encoding rather than as Buffer objects. For instance, calling readable.setEncoding('utf8') will cause the output data to be interpreted as UTF-8 data, and passed as strings. Calling readable.setEncoding('hex') will cause the data to be encoded in hexadecimal string format.

    The Readable stream will properly handle multi-byte characters delivered through the stream that would otherwise become improperly decoded if simply pulled from the stream as Buffer objects.

    const readable = getReadableStreamSomehow();
    readable.setEncoding('utf8');
    readable.on('data', (chunk) => {
      assert.equal(typeof chunk, 'string');
      console.log('Got %d characters of string data:', chunk.length);
    });

    Parameters

    • encoding: BufferEncoding

      The encoding to use.

    Returns GridFSBucketReadStream

    Since

    v0.9.4

  • By default EventEmitters will print a warning if more than 10 listeners are added for a particular event. This is a useful default that helps finding memory leaks. The emitter.setMaxListeners() method allows the limit to be modified for this specific EventEmitter instance. The value can be set to Infinity (or 0) to indicate an unlimited number of listeners.

    Returns a reference to the EventEmitter, so that calls can be chained.

    Parameters

    • n: number

    Returns GridFSBucketReadStream

    Since

    v0.3.5

  • This method is similar to Array.prototype.some and calls fn on each chunk in the stream until an awaited return value is true (or any truthy value). As soon as an fn call's awaited return value for a chunk is truthy, the stream is destroyed and the promise is fulfilled with true. If none of the fn calls on the chunks return a truthy value, the promise is fulfilled with false.

    Parameters

    • fn: ((data, options?) => boolean | Promise<boolean>)

      a function to call on each chunk of the stream. Async or not.

        • (data, options?): boolean | Promise<boolean>
        • Parameters

          • data: any
          • Optional options: Pick<ArrayOptions, "signal">

          Returns boolean | Promise<boolean>

    • Optional options: ArrayOptions

    Returns Promise<boolean>

    a promise evaluating to true if fn returned a truthy value for at least one of the chunks.

    Since

    v17.5.0

  • This method returns a new stream with the first limit chunks.

    Parameters

    • limit: number

      the number of chunks to take from the readable.

    • Optional options: Pick<ArrayOptions, "signal">

    Returns Readable

    a stream with limit chunks taken.

    Since

    v17.5.0

  • This method allows easily obtaining the contents of a stream.

    As this method reads the entire stream into memory, it negates the benefits of streams. It's intended for interoperability and convenience, not as the primary way to consume streams.

    Parameters

    • Optional options: Pick<ArrayOptions, "signal">

    Returns Promise<any[]>

    a promise containing an array with the contents of the stream.

    Since

    v17.5.0
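
    A minimal sketch that buffers a (small) stored file entirely into memory (bucket and id are assumed to exist):

    async function downloadToBuffer(bucket, id) {
      const chunks = await bucket.openDownloadStream(id).toArray();
      return Buffer.concat(chunks);
    }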

  • The readable.unpipe() method detaches a Writable stream previously attached using the pipe method.

    If the destination is not specified, then all pipes are detached.

    If the destination is specified, but no pipe is set up for it, then the method does nothing.

    const fs = require('node:fs');
    const readable = getReadableStreamSomehow();
    const writable = fs.createWriteStream('file.txt');
    // All the data from readable goes into 'file.txt',
    // but only for the first second.
    readable.pipe(writable);
    setTimeout(() => {
      console.log('Stop writing to file.txt.');
      readable.unpipe(writable);
      console.log('Manually close the file stream.');
      writable.end();
    }, 1000);

    Parameters

    • Optional destination: WritableStream

      Optional specific stream to unpipe

    Returns GridFSBucketReadStream

    Since

    v0.9.4

  • Passing chunk as null signals the end of the stream (EOF) and behaves the same as readable.push(null), after which no more data can be written. The EOF signal is put at the end of the buffer and any buffered data will still be flushed.

    The readable.unshift() method pushes a chunk of data back into the internal buffer. This is useful in certain situations where a stream is being consumed by code that needs to "un-consume" some amount of data that it has optimistically pulled out of the source, so that the data can be passed on to some other party.

    The stream.unshift(chunk) method cannot be called after the 'end' event has been emitted or a runtime error will be thrown.

    Developers using stream.unshift() often should consider switching to use of a Transform stream instead. See the API for stream implementers section for more information.

    // Pull off a header delimited by \n\n.
    // Use unshift() if we get too much.
    // Call the callback with (error, header, stream).
    const { StringDecoder } = require('node:string_decoder');
    function parseHeader(stream, callback) {
      stream.on('error', callback);
      stream.on('readable', onReadable);
      const decoder = new StringDecoder('utf8');
      let header = '';
      function onReadable() {
        let chunk;
        while (null !== (chunk = stream.read())) {
          const str = decoder.write(chunk);
          if (str.includes('\n\n')) {
            // Found the header boundary.
            const split = str.split(/\n\n/);
            header += split.shift();
            const remaining = split.join('\n\n');
            const buf = Buffer.from(remaining, 'utf8');
            stream.removeListener('error', callback);
            // Remove the 'readable' listener before unshifting.
            stream.removeListener('readable', onReadable);
            if (buf.length)
              stream.unshift(buf);
            // Now the body of the message can be read from the stream.
            callback(null, header, stream);
            return;
          }
          // Still reading the header.
          header += str;
        }
      }
    }

    Unlike push, stream.unshift(chunk) will not end the reading process by resetting the internal reading state of the stream. This can cause unexpected results if readable.unshift() is called during a read (i.e. from within a _read implementation on a custom stream). Following the call to readable.unshift() with an immediate push will reset the reading state appropriately; however, it is best to simply avoid calling readable.unshift() while in the process of performing a read.

    Parameters

    • chunk: any

      Chunk of data to unshift onto the read queue. For streams not operating in object mode, chunk must be a string, Buffer, Uint8Array, or null. For object mode streams, chunk may be any JavaScript value.

    • Optional encoding: BufferEncoding

      Encoding of string chunks. Must be a valid Buffer encoding, such as 'utf8' or 'ascii'.

    Returns void

    Since

    v0.9.11

  • Prior to Node.js 0.10, streams did not implement the entire node:stream module API as it is currently defined. (See Compatibility for more information.)

    When using an older Node.js library that emits 'data' events and has a pause method that is advisory only, the readable.wrap() method can be used to create a Readable stream that uses the old stream as its data source.

    It will rarely be necessary to use readable.wrap() but the method has been provided as a convenience for interacting with older Node.js applications and libraries.

    const { OldReader } = require('./old-api-module.js');
    const { Readable } = require('node:stream');
    const oreader = new OldReader();
    const myReader = new Readable().wrap(oreader);

    myReader.on('readable', () => {
      myReader.read(); // etc.
    });

    Parameters

    • stream: ReadableStream

      An "old style" readable stream

    Returns GridFSBucketReadStream

    Since

    v0.9.4

  • Experimental

    Listens once to the abort event on the provided signal.

    Listening to the abort event on abort signals is unsafe and may lead to resource leaks since another third party with the signal can call e.stopImmediatePropagation(). Unfortunately Node.js cannot change this since it would violate the web standard. Additionally, the original API makes it easy to forget to remove listeners.

    This API allows safely using AbortSignals in Node.js APIs by solving these two issues by listening to the event such that stopImmediatePropagation does not prevent the listener from running.

    Returns a disposable so that it may be unsubscribed from more easily.

    import { addAbortListener } from 'node:events';

    function example(signal) {
      let disposable;
      try {
        signal.addEventListener('abort', (e) => e.stopImmediatePropagation());
        disposable = addAbortListener(signal, (e) => {
          // Do something when signal is aborted.
        });
      } finally {
        disposable?.[Symbol.dispose]();
      }
    }

    Parameters

    • signal: AbortSignal
    • resource: ((event) => void)
        • (event): void
        • Parameters

          • event: Event

          Returns void

    Returns Disposable

    Disposable that removes the abort listener.

    Since

    v20.5.0

  • A utility method for creating Readable Streams out of iterators.

    Parameters

    • iterable: Iterable<any> | AsyncIterable<any>
    • Optional options: ReadableOptions

    Returns Readable

  • Experimental

    A utility method for creating a Readable from a web ReadableStream.

    Parameters

    • readableStream: ReadableStream<any>
    • Optional options: Pick<ReadableOptions, "objectMode" | "highWaterMark" | "signal" | "encoding">

    Returns Readable

    Since

    v17.0.0

  • Returns a copy of the array of listeners for the event named eventName.

    For EventEmitters this behaves exactly the same as calling .listeners on the emitter.

    For EventTargets this is the only way to get the event listeners for the event target. This is useful for debugging and diagnostic purposes.

    import { getEventListeners, EventEmitter } from 'node:events';

    {
      const ee = new EventEmitter();
      const listener = () => console.log('Events are fun');
      ee.on('foo', listener);
      console.log(getEventListeners(ee, 'foo')); // [ [Function: listener] ]
    }
    {
      const et = new EventTarget();
      const listener = () => console.log('Events are fun');
      et.addEventListener('foo', listener);
      console.log(getEventListeners(et, 'foo')); // [ [Function: listener] ]
    }

    Parameters

    • emitter: EventEmitter | _DOMEventTarget
    • name: string | symbol

    Returns Function[]

    Since

    v15.2.0, v14.17.0

  • Returns the currently set max amount of listeners.

    For EventEmitters this behaves exactly the same as calling .getMaxListeners on the emitter.

    For EventTargets this is the only way to get the max event listeners for the event target. If the number of event handlers on a single EventTarget exceeds the max set, the EventTarget will print a warning.

    import { getMaxListeners, setMaxListeners, EventEmitter } from 'node:events';

    {
      const ee = new EventEmitter();
      console.log(getMaxListeners(ee)); // 10
      setMaxListeners(11, ee);
      console.log(getMaxListeners(ee)); // 11
    }
    {
      const et = new EventTarget();
      console.log(getMaxListeners(et)); // 10
      setMaxListeners(11, et);
      console.log(getMaxListeners(et)); // 11
    }

    Parameters

    • emitter: EventEmitter | _DOMEventTarget

    Returns number

    Since

    v19.9.0

  • Returns whether the stream has been read from or cancelled.

    Parameters

    • stream: Readable | ReadableStream

    Returns boolean

    Since

    v16.8.0

  • A class method that returns the number of listeners for the given eventName registered on the given emitter.

    import { EventEmitter, listenerCount } from 'node:events';

    const myEmitter = new EventEmitter();
    myEmitter.on('event', () => {});
    myEmitter.on('event', () => {});
    console.log(listenerCount(myEmitter, 'event'));
    // Prints: 2

    Parameters

    • emitter: EventEmitter

      The emitter to query

    • eventName: string | symbol

      The event name

    Returns number

    Since

    v0.9.12

    Deprecated

    Since v3.2.0 - Use listenerCount instead.

  • import { on, EventEmitter } from 'node:events';
    import process from 'node:process';

    const ee = new EventEmitter();

    // Emit later on
    process.nextTick(() => {
      ee.emit('foo', 'bar');
      ee.emit('foo', 42);
    });

    for await (const event of on(ee, 'foo')) {
      // The execution of this inner block is synchronous and it
      // processes one event at a time (even with await). Do not use
      // if concurrent execution is required.
      console.log(event); // prints ['bar'] [42]
    }
    // Unreachable here

    Returns an AsyncIterator that iterates eventName events. It will throw if the EventEmitter emits 'error'. It removes all listeners when exiting the loop. The value returned by each iteration is an array composed of the emitted event arguments.

    An AbortSignal can be used to cancel waiting on events:

    import { on, EventEmitter } from 'node:events';
    import process from 'node:process';

    const ac = new AbortController();

    (async () => {
      const ee = new EventEmitter();

      // Emit later on
      process.nextTick(() => {
        ee.emit('foo', 'bar');
        ee.emit('foo', 42);
      });

      for await (const event of on(ee, 'foo', { signal: ac.signal })) {
        // The execution of this inner block is synchronous and it
        // processes one event at a time (even with await). Do not use
        // if concurrent execution is required.
        console.log(event); // prints ['bar'] [42]
      }
      // Unreachable here
    })();

    process.nextTick(() => ac.abort());

    Parameters

    • emitter: EventEmitter
    • eventName: string

      The name of the event being listened for

    • Optional options: StaticEventEmitterOptions

    Returns AsyncIterableIterator<any>

    that iterates eventName events emitted by the emitter

    Since

    v13.6.0, v12.16.0

  • Creates a Promise that is fulfilled when the EventEmitter emits the given event or that is rejected if the EventEmitter emits 'error' while waiting. The Promise will resolve with an array of all the arguments emitted to the given event.

    This method is intentionally generic and works with the web platform EventTarget interface, which has no special 'error' event semantics and does not listen to the 'error' event.

    import { once, EventEmitter } from 'node:events';
    import process from 'node:process';

    const ee = new EventEmitter();

    process.nextTick(() => {
      ee.emit('myevent', 42);
    });

    const [value] = await once(ee, 'myevent');
    console.log(value);

    const err = new Error('kaboom');
    process.nextTick(() => {
      ee.emit('error', err);
    });

    try {
      await once(ee, 'myevent');
    } catch (err) {
      console.error('error happened', err);
    }

    The special handling of the 'error' event is only used when events.once() is used to wait for another event. If events.once() is used to wait for the 'error' event itself, then it is treated as any other kind of event without special handling:

    import { EventEmitter, once } from 'node:events';

    const ee = new EventEmitter();

    once(ee, 'error')
      .then(([err]) => console.log('ok', err.message))
      .catch((err) => console.error('error', err.message));

    ee.emit('error', new Error('boom'));

    // Prints: ok boom

    An AbortSignal can be used to cancel waiting for the event:

    import { EventEmitter, once } from 'node:events';

    const ee = new EventEmitter();
    const ac = new AbortController();

    async function foo(emitter, event, signal) {
      try {
        await once(emitter, event, { signal });
        console.log('event emitted!');
      } catch (error) {
        if (error.name === 'AbortError') {
          console.error('Waiting for the event was canceled!');
        } else {
          console.error('There was an error', error.message);
        }
      }
    }

    foo(ee, 'foo', ac.signal);
    ac.abort(); // Abort waiting for the event
    ee.emit('foo'); // Prints: Waiting for the event was canceled!

    Parameters

    • emitter: _NodeEventTarget
    • eventName: string | symbol
    • Optional options: StaticEventEmitterOptions

    Returns Promise<any[]>

    Since

    v11.13.0, v10.16.0

  • Parameters

    • emitter: _DOMEventTarget
    • eventName: string
    • Optional options: StaticEventEmitterOptions

    Returns Promise<any[]>

  • import { setMaxListeners, EventEmitter } from 'node:events';

    const target = new EventTarget();
    const emitter = new EventEmitter();

    setMaxListeners(5, target, emitter);

    Parameters

    • Optional n: number

      A non-negative number. The maximum number of listeners per EventTarget event.

    • Rest ...eventTargets: (EventEmitter | _DOMEventTarget)[]

    Returns void

    Since

    v15.4.0

  • Experimental

    A utility method for creating a web ReadableStream from a Readable.

    Parameters

    • streamReadable: Readable

    Returns ReadableStream<any>

    Since

    v17.0.0

Generated using TypeDoc