refactor: move brother_node development artifact to dev/test-nodes subdirectory
Development Artifact Cleanup:

✅ BROTHER_NODE REORGANIZATION: Moved the development test node to an appropriate location
- dev/test-nodes/brother_node/: moved from the root directory for better organization
- Contains development configuration, test logs, and test chain data
- No impact on production systems - purely a development/testing artifact

✅ DEVELOPMENT ARTIFACTS IDENTIFIED:
- Chain ID: aitbc-brother-chain (test/development chain)
- Ports: 8010 (P2P) and 8011 (RPC) - different from production
- Environment: .env file with test configuration
- Logs: rpc.log and node.log from a development testing session (March 15, 2026)

✅ ROOT DIRECTORY CLEANUP: Removed development clutter from the production directory
- brother_node/ moved to dev/test-nodes/brother_node/
- The root directory now contains only production-ready components
- Development artifacts properly organized in the dev/ subdirectory

DIRECTORY STRUCTURE IMPROVEMENT:
📁 dev/test-nodes/: development and testing node configurations
🏗️ Root directory: clean production structure with only essential components
🧪 Development isolation: test environments separated from production

BENEFITS:
✅ Clean production directory: no development artifacts in root
✅ Better organization: development nodes grouped in the dev/ subdirectory
✅ Clear separation: production and development environments clearly distinguished
✅ Maintainability: easier to identify and manage development components

RESULT: Successfully moved the brother_node development artifact to the dev/test-nodes/ subdirectory, cleaning up the root directory while preserving the development testing environment for future use.
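The reorganization itself is a single directory rename. The sketch below recreates it on a scratch directory; the file names (`.env`, `rpc.log`, `node.log`) come from the artifact list above, and in the actual repository the rename would be done with `git mv` so history is preserved.

```shell
# Recreate the pre-move layout in a scratch directory.
repo=$(mktemp -d)
mkdir -p "$repo/brother_node" "$repo/dev/test-nodes"
printf 'CHAIN_ID=aitbc-brother-chain\n' > "$repo/brother_node/.env"
touch "$repo/brother_node/rpc.log" "$repo/brother_node/node.log"

# The reorganization: move the whole node directory under dev/test-nodes/.
mv "$repo/brother_node" "$repo/dev/test-nodes/brother_node"

# The root directory no longer contains the development artifact.
ls "$repo"
```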
193
dev/env/node_modules/@streamparser/json-node/README.md
generated
vendored
Executable file
@@ -0,0 +1,193 @@
# @streamparser/json-node

[![npm version][npm-version-badge]][npm-badge-url]
[![npm monthly downloads][npm-downloads-badge]][npm-badge-url]
[![Build Status][build-status-badge]][build-status-url]
[![Coverage Status][coverage-status-badge]][coverage-status-url]

Fast dependency-free library to parse a JSON stream using utf-8 encoding in Node.js, Deno or any modern browser. Fully compliant with the JSON spec and `JSON.parse(...)`.

*tldr;*

```javascript
import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser();

inputStream.pipe(parser).pipe(destinationStream);

// Or using events to get the values

parser.on("data", (value) => { /* ... */ });
parser.on("error", err => { /* ... */ });
parser.on("end", () => { /* ... */ });
```

## @streamparser/json ecosystem

There are multiple flavours of @streamparser:

* The **[@streamparser/json](https://www.npmjs.com/package/@streamparser/json)** package lets you parse any JSON string or stream using pure JavaScript.
* The **[@streamparser/json-whatwg](https://www.npmjs.com/package/@streamparser/json-whatwg)** wraps `@streamparser/json` into a WHATWG TransformStream.
* The **[@streamparser/json-node](https://www.npmjs.com/package/@streamparser/json-node)** wraps `@streamparser/json` into a Node Transform stream.

## Components

### Tokenizer

A JSON-compliant tokenizer that parses a utf-8 stream into JSON tokens, which are emitted as objects.

```javascript
import { Tokenizer } from '@streamparser/json-node';

const tokenizer = new Tokenizer(opts, transformOpts);
```

Transform options take the standard Node Transform stream settings (see the [Node docs](https://nodejs.org/api/stream.html#class-streamtransform)).

The available options are:

```javascript
{
  stringBufferSize: <number>, // set to 0 to disable buffering. Min valid value is 4.
  numberBufferSize: <number>, // set to 0 to disable buffering.
  separator: <string>, // separator between objects. For example `\n` for NDJSON.
  emitPartialTokens: <boolean> // whether to emit tokens mid-parsing.
}
```

If buffer sizes are set to anything other than zero, instead of using a string to append the data as it comes in, the data is buffered using a TypedArray. A reasonable size could be `64 * 1024` (64 KB).

#### Buffering

When parsing strings or numbers, the parser needs to gather the data in-memory until the whole value is ready.

Strings are immutable in JavaScript, so every string operation creates a new string. The V8 engine, behind Node, Deno and most modern browsers, performs many different kinds of optimizations. One of these optimizations is to over-allocate memory when it detects many string concatenations. This significantly increases memory consumption and can easily exhaust your memory when parsing JSON containing very large strings or numbers. For those cases, the parser can buffer the characters using a TypedArray. This requires encoding/decoding from/to the buffer into an actual string once the value is ready, which is done using the `TextEncoder` and `TextDecoder` APIs. Unfortunately, these APIs add significant overhead when the strings are small, so they should be used only when strictly necessary.
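The trade-off described above can be illustrated with plain Node globals. This is a sketch of the buffering strategy the `stringBufferSize` option enables, not the library's internal code: bytes accumulate in a `Uint8Array` and are decoded once at the end, instead of growing a string chunk by chunk.

```javascript
// Incoming utf-8 chunks, as a stream might deliver them.
const chunks = ['{"msg":"he', 'llo wor', 'ld"}'];

// Strategy 1: string concatenation. Each += creates a new string,
// which V8 may over-allocate for when it detects the pattern.
let viaString = '';
for (const c of chunks) viaString += c;

// Strategy 2: TypedArray buffering. Encode each chunk into a fixed
// buffer and decode a single time once the value is complete.
const encoder = new TextEncoder();
const buf = new Uint8Array(64 * 1024); // e.g. a 64 KB buffer
let len = 0;
for (const c of chunks) {
  const bytes = encoder.encode(c);
  buf.set(bytes, len);
  len += bytes.length;
}
const viaBuffer = new TextDecoder().decode(buf.subarray(0, len));

console.log(viaString === viaBuffer); // both strategies yield the same text
```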
### TokenParser

A token parser that processes JSON tokens as emitted by the `Tokenizer` and emits JSON values/objects.

```javascript
import { TokenParser } from '@streamparser/json-node';

const tokenParser = new TokenParser(opts, transformOpts);
```

Transform options take the standard Node Transform stream settings (see the [Node docs](https://nodejs.org/api/stream.html#class-streamtransform)).

The available options are:

```javascript
{
  paths: <string[]>,
  keepStack: <boolean>, // whether to keep all the properties in the stack
  separator: <string>, // separator between objects. For example `\n` for NDJSON. If left undefined, the token parser will end after parsing the first object. To parse multiple objects without any delimiter, set it to the empty string `''`.
  emitPartialValues: <boolean>, // whether to emit values mid-parsing.
}
```

* paths: Array of paths to emit. Defaults to `undefined`, which emits everything. The paths are intended to support JSONPath, although at the moment only the root object selector (`$`) and subproperty selectors, including wildcards (`$.a`, `$.*`, `$.a.b`, `$.*.b`, etc.), are supported.
* keepStack: Whether to keep full objects on the stack even if they won't be emitted. Defaults to `true`. When set to `false`, properties are not preserved in the parent object when an ancestor will be emitted. This means that the parent object passed to the `onValue` function may be empty, which doesn't reflect the actual data, but is more memory-efficient.

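For illustration only, the selector subset documented in the paths bullet can be sketched as a small matcher. `matchesPath` is a hypothetical helper written for this explanation, not part of the library's API: it handles the root selector `$` and dot-separated child segments where `*` matches any key.

```javascript
// Hypothetical sketch of the documented selector subset:
// '$' matches the root, '$.a' a child, '$.*' any child, '$.*.b' etc.
function matchesPath(selector, pathSegments) {
  const parts = selector.split('.');   // e.g. '$.*.b' -> ['$', '*', 'b']
  if (parts[0] !== '$') return false;  // every selector starts at the root
  const rest = parts.slice(1);
  if (rest.length !== pathSegments.length) return false;
  // '*' is a wildcard segment; anything else must match the key exactly.
  return rest.every((p, i) => p === '*' || p === String(pathSegments[i]));
}

console.log(matchesPath('$', []));           // root object
console.log(matchesPath('$.*', ['a']));      // any direct child
console.log(matchesPath('$.a.b', ['a', 'b'])); // a nested property
```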
### JSONParser

The full-blown JSON parser. It simply chains a `Tokenizer` and a `TokenParser`.

```javascript
import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser();
```

## Usage

You can use both components independently:

```javascript
const tokenizer = new Tokenizer(opts);
const tokenParser = new TokenParser();
const jsonParser = tokenizer.pipe(tokenParser);
```

You can subscribe to the resulting data using standard stream events:

```javascript
import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser({ stringBufferSize: undefined, paths: ['$'] });

inputStream.pipe(parser).pipe(destinationStream);

// Or using events to get the values

parser.on("data", (value) => { /* ... */ });
parser.on("error", err => { /* ... */ });
parser.on("end", () => { /* ... */ });
```

## Examples

### Stream-parsing a fetch request returning a JSON stream

Imagine an endpoint that sends a large number of JSON objects one after the other (`{"id":1}{"id":2}{"id":3}...`).

```js
import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser();

const response = await fetch('http://example.com/');
const reader = response.body.pipe(parser);
reader.on('data', value => { /* process element */ });
```

### Stream-parsing a fetch request returning a JSON array

Imagine an endpoint that sends a large JSON array (`[{"id":1},{"id":2},{"id":3},...]`).

```js
import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser({ stringBufferSize: undefined, paths: ['$.*'], keepStack: false });

const response = await fetch('http://example.com/');

const reader = response.body.pipe(parser);

reader.on('data', ({ value, key, parent, stack }) => { /* process element */ });
```

### Stream-parsing a fetch request returning a very long string, getting previews of the string

Imagine an endpoint that sends a very long JSON string (`"Once upon a midnight <...>"`).

```js
import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser({ emitPartialValues: true, stringBufferSize: undefined, paths: ['$'], keepStack: false });

const response = await fetch('http://example.com/');

const reader = response.body.pipe(parser);

reader.on('data', ({ value, key, parent, stack, partial }) => {
  if (partial) {
    console.log(`Parsing value: ${value}... (still parsing)`);
  } else {
    console.log(`Value parsed: ${value}`);
  }
});
```

## License

See [LICENSE.md](../../LICENSE).

[npm-version-badge]: https://badge.fury.io/js/@streamparser%2Fjson-node.svg
[npm-badge-url]: https://www.npmjs.com/package/@streamparser/json-node
[npm-downloads-badge]: https://img.shields.io/npm/dm/@streamparser%2Fjson-node.svg
[build-status-badge]: https://github.com/juanjoDiaz/streamparser-json/actions/workflows/on-push.yaml/badge.svg
[build-status-url]: https://github.com/juanjoDiaz/streamparser-json/actions/workflows/on-push.yaml
[coverage-status-badge]: https://coveralls.io/repos/github/juanjoDiaz/streamparser-json/badge.svg?branch=main
[coverage-status-url]: https://coveralls.io/github/juanjoDiaz/streamparser-json?branch=main
5
dev/env/node_modules/@streamparser/json-node/dist/cjs/index.d.ts
generated
vendored
Executable file
@@ -0,0 +1,5 @@
export { default as JSONParser } from "./jsonparser.js";
export { default as Tokenizer } from "./tokenizer.js";
export { default as TokenParser } from "./tokenparser.js";
export { utf8, JsonTypes, type ParsedTokenInfo, type ParsedElementInfo, TokenParserMode, type StackElement, TokenType, } from "@streamparser/json";
//# sourceMappingURL=index.d.ts.map
1
dev/env/node_modules/@streamparser/json-node/dist/cjs/index.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"index.d.ts","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,OAAO,IAAI,UAAU,EAAE,MAAM,iBAAiB,CAAC;AACxD,OAAO,EAAE,OAAO,IAAI,SAAS,EAAE,MAAM,gBAAgB,CAAC;AACtD,OAAO,EAAE,OAAO,IAAI,WAAW,EAAE,MAAM,kBAAkB,CAAC;AAE1D,OAAO,EACL,IAAI,EACJ,SAAS,EACT,KAAK,eAAe,EACpB,KAAK,iBAAiB,EACtB,eAAe,EACf,KAAK,YAAY,EACjB,SAAS,GACV,MAAM,oBAAoB,CAAC"}
18
dev/env/node_modules/@streamparser/json-node/dist/cjs/index.js
generated
vendored
Executable file
@@ -0,0 +1,18 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.TokenType = exports.TokenParserMode = exports.JsonTypes = exports.utf8 = exports.TokenParser = exports.Tokenizer = exports.JSONParser = void 0;
var jsonparser_js_1 = require("./jsonparser.js");
Object.defineProperty(exports, "JSONParser", { enumerable: true, get: function () { return __importDefault(jsonparser_js_1).default; } });
var tokenizer_js_1 = require("./tokenizer.js");
Object.defineProperty(exports, "Tokenizer", { enumerable: true, get: function () { return __importDefault(tokenizer_js_1).default; } });
var tokenparser_js_1 = require("./tokenparser.js");
Object.defineProperty(exports, "TokenParser", { enumerable: true, get: function () { return __importDefault(tokenparser_js_1).default; } });
var json_1 = require("@streamparser/json");
Object.defineProperty(exports, "utf8", { enumerable: true, get: function () { return json_1.utf8; } });
Object.defineProperty(exports, "JsonTypes", { enumerable: true, get: function () { return json_1.JsonTypes; } });
Object.defineProperty(exports, "TokenParserMode", { enumerable: true, get: function () { return json_1.TokenParserMode; } });
Object.defineProperty(exports, "TokenType", { enumerable: true, get: function () { return json_1.TokenType; } });
//# sourceMappingURL=index.js.map
1
dev/env/node_modules/@streamparser/json-node/dist/cjs/index.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":";;;;;;AAAA,iDAAwD;AAA/C,4HAAA,OAAO,OAAc;AAC9B,+CAAsD;AAA7C,0HAAA,OAAO,OAAa;AAC7B,mDAA0D;AAAjD,8HAAA,OAAO,OAAe;AAE/B,2CAQ4B;AAP1B,4FAAA,IAAI,OAAA;AACJ,iGAAA,SAAS,OAAA;AAGT,uGAAA,eAAe,OAAA;AAEf,iGAAA,SAAS,OAAA"}
18
dev/env/node_modules/@streamparser/json-node/dist/cjs/jsonparser.d.ts
generated
vendored
Executable file
@@ -0,0 +1,18 @@
/// <reference types="node" resolution-mode="require"/>
/// <reference types="node" resolution-mode="require"/>
import { Transform, type TransformOptions, type TransformCallback } from "stream";
import { type JSONParserOptions } from "@streamparser/json";
export default class JSONParserTransform extends Transform {
    private jsonParser;
    constructor(opts?: JSONParserOptions, transformOpts?: Omit<TransformOptions, "readableObjectMode" | "writableObjectMode">);
    /**
     * Main function that sends data to the parser to be processed.
     *
     * @param {Buffer} chunk Incoming data
     * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
     * @param {Function} done Called when the processing of the supplied chunk is done
     */
    _transform(chunk: any, encoding: BufferEncoding, done: TransformCallback): void;
    _final(callback: (error?: Error | null) => void): void;
}
//# sourceMappingURL=jsonparser.d.ts.map
1
dev/env/node_modules/@streamparser/json-node/dist/cjs/jsonparser.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"jsonparser.d.ts","sourceRoot":"","sources":["../../src/jsonparser.ts"],"names":[],"mappings":";;AAAA,OAAO,EACL,SAAS,EACT,KAAK,gBAAgB,EACrB,KAAK,iBAAiB,EACvB,MAAM,QAAQ,CAAC;AAChB,OAAO,EAAc,KAAK,iBAAiB,EAAE,MAAM,oBAAoB,CAAC;AAExE,MAAM,CAAC,OAAO,OAAO,mBAAoB,SAAQ,SAAS;IACxD,OAAO,CAAC,UAAU,CAAa;gBAG7B,IAAI,GAAE,iBAAsB,EAC5B,aAAa,GAAE,IAAI,CACjB,gBAAgB,EAChB,oBAAoB,GAAG,oBAAoB,CACvC;IAkBR;;;;;;OAMG;IACM,UAAU,CAEjB,KAAK,EAAE,GAAG,EACV,QAAQ,EAAE,cAAc,EACxB,IAAI,EAAE,iBAAiB,GACtB,IAAI;IASE,MAAM,CAAC,QAAQ,EAAE,CAAC,KAAK,CAAC,EAAE,KAAK,GAAG,IAAI,KAAK,IAAI,GAAG,IAAI;CAQhE"}
48
dev/env/node_modules/@streamparser/json-node/dist/cjs/jsonparser.js
generated
vendored
Executable file
@@ -0,0 +1,48 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
const stream_1 = require("stream");
const json_1 = require("@streamparser/json");
class JSONParserTransform extends stream_1.Transform {
    constructor(opts = {}, transformOpts = {}) {
        super(Object.assign(Object.assign({}, transformOpts), { writableObjectMode: false, readableObjectMode: true }));
        this.jsonParser = new json_1.JSONParser(opts);
        this.jsonParser.onValue = (value) => this.push(value);
        this.jsonParser.onError = (err) => {
            throw err;
        };
        this.jsonParser.onEnd = () => {
            if (!this.writableEnded)
                this.end();
        };
    }
    /**
     * Main function that sends data to the parser to be processed.
     *
     * @param {Buffer} chunk Incoming data
     * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
     * @param {Function} done Called when the processing of the supplied chunk is done
     */
    _transform(
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    chunk, encoding, done) {
        try {
            this.jsonParser.write(chunk);
            done();
        }
        catch (err) {
            done(err);
        }
    }
    _final(callback) {
        try {
            if (!this.jsonParser.isEnded)
                this.jsonParser.end();
            callback();
        }
        catch (err) {
            callback(err);
        }
    }
}
exports.default = JSONParserTransform;
//# sourceMappingURL=jsonparser.js.map
1
dev/env/node_modules/@streamparser/json-node/dist/cjs/jsonparser.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"jsonparser.js","sourceRoot":"","sources":["../../src/jsonparser.ts"],"names":[],"mappings":";;AAAA,mCAIgB;AAChB,6CAAwE;AAExE,MAAqB,mBAAoB,SAAQ,kBAAS;IAGxD,YACE,OAA0B,EAAE,EAC5B,gBAGI,EAAE;QAEN,KAAK,iCACA,aAAa,KAChB,kBAAkB,EAAE,KAAK,EACzB,kBAAkB,EAAE,IAAI,IACxB,CAAC;QACH,IAAI,CAAC,UAAU,GAAG,IAAI,iBAAU,CAAC,IAAI,CAAC,CAAC;QAEvC,IAAI,CAAC,UAAU,CAAC,OAAO,GAAG,CAAC,KAAK,EAAE,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;QACtD,IAAI,CAAC,UAAU,CAAC,OAAO,GAAG,CAAC,GAAG,EAAE,EAAE;YAChC,MAAM,GAAG,CAAC;QACZ,CAAC,CAAC;QACF,IAAI,CAAC,UAAU,CAAC,KAAK,GAAG,GAAG,EAAE;YAC3B,IAAI,CAAC,IAAI,CAAC,aAAa;gBAAE,IAAI,CAAC,GAAG,EAAE,CAAC;QACtC,CAAC,CAAC;IACJ,CAAC;IAED;;;;;;OAMG;IACM,UAAU;IACjB,8DAA8D;IAC9D,KAAU,EACV,QAAwB,EACxB,IAAuB;QAEvB,IAAI;YACF,IAAI,CAAC,UAAU,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC;YAC7B,IAAI,EAAE,CAAC;SACR;QAAC,OAAO,GAAY,EAAE;YACrB,IAAI,CAAC,GAAY,CAAC,CAAC;SACpB;IACH,CAAC;IAEQ,MAAM,CAAC,QAAwC;QACtD,IAAI;YACF,IAAI,CAAC,IAAI,CAAC,UAAU,CAAC,OAAO;gBAAE,IAAI,CAAC,UAAU,CAAC,GAAG,EAAE,CAAC;YACpD,QAAQ,EAAE,CAAC;SACZ;QAAC,OAAO,GAAY,EAAE;YACrB,QAAQ,CAAC,GAAY,CAAC,CAAC;SACxB;IACH,CAAC;CACF;AAvDD,sCAuDC"}
3
dev/env/node_modules/@streamparser/json-node/dist/cjs/package.json
generated
vendored
Executable file
@@ -0,0 +1,3 @@
{
  "type": "commonjs"
}
18
dev/env/node_modules/@streamparser/json-node/dist/cjs/tokenizer.d.ts
generated
vendored
Executable file
@@ -0,0 +1,18 @@
/// <reference types="node" resolution-mode="require"/>
/// <reference types="node" resolution-mode="require"/>
import { Transform, type TransformOptions, type TransformCallback } from "stream";
import { type TokenizerOptions } from "@streamparser/json/tokenizer.js";
export default class TokenizerTransform extends Transform {
    private tokenizer;
    constructor(opts?: TokenizerOptions, transformOpts?: Omit<TransformOptions, "readableObjectMode" | "writableObjectMode">);
    /**
     * Main function that sends data to the parser to be processed.
     *
     * @param {Buffer} chunk Incoming data
     * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
     * @param {Function} done Called when the processing of the supplied chunk is done
     */
    _transform(chunk: any, encoding: BufferEncoding, done: TransformCallback): void;
    _final(callback: (error?: Error | null) => void): void;
}
//# sourceMappingURL=tokenizer.d.ts.map
1
dev/env/node_modules/@streamparser/json-node/dist/cjs/tokenizer.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenizer.d.ts","sourceRoot":"","sources":["../../src/tokenizer.ts"],"names":[],"mappings":";;AAAA,OAAO,EACL,SAAS,EACT,KAAK,gBAAgB,EACrB,KAAK,iBAAiB,EACvB,MAAM,QAAQ,CAAC;AAChB,OAAkB,EAChB,KAAK,gBAAgB,EACtB,MAAM,iCAAiC,CAAC;AAEzC,MAAM,CAAC,OAAO,OAAO,kBAAmB,SAAQ,SAAS;IACvD,OAAO,CAAC,SAAS,CAAY;gBAG3B,IAAI,GAAE,gBAAqB,EAC3B,aAAa,GAAE,IAAI,CACjB,gBAAgB,EAChB,oBAAoB,GAAG,oBAAoB,CACvC;IAkBR;;;;;;OAMG;IACM,UAAU,CAEjB,KAAK,EAAE,GAAG,EACV,QAAQ,EAAE,cAAc,EACxB,IAAI,EAAE,iBAAiB,GACtB,IAAI;IASE,MAAM,CAAC,QAAQ,EAAE,CAAC,KAAK,CAAC,EAAE,KAAK,GAAG,IAAI,KAAK,IAAI,GAAG,IAAI;CAQhE"}
51
dev/env/node_modules/@streamparser/json-node/dist/cjs/tokenizer.js
generated
vendored
Executable file
@@ -0,0 +1,51 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
const stream_1 = require("stream");
const tokenizer_js_1 = __importDefault(require("@streamparser/json/tokenizer.js"));
class TokenizerTransform extends stream_1.Transform {
    constructor(opts = {}, transformOpts = {}) {
        super(Object.assign(Object.assign({}, transformOpts), { writableObjectMode: true, readableObjectMode: true }));
        this.tokenizer = new tokenizer_js_1.default(opts);
        this.tokenizer.onToken = (parsedTokenInfo) => this.push(parsedTokenInfo);
        this.tokenizer.onError = (err) => {
            throw err;
        };
        this.tokenizer.onEnd = () => {
            if (!this.writableEnded)
                this.end();
        };
    }
    /**
     * Main function that sends data to the parser to be processed.
     *
     * @param {Buffer} chunk Incoming data
     * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
     * @param {Function} done Called when the processing of the supplied chunk is done
     */
    _transform(
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    chunk, encoding, done) {
        try {
            this.tokenizer.write(chunk);
            done();
        }
        catch (err) {
            done(err);
        }
    }
    _final(callback) {
        try {
            if (!this.tokenizer.isEnded)
                this.tokenizer.end();
            callback();
        }
        catch (err) {
            callback(err);
        }
    }
}
exports.default = TokenizerTransform;
//# sourceMappingURL=tokenizer.js.map
1
dev/env/node_modules/@streamparser/json-node/dist/cjs/tokenizer.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenizer.js","sourceRoot":"","sources":["../../src/tokenizer.ts"],"names":[],"mappings":";;;;;AAAA,mCAIgB;AAChB,mFAEyC;AAEzC,MAAqB,kBAAmB,SAAQ,kBAAS;IAGvD,YACE,OAAyB,EAAE,EAC3B,gBAGI,EAAE;QAEN,KAAK,iCACA,aAAa,KAChB,kBAAkB,EAAE,IAAI,EACxB,kBAAkB,EAAE,IAAI,IACxB,CAAC;QACH,IAAI,CAAC,SAAS,GAAG,IAAI,sBAAS,CAAC,IAAI,CAAC,CAAC;QAErC,IAAI,CAAC,SAAS,CAAC,OAAO,GAAG,CAAC,eAAe,EAAE,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC;QACzE,IAAI,CAAC,SAAS,CAAC,OAAO,GAAG,CAAC,GAAG,EAAE,EAAE;YAC/B,MAAM,GAAG,CAAC;QACZ,CAAC,CAAC;QACF,IAAI,CAAC,SAAS,CAAC,KAAK,GAAG,GAAG,EAAE;YAC1B,IAAI,CAAC,IAAI,CAAC,aAAa;gBAAE,IAAI,CAAC,GAAG,EAAE,CAAC;QACtC,CAAC,CAAC;IACJ,CAAC;IAED;;;;;;OAMG;IACM,UAAU;IACjB,8DAA8D;IAC9D,KAAU,EACV,QAAwB,EACxB,IAAuB;QAEvB,IAAI;YACF,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC;YAC5B,IAAI,EAAE,CAAC;SACR;QAAC,OAAO,GAAY,EAAE;YACrB,IAAI,CAAC,GAAY,CAAC,CAAC;SACpB;IACH,CAAC;IAEQ,MAAM,CAAC,QAAwC;QACtD,IAAI;YACF,IAAI,CAAC,IAAI,CAAC,SAAS,CAAC,OAAO;gBAAE,IAAI,CAAC,SAAS,CAAC,GAAG,EAAE,CAAC;YAClD,QAAQ,EAAE,CAAC;SACZ;QAAC,OAAO,GAAY,EAAE;YACrB,QAAQ,CAAC,GAAY,CAAC,CAAC;SACxB;IACH,CAAC;CACF;AAvDD,qCAuDC"}
18
dev/env/node_modules/@streamparser/json-node/dist/cjs/tokenparser.d.ts
generated
vendored
Executable file
@@ -0,0 +1,18 @@
/// <reference types="node" resolution-mode="require"/>
/// <reference types="node" resolution-mode="require"/>
import { Transform, type TransformOptions, type TransformCallback } from "stream";
import { type TokenParserOptions } from "@streamparser/json";
export default class TokenParserTransform extends Transform {
    private tokenParser;
    constructor(opts?: TokenParserOptions, transformOpts?: Omit<TransformOptions, "readableObjectMode" | "writableObjectMode">);
    /**
     * Main function that sends data to the parser to be processed.
     *
     * @param {Buffer} chunk Incoming data
     * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
     * @param {Function} done Called when the processing of the supplied chunk is done
     */
    _transform(chunk: any, encoding: BufferEncoding, done: TransformCallback): void;
    _final(callback: (error?: Error | null) => void): void;
}
//# sourceMappingURL=tokenparser.d.ts.map
1
dev/env/node_modules/@streamparser/json-node/dist/cjs/tokenparser.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenparser.d.ts","sourceRoot":"","sources":["../../src/tokenparser.ts"],"names":[],"mappings":";;AAAA,OAAO,EACL,SAAS,EACT,KAAK,gBAAgB,EACrB,KAAK,iBAAiB,EACvB,MAAM,QAAQ,CAAC;AAChB,OAAO,EAAe,KAAK,kBAAkB,EAAE,MAAM,oBAAoB,CAAC;AAE1E,MAAM,CAAC,OAAO,OAAO,oBAAqB,SAAQ,SAAS;IACzD,OAAO,CAAC,WAAW,CAAc;gBAG/B,IAAI,GAAE,kBAAuB,EAC7B,aAAa,GAAE,IAAI,CACjB,gBAAgB,EAChB,oBAAoB,GAAG,oBAAoB,CACvC;IAkBR;;;;;;OAMG;IACM,UAAU,CAEjB,KAAK,EAAE,GAAG,EACV,QAAQ,EAAE,cAAc,EACxB,IAAI,EAAE,iBAAiB,GACtB,IAAI;IASE,MAAM,CAAC,QAAQ,EAAE,CAAC,KAAK,CAAC,EAAE,KAAK,GAAG,IAAI,KAAK,IAAI,GAAG,IAAI;CAQhE"}
48
dev/env/node_modules/@streamparser/json-node/dist/cjs/tokenparser.js
generated
vendored
Executable file
@@ -0,0 +1,48 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
const stream_1 = require("stream");
const json_1 = require("@streamparser/json");
class TokenParserTransform extends stream_1.Transform {
    constructor(opts = {}, transformOpts = {}) {
        super(Object.assign(Object.assign({}, transformOpts), { writableObjectMode: true, readableObjectMode: true }));
        this.tokenParser = new json_1.TokenParser(opts);
        this.tokenParser.onValue = (parsedTokenInfo) => this.push(parsedTokenInfo);
        this.tokenParser.onError = (err) => {
            throw err;
        };
        this.tokenParser.onEnd = () => {
            if (!this.writableEnded)
                this.end();
        };
    }
    /**
     * Main function that sends data to the parser to be processed.
     *
     * @param {Buffer} chunk Incoming data
     * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
     * @param {Function} done Called when the processing of the supplied chunk is done
     */
    _transform(
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    chunk, encoding, done) {
        try {
            this.tokenParser.write(chunk);
            done();
        }
        catch (err) {
            done(err);
        }
    }
    _final(callback) {
        try {
            if (!this.tokenParser.isEnded)
                this.tokenParser.end();
            callback();
        }
        catch (err) {
            callback(err);
        }
    }
}
exports.default = TokenParserTransform;
//# sourceMappingURL=tokenparser.js.map
1
dev/env/node_modules/@streamparser/json-node/dist/cjs/tokenparser.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenparser.js","sourceRoot":"","sources":["../../src/tokenparser.ts"],"names":[],"mappings":";;AAAA,mCAIgB;AAChB,6CAA0E;AAE1E,MAAqB,oBAAqB,SAAQ,kBAAS;IAGzD,YACE,OAA2B,EAAE,EAC7B,gBAGI,EAAE;QAEN,KAAK,iCACA,aAAa,KAChB,kBAAkB,EAAE,IAAI,EACxB,kBAAkB,EAAE,IAAI,IACxB,CAAC;QACH,IAAI,CAAC,WAAW,GAAG,IAAI,kBAAW,CAAC,IAAI,CAAC,CAAC;QAEzC,IAAI,CAAC,WAAW,CAAC,OAAO,GAAG,CAAC,eAAe,EAAE,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC;QAC3E,IAAI,CAAC,WAAW,CAAC,OAAO,GAAG,CAAC,GAAG,EAAE,EAAE;YACjC,MAAM,GAAG,CAAC;QACZ,CAAC,CAAC;QACF,IAAI,CAAC,WAAW,CAAC,KAAK,GAAG,GAAG,EAAE;YAC5B,IAAI,CAAC,IAAI,CAAC,aAAa;gBAAE,IAAI,CAAC,GAAG,EAAE,CAAC;QACtC,CAAC,CAAC;IACJ,CAAC;IAED;;;;;;OAMG;IACM,UAAU;IACjB,8DAA8D;IAC9D,KAAU,EACV,QAAwB,EACxB,IAAuB;QAEvB,IAAI;YACF,IAAI,CAAC,WAAW,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC;YAC9B,IAAI,EAAE,CAAC;SACR;QAAC,OAAO,GAAY,EAAE;YACrB,IAAI,CAAC,GAAY,CAAC,CAAC;SACpB;IACH,CAAC;IAEQ,MAAM,CAAC,QAAwC;QACtD,IAAI;YACF,IAAI,CAAC,IAAI,CAAC,WAAW,CAAC,OAAO;gBAAE,IAAI,CAAC,WAAW,CAAC,GAAG,EAAE,CAAC;YACtD,QAAQ,EAAE,CAAC;SACZ;QAAC,OAAO,GAAY,EAAE;YACrB,QAAQ,CAAC,GAAY,CAAC,CAAC;SACxB;IACH,CAAC;CACF;AAvDD,uCAuDC"}
193
dev/env/node_modules/@streamparser/json-node/dist/deno/README.md
generated
vendored
Executable file
@@ -0,0 +1,193 @@
|
||||
# @streamparser/json-node

[![npm version][npm-version-badge]][npm-badge-url]
[![npm monthly downloads][npm-downloads-badge]][npm-badge-url]
[![Build Status][build-status-badge]][build-status-url]
[![Coverage Status][coverage-status-badge]][coverage-status-url]

Fast dependency-free library to parse a JSON stream using utf-8 encoding in Node.js, Deno or any modern browser. Fully compliant with the JSON spec and `JSON.parse(...)`.

*tldr;*

```javascript
import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser();

inputStream.pipe(parser).pipe(destinationStream);

// Or using events to get the values

parser.on("data", (value) => { /* ... */ });
parser.on("error", err => { /* ... */ });
parser.on("end", () => { /* ... */ });
```

## @streamparser/json ecosystem

There are multiple flavours of @streamparser:

* The **[@streamparser/json](https://www.npmjs.com/package/@streamparser/json)** package allows you to parse any JSON string or stream using pure JavaScript.
* The **[@streamparser/json-whatwg](https://www.npmjs.com/package/@streamparser/json-whatwg)** wraps `@streamparser/json` in a WHATWG TransformStream.
* The **[@streamparser/json-node](https://www.npmjs.com/package/@streamparser/json-node)** wraps `@streamparser/json` in a Node Transform stream.

## Components

### Tokenizer

A JSON compliant tokenizer that parses a utf-8 stream into JSON tokens that are emitted as objects.

```javascript
import { Tokenizer } from '@streamparser/json-node';

const tokenizer = new Tokenizer(opts, transformOpts);
```

Transform options take the standard node Transform stream settings (see [Node docs](https://nodejs.org/api/stream.html#class-streamtransform)).

The available options are:

```javascript
{
  stringBufferSize: <number>, // set to 0 to disable buffering. Min valid value is 4.
  numberBufferSize: <number>, // set to 0 to disable buffering.
  separator: <string>, // separator between objects. For example `\n` for NDJSON.
  emitPartialTokens: <boolean> // whether to emit tokens mid-parsing.
}
```

If buffer sizes are set to anything other than zero, instead of using a string to append the data as it comes in, the data is buffered using a TypedArray. A reasonable size could be `64 * 1024` (64 KB).

#### Buffering

When parsing strings or numbers, the parser needs to gather the data in-memory until the whole value is ready.

Strings are immutable in JavaScript, so every string operation creates a new string. The V8 engine, behind Node, Deno and most modern browsers, performs many different types of optimization. One of these optimizations is to over-allocate memory when it detects many string concatenations. This significantly increases the memory consumption and can easily exhaust your memory when parsing JSON containing very large strings or numbers. For those cases, the parser can buffer the characters using a TypedArray. This requires encoding/decoding from/to the buffer into an actual string once the value is ready. This is done using the `TextEncoder` and `TextDecoder` APIs. Unfortunately, these APIs create a significant overhead when the strings are small, so they should be used only when strictly necessary.

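The encode/decode round-trip described above can be sketched with the standard `TextEncoder`/`TextDecoder` APIs. This is a minimal illustration of the buffering idea, not the library's internal code; the chunk values and buffer size are made up:

```javascript
// Collect incoming utf-8 chunks into a pre-allocated TypedArray and decode
// once at the end, instead of concatenating immutable JS strings per chunk.
const encoder = new TextEncoder();
const decoder = new TextDecoder();

const chunks = ['Once upon ', 'a midnight ', 'dreary'];
const buffer = new Uint8Array(64); // stand-in for stringBufferSize
let offset = 0;

for (const chunk of chunks) {
  const bytes = encoder.encode(chunk);
  buffer.set(bytes, offset);
  offset += bytes.length;
}

const value = decoder.decode(buffer.subarray(0, offset));
console.log(value); // "Once upon a midnight dreary"
```

The TypedArray is written in place, so no intermediate strings are allocated; the only string materializes in the final `decode` call.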
### TokenParser

A token parser that processes JSON tokens as emitted by the `Tokenizer` and emits JSON values/objects.

```javascript
import { TokenParser } from '@streamparser/json-node';

const tokenParser = new TokenParser(opts, transformOpts);
```

Transform options take the standard node Transform stream settings (see [Node docs](https://nodejs.org/api/stream.html#class-streamtransform)).

The available options are:

```javascript
{
  paths: <string[]>,
  keepStack: <boolean>, // whether to keep all the properties in the stack
  separator: <string>, // separator between objects. For example `\n` for NDJSON. If left empty or set to undefined, the token parser will end after parsing the first object. To parse multiple objects without any delimiter, just set it to the empty string `''`.
  emitPartialValues: <boolean>, // whether to emit values mid-parsing.
}
```

* paths: Array of paths to emit. Defaults to `undefined`, which emits everything. The paths are intended to support JSONPath, although at the time being only the root object selector (`$`) and subproperty selectors including wildcards are supported (`$.a`, `$.*`, `$.a.b`, `$.*.b`, etc.).
* keepStack: Whether to keep full objects on the stack even if they won't be emitted. Defaults to `true`. When set to `false`, properties are not preserved in the parent object unless some ancestor will be emitted. This means that the parent object passed to the `onValue` function may be empty, which doesn't reflect the actual data, but is more memory-efficient.

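To make the selector semantics concrete, here is a tiny self-contained sketch of how a path like `$.*.b` could be matched against a stack of keys. `matchesPath` is a hypothetical helper for illustration, not the library's implementation:

```javascript
// Hypothetical helper: checks a parsed-value key stack such as ['a', 'b']
// against a selector like '$.*.b' ('*' matches any single key at that depth).
function matchesPath(selector, keyStack) {
  const parts = selector.split('.').slice(1); // drop the '$' root
  if (parts.length !== keyStack.length) return false;
  return parts.every((p, i) => p === '*' || p === String(keyStack[i]));
}

console.log(matchesPath('$.*.b', ['a', 'b'])); // true
console.log(matchesPath('$.a', ['a']));        // true
console.log(matchesPath('$.a', ['a', 'b']));   // false: value is nested deeper
```

Note that `$` alone (an empty key stack) matches only the root value, which is why `paths: ['$']` emits the whole document once.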
### JSONParser

The full blown JSON parser. It basically chains a `Tokenizer` and a `TokenParser`.

```javascript
import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser();
```

## Usage

You can use both components independently:

```javascript
const tokenizer = new Tokenizer(opts);
const tokenParser = new TokenParser();
const jsonParser = tokenizer.pipe(tokenParser);
```

You can subscribe to the resulting data using standard Node stream events:

```javascript
import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser({ stringBufferSize: undefined, paths: ['$'] });

inputStream.pipe(parser).pipe(destinationStream);

// Or using events to get the values

parser.on("data", (value) => { /* ... */ });
parser.on("error", err => { /* ... */ });
parser.on("end", () => { /* ... */ });
```

## Examples

### Stream-parsing a fetch request returning a JSON stream

Imagine an endpoint that sends a large number of JSON objects one after the other (`{"id":1}{"id":2}{"id":3}...`).

```js
import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser();

const response = await fetch('http://example.com/');
const reader = response.body.pipe(parser);
reader.on('data', value => /* process element */);
```

### Stream-parsing a fetch request returning a JSON array

Imagine an endpoint that sends a large array of JSON objects (`[{"id":1},{"id":2},{"id":3},...]`).

```js
import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser({ stringBufferSize: undefined, paths: ['$.*'], keepStack: false });

const response = await fetch('http://example.com/');

const reader = response.body.pipe(parser);

reader.on('data', ({ value, key, parent, stack }) => /* process element */)
```

### Stream-parsing a fetch request returning a very long string and getting previews of the string

Imagine an endpoint that sends a very long JSON string (`"Once upon a midnight <...>"`).

```js
import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser({ stringBufferSize: undefined, paths: ['$'], emitPartialValues: true });

const response = await fetch('http://example.com/');

const reader = response.body.pipe(parser);

reader.on('data', ({ value, key, parent, stack, partial }) => {
  if (partial) {
    console.log(`Parsing value: ${value}... (still parsing)`);
  } else {
    console.log(`Value parsed: ${value}`);
  }
});
```

## License

See [LICENSE.md](../../LICENSE).

[npm-version-badge]: https://badge.fury.io/js/@streamparser%2Fjson-node.svg
[npm-badge-url]: https://www.npmjs.com/package/@streamparser/json-node
[npm-downloads-badge]: https://img.shields.io/npm/dm/@streamparser%2Fjson-node.svg
[build-status-badge]: https://github.com/juanjoDiaz/streamparser-json/actions/workflows/on-push.yaml/badge.svg
[build-status-url]: https://github.com/juanjoDiaz/streamparser-json/actions/workflows/on-push.yaml
[coverage-status-badge]: https://coveralls.io/repos/github/juanjoDiaz/streamparser-json/badge.svg?branch=main
[coverage-status-url]: https://coveralls.io/github/juanjoDiaz/streamparser-json?branch=main
13
dev/env/node_modules/@streamparser/json-node/dist/deno/index.ts
generated
vendored
Executable file
@@ -0,0 +1,13 @@
export { default as JSONParser } from "./jsonparser.ts";
export { default as Tokenizer } from "./tokenizer.ts";
export { default as TokenParser } from "./tokenparser.ts";

export {
  utf8,
  JsonTypes,
  type ParsedTokenInfo,
  type ParsedElementInfo,
  TokenParserMode,
  type StackElement,
  TokenType,
} from "https://deno.land/x/streamparser_json@v0.0.22/index.ts";
63
dev/env/node_modules/@streamparser/json-node/dist/deno/jsonparser.ts
generated
vendored
Executable file
@@ -0,0 +1,63 @@
import {
  Transform,
  type TransformOptions,
  type TransformCallback,
} from "stream";
import { JSONParser, type JSONParserOptions } from "https://deno.land/x/streamparser_json@v0.0.22/index.ts";

export default class JSONParserTransform extends Transform {
  private jsonParser: JSONParser;

  constructor(
    opts: JSONParserOptions = {},
    transformOpts: Omit<
      TransformOptions,
      "readableObjectMode" | "writableObjectMode"
    > = {},
  ) {
    super({
      ...transformOpts,
      writableObjectMode: false,
      readableObjectMode: true,
    });
    this.jsonParser = new JSONParser(opts);

    this.jsonParser.onValue = (value) => this.push(value);
    this.jsonParser.onError = (err) => {
      throw err;
    };
    this.jsonParser.onEnd = () => {
      if (!this.writableEnded) this.end();
    };
  }

  /**
   * Main function that sends data to the parser to be processed.
   *
   * @param {Buffer} chunk Incoming data
   * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
   * @param {Function} done Called when the processing of the supplied chunk is done
   */
  override _transform(
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    chunk: any,
    encoding: BufferEncoding,
    done: TransformCallback,
  ): void {
    try {
      this.jsonParser.write(chunk);
      done();
    } catch (err: unknown) {
      done(err as Error);
    }
  }

  override _final(callback: (error?: Error | null) => void): void {
    try {
      if (!this.jsonParser.isEnded) this.jsonParser.end();
      callback();
    } catch (err: unknown) {
      callback(err as Error);
    }
  }
}
65
dev/env/node_modules/@streamparser/json-node/dist/deno/tokenizer.ts
generated
vendored
Executable file
@@ -0,0 +1,65 @@
import {
  Transform,
  type TransformOptions,
  type TransformCallback,
} from "stream";
import Tokenizer, {
  type TokenizerOptions,
} from "https://deno.land/x/streamparser_json@v0.0.22/tokenizer.ts";

export default class TokenizerTransform extends Transform {
  private tokenizer: Tokenizer;

  constructor(
    opts: TokenizerOptions = {},
    transformOpts: Omit<
      TransformOptions,
      "readableObjectMode" | "writableObjectMode"
    > = {},
  ) {
    super({
      ...transformOpts,
      writableObjectMode: true,
      readableObjectMode: true,
    });
    this.tokenizer = new Tokenizer(opts);

    this.tokenizer.onToken = (parsedTokenInfo) => this.push(parsedTokenInfo);
    this.tokenizer.onError = (err) => {
      throw err;
    };
    this.tokenizer.onEnd = () => {
      if (!this.writableEnded) this.end();
    };
  }

  /**
   * Main function that sends data to the parser to be processed.
   *
   * @param {Buffer} chunk Incoming data
   * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
   * @param {Function} done Called when the processing of the supplied chunk is done
   */
  override _transform(
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    chunk: any,
    encoding: BufferEncoding,
    done: TransformCallback,
  ): void {
    try {
      this.tokenizer.write(chunk);
      done();
    } catch (err: unknown) {
      done(err as Error);
    }
  }

  override _final(callback: (error?: Error | null) => void): void {
    try {
      if (!this.tokenizer.isEnded) this.tokenizer.end();
      callback();
    } catch (err: unknown) {
      callback(err as Error);
    }
  }
}
63
dev/env/node_modules/@streamparser/json-node/dist/deno/tokenparser.ts
generated
vendored
Executable file
@@ -0,0 +1,63 @@
import {
  Transform,
  type TransformOptions,
  type TransformCallback,
} from "stream";
import { TokenParser, type TokenParserOptions } from "https://deno.land/x/streamparser_json@v0.0.22/index.ts";

export default class TokenParserTransform extends Transform {
  private tokenParser: TokenParser;

  constructor(
    opts: TokenParserOptions = {},
    transformOpts: Omit<
      TransformOptions,
      "readableObjectMode" | "writableObjectMode"
    > = {},
  ) {
    super({
      ...transformOpts,
      writableObjectMode: true,
      readableObjectMode: true,
    });
    this.tokenParser = new TokenParser(opts);

    this.tokenParser.onValue = (parsedTokenInfo) => this.push(parsedTokenInfo);
    this.tokenParser.onError = (err) => {
      throw err;
    };
    this.tokenParser.onEnd = () => {
      if (!this.writableEnded) this.end();
    };
  }

  /**
   * Main function that sends data to the parser to be processed.
   *
   * @param {Buffer} chunk Incoming data
   * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
   * @param {Function} done Called when the processing of the supplied chunk is done
   */
  override _transform(
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    chunk: any,
    encoding: BufferEncoding,
    done: TransformCallback,
  ): void {
    try {
      this.tokenParser.write(chunk);
      done();
    } catch (err: unknown) {
      done(err as Error);
    }
  }

  override _final(callback: (error?: Error | null) => void): void {
    try {
      if (!this.tokenParser.isEnded) this.tokenParser.end();
      callback();
    } catch (err: unknown) {
      callback(err as Error);
    }
  }
}
16
dev/env/node_modules/@streamparser/json-node/dist/deno/utils.ts
generated
vendored
Executable file
@@ -0,0 +1,16 @@
import type { ParsedElementInfo } from "https://deno.land/x/streamparser_json@v0.0.17/utils/types/parsedElementInfo.ts";

export function cloneParsedElementInfo(
  parsedElementInfo: ParsedElementInfo,
): ParsedElementInfo {
  const { value, key, parent, stack } = parsedElementInfo;
  return { value, key, parent: clone(parent), stack: clone(stack) };
}

function clone<T>(obj: T): T {
  // Only objects are passed by reference and must be cloned
  if (typeof obj !== "object") return obj;
  // Solve arrays with empty positions
  if (Array.isArray(obj) && obj.filter((i) => i).length === 0) return obj;
  return JSON.parse(JSON.stringify(obj));
}
5
dev/env/node_modules/@streamparser/json-node/dist/mjs/index.d.ts
generated
vendored
Executable file
@@ -0,0 +1,5 @@
export { default as JSONParser } from "./jsonparser.js";
export { default as Tokenizer } from "./tokenizer.js";
export { default as TokenParser } from "./tokenparser.js";
export { utf8, JsonTypes, type ParsedTokenInfo, type ParsedElementInfo, TokenParserMode, type StackElement, TokenType, } from "@streamparser/json";
//# sourceMappingURL=index.d.ts.map
1
dev/env/node_modules/@streamparser/json-node/dist/mjs/index.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"index.d.ts","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,OAAO,IAAI,UAAU,EAAE,MAAM,iBAAiB,CAAC;AACxD,OAAO,EAAE,OAAO,IAAI,SAAS,EAAE,MAAM,gBAAgB,CAAC;AACtD,OAAO,EAAE,OAAO,IAAI,WAAW,EAAE,MAAM,kBAAkB,CAAC;AAE1D,OAAO,EACL,IAAI,EACJ,SAAS,EACT,KAAK,eAAe,EACpB,KAAK,iBAAiB,EACtB,eAAe,EACf,KAAK,YAAY,EACjB,SAAS,GACV,MAAM,oBAAoB,CAAC"}
5
dev/env/node_modules/@streamparser/json-node/dist/mjs/index.js
generated
vendored
Executable file
@@ -0,0 +1,5 @@
export { default as JSONParser } from "./jsonparser.js";
export { default as Tokenizer } from "./tokenizer.js";
export { default as TokenParser } from "./tokenparser.js";
export { utf8, JsonTypes, TokenParserMode, TokenType, } from "@streamparser/json";
//# sourceMappingURL=index.js.map
1
dev/env/node_modules/@streamparser/json-node/dist/mjs/index.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,OAAO,IAAI,UAAU,EAAE,MAAM,iBAAiB,CAAC;AACxD,OAAO,EAAE,OAAO,IAAI,SAAS,EAAE,MAAM,gBAAgB,CAAC;AACtD,OAAO,EAAE,OAAO,IAAI,WAAW,EAAE,MAAM,kBAAkB,CAAC;AAE1D,OAAO,EACL,IAAI,EACJ,SAAS,EAGT,eAAe,EAEf,SAAS,GACV,MAAM,oBAAoB,CAAC"}
18
dev/env/node_modules/@streamparser/json-node/dist/mjs/jsonparser.d.ts
generated
vendored
Executable file
@@ -0,0 +1,18 @@
/// <reference types="node" resolution-mode="require"/>
/// <reference types="node" resolution-mode="require"/>
import { Transform, type TransformOptions, type TransformCallback } from "stream";
import { type JSONParserOptions } from "@streamparser/json";
export default class JSONParserTransform extends Transform {
    private jsonParser;
    constructor(opts?: JSONParserOptions, transformOpts?: Omit<TransformOptions, "readableObjectMode" | "writableObjectMode">);
    /**
     * Main function that sends data to the parser to be processed.
     *
     * @param {Buffer} chunk Incoming data
     * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
     * @param {Function} done Called when the processing of the supplied chunk is done
     */
    _transform(chunk: any, encoding: BufferEncoding, done: TransformCallback): void;
    _final(callback: (error?: Error | null) => void): void;
}
//# sourceMappingURL=jsonparser.d.ts.map
1
dev/env/node_modules/@streamparser/json-node/dist/mjs/jsonparser.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"jsonparser.d.ts","sourceRoot":"","sources":["../../src/jsonparser.ts"],"names":[],"mappings":";;AAAA,OAAO,EACL,SAAS,EACT,KAAK,gBAAgB,EACrB,KAAK,iBAAiB,EACvB,MAAM,QAAQ,CAAC;AAChB,OAAO,EAAc,KAAK,iBAAiB,EAAE,MAAM,oBAAoB,CAAC;AAExE,MAAM,CAAC,OAAO,OAAO,mBAAoB,SAAQ,SAAS;IACxD,OAAO,CAAC,UAAU,CAAa;gBAG7B,IAAI,GAAE,iBAAsB,EAC5B,aAAa,GAAE,IAAI,CACjB,gBAAgB,EAChB,oBAAoB,GAAG,oBAAoB,CACvC;IAkBR;;;;;;OAMG;IACM,UAAU,CAEjB,KAAK,EAAE,GAAG,EACV,QAAQ,EAAE,cAAc,EACxB,IAAI,EAAE,iBAAiB,GACtB,IAAI;IASE,MAAM,CAAC,QAAQ,EAAE,CAAC,KAAK,CAAC,EAAE,KAAK,GAAG,IAAI,KAAK,IAAI,GAAG,IAAI;CAQhE"}
45
dev/env/node_modules/@streamparser/json-node/dist/mjs/jsonparser.js
generated
vendored
Executable file
@@ -0,0 +1,45 @@
import { Transform, } from "stream";
import { JSONParser } from "@streamparser/json";
export default class JSONParserTransform extends Transform {
    constructor(opts = {}, transformOpts = {}) {
        super(Object.assign(Object.assign({}, transformOpts), { writableObjectMode: false, readableObjectMode: true }));
        this.jsonParser = new JSONParser(opts);
        this.jsonParser.onValue = (value) => this.push(value);
        this.jsonParser.onError = (err) => {
            throw err;
        };
        this.jsonParser.onEnd = () => {
            if (!this.writableEnded)
                this.end();
        };
    }
    /**
     * Main function that sends data to the parser to be processed.
     *
     * @param {Buffer} chunk Incoming data
     * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
     * @param {Function} done Called when the processing of the supplied chunk is done
     */
    _transform(
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    chunk, encoding, done) {
        try {
            this.jsonParser.write(chunk);
            done();
        }
        catch (err) {
            done(err);
        }
    }
    _final(callback) {
        try {
            if (!this.jsonParser.isEnded)
                this.jsonParser.end();
            callback();
        }
        catch (err) {
            callback(err);
        }
    }
}
//# sourceMappingURL=jsonparser.js.map
1
dev/env/node_modules/@streamparser/json-node/dist/mjs/jsonparser.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"jsonparser.js","sourceRoot":"","sources":["../../src/jsonparser.ts"],"names":[],"mappings":"AAAA,OAAO,EACL,SAAS,GAGV,MAAM,QAAQ,CAAC;AAChB,OAAO,EAAE,UAAU,EAA0B,MAAM,oBAAoB,CAAC;AAExE,MAAM,CAAC,OAAO,OAAO,mBAAoB,SAAQ,SAAS;IAGxD,YACE,OAA0B,EAAE,EAC5B,gBAGI,EAAE;QAEN,KAAK,iCACA,aAAa,KAChB,kBAAkB,EAAE,KAAK,EACzB,kBAAkB,EAAE,IAAI,IACxB,CAAC;QACH,IAAI,CAAC,UAAU,GAAG,IAAI,UAAU,CAAC,IAAI,CAAC,CAAC;QAEvC,IAAI,CAAC,UAAU,CAAC,OAAO,GAAG,CAAC,KAAK,EAAE,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;QACtD,IAAI,CAAC,UAAU,CAAC,OAAO,GAAG,CAAC,GAAG,EAAE,EAAE;YAChC,MAAM,GAAG,CAAC;QACZ,CAAC,CAAC;QACF,IAAI,CAAC,UAAU,CAAC,KAAK,GAAG,GAAG,EAAE;YAC3B,IAAI,CAAC,IAAI,CAAC,aAAa;gBAAE,IAAI,CAAC,GAAG,EAAE,CAAC;QACtC,CAAC,CAAC;IACJ,CAAC;IAED;;;;;;OAMG;IACM,UAAU;IACjB,8DAA8D;IAC9D,KAAU,EACV,QAAwB,EACxB,IAAuB;QAEvB,IAAI;YACF,IAAI,CAAC,UAAU,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC;YAC7B,IAAI,EAAE,CAAC;SACR;QAAC,OAAO,GAAY,EAAE;YACrB,IAAI,CAAC,GAAY,CAAC,CAAC;SACpB;IACH,CAAC;IAEQ,MAAM,CAAC,QAAwC;QACtD,IAAI;YACF,IAAI,CAAC,IAAI,CAAC,UAAU,CAAC,OAAO;gBAAE,IAAI,CAAC,UAAU,CAAC,GAAG,EAAE,CAAC;YACpD,QAAQ,EAAE,CAAC;SACZ;QAAC,OAAO,GAAY,EAAE;YACrB,QAAQ,CAAC,GAAY,CAAC,CAAC;SACxB;IACH,CAAC;CACF"}
18
dev/env/node_modules/@streamparser/json-node/dist/mjs/tokenizer.d.ts
generated
vendored
Executable file
@@ -0,0 +1,18 @@
/// <reference types="node" resolution-mode="require"/>
/// <reference types="node" resolution-mode="require"/>
import { Transform, type TransformOptions, type TransformCallback } from "stream";
import { type TokenizerOptions } from "@streamparser/json/tokenizer.js";
export default class TokenizerTransform extends Transform {
    private tokenizer;
    constructor(opts?: TokenizerOptions, transformOpts?: Omit<TransformOptions, "readableObjectMode" | "writableObjectMode">);
    /**
     * Main function that sends data to the parser to be processed.
     *
     * @param {Buffer} chunk Incoming data
     * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
     * @param {Function} done Called when the processing of the supplied chunk is done
     */
    _transform(chunk: any, encoding: BufferEncoding, done: TransformCallback): void;
    _final(callback: (error?: Error | null) => void): void;
}
//# sourceMappingURL=tokenizer.d.ts.map
1
dev/env/node_modules/@streamparser/json-node/dist/mjs/tokenizer.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenizer.d.ts","sourceRoot":"","sources":["../../src/tokenizer.ts"],"names":[],"mappings":";;AAAA,OAAO,EACL,SAAS,EACT,KAAK,gBAAgB,EACrB,KAAK,iBAAiB,EACvB,MAAM,QAAQ,CAAC;AAChB,OAAkB,EAChB,KAAK,gBAAgB,EACtB,MAAM,iCAAiC,CAAC;AAEzC,MAAM,CAAC,OAAO,OAAO,kBAAmB,SAAQ,SAAS;IACvD,OAAO,CAAC,SAAS,CAAY;gBAG3B,IAAI,GAAE,gBAAqB,EAC3B,aAAa,GAAE,IAAI,CACjB,gBAAgB,EAChB,oBAAoB,GAAG,oBAAoB,CACvC;IAkBR;;;;;;OAMG;IACM,UAAU,CAEjB,KAAK,EAAE,GAAG,EACV,QAAQ,EAAE,cAAc,EACxB,IAAI,EAAE,iBAAiB,GACtB,IAAI;IASE,MAAM,CAAC,QAAQ,EAAE,CAAC,KAAK,CAAC,EAAE,KAAK,GAAG,IAAI,KAAK,IAAI,GAAG,IAAI;CAQhE"}
45
dev/env/node_modules/@streamparser/json-node/dist/mjs/tokenizer.js
generated
vendored
Executable file
@@ -0,0 +1,45 @@
import { Transform, } from "stream";
import Tokenizer, {} from "@streamparser/json/tokenizer.js";
export default class TokenizerTransform extends Transform {
    constructor(opts = {}, transformOpts = {}) {
        super(Object.assign(Object.assign({}, transformOpts), { writableObjectMode: true, readableObjectMode: true }));
        this.tokenizer = new Tokenizer(opts);
        this.tokenizer.onToken = (parsedTokenInfo) => this.push(parsedTokenInfo);
        this.tokenizer.onError = (err) => {
            throw err;
        };
        this.tokenizer.onEnd = () => {
            if (!this.writableEnded)
                this.end();
        };
    }
    /**
     * Main function that sends data to the parser to be processed.
     *
     * @param {Buffer} chunk Incoming data
     * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
     * @param {Function} done Called when the processing of the supplied chunk is done
     */
    _transform(
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    chunk, encoding, done) {
        try {
            this.tokenizer.write(chunk);
            done();
        }
        catch (err) {
            done(err);
        }
    }
    _final(callback) {
        try {
            if (!this.tokenizer.isEnded)
                this.tokenizer.end();
            callback();
        }
        catch (err) {
            callback(err);
        }
    }
}
//# sourceMappingURL=tokenizer.js.map
1
dev/env/node_modules/@streamparser/json-node/dist/mjs/tokenizer.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenizer.js","sourceRoot":"","sources":["../../src/tokenizer.ts"],"names":[],"mappings":"AAAA,OAAO,EACL,SAAS,GAGV,MAAM,QAAQ,CAAC;AAChB,OAAO,SAAS,EAAE,EAEjB,MAAM,iCAAiC,CAAC;AAEzC,MAAM,CAAC,OAAO,OAAO,kBAAmB,SAAQ,SAAS;IAGvD,YACE,OAAyB,EAAE,EAC3B,gBAGI,EAAE;QAEN,KAAK,iCACA,aAAa,KAChB,kBAAkB,EAAE,IAAI,EACxB,kBAAkB,EAAE,IAAI,IACxB,CAAC;QACH,IAAI,CAAC,SAAS,GAAG,IAAI,SAAS,CAAC,IAAI,CAAC,CAAC;QAErC,IAAI,CAAC,SAAS,CAAC,OAAO,GAAG,CAAC,eAAe,EAAE,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC;QACzE,IAAI,CAAC,SAAS,CAAC,OAAO,GAAG,CAAC,GAAG,EAAE,EAAE;YAC/B,MAAM,GAAG,CAAC;QACZ,CAAC,CAAC;QACF,IAAI,CAAC,SAAS,CAAC,KAAK,GAAG,GAAG,EAAE;YAC1B,IAAI,CAAC,IAAI,CAAC,aAAa;gBAAE,IAAI,CAAC,GAAG,EAAE,CAAC;QACtC,CAAC,CAAC;IACJ,CAAC;IAED;;;;;;OAMG;IACM,UAAU;IACjB,8DAA8D;IAC9D,KAAU,EACV,QAAwB,EACxB,IAAuB;QAEvB,IAAI;YACF,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC;YAC5B,IAAI,EAAE,CAAC;SACR;QAAC,OAAO,GAAY,EAAE;YACrB,IAAI,CAAC,GAAY,CAAC,CAAC;SACpB;IACH,CAAC;IAEQ,MAAM,CAAC,QAAwC;QACtD,IAAI;YACF,IAAI,CAAC,IAAI,CAAC,SAAS,CAAC,OAAO;gBAAE,IAAI,CAAC,SAAS,CAAC,GAAG,EAAE,CAAC;YAClD,QAAQ,EAAE,CAAC;SACZ;QAAC,OAAO,GAAY,EAAE;YACrB,QAAQ,CAAC,GAAY,CAAC,CAAC;SACxB;IACH,CAAC;CACF"}
18
dev/env/node_modules/@streamparser/json-node/dist/mjs/tokenparser.d.ts
generated
vendored
Executable file
@@ -0,0 +1,18 @@
/// <reference types="node" resolution-mode="require"/>
/// <reference types="node" resolution-mode="require"/>
import { Transform, type TransformOptions, type TransformCallback } from "stream";
import { type TokenParserOptions } from "@streamparser/json";
export default class TokenParserTransform extends Transform {
    private tokenParser;
    constructor(opts?: TokenParserOptions, transformOpts?: Omit<TransformOptions, "readableObjectMode" | "writableObjectMode">);
    /**
     * Main function that sends data to the parser to be processed.
     *
     * @param {Buffer} chunk Incoming data
     * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
     * @param {Function} done Called when the processing of the supplied chunk is done
     */
    _transform(chunk: any, encoding: BufferEncoding, done: TransformCallback): void;
    _final(callback: (error?: Error | null) => void): void;
}
//# sourceMappingURL=tokenparser.d.ts.map
1
dev/env/node_modules/@streamparser/json-node/dist/mjs/tokenparser.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenparser.d.ts","sourceRoot":"","sources":["../../src/tokenparser.ts"],"names":[],"mappings":";;AAAA,OAAO,EACL,SAAS,EACT,KAAK,gBAAgB,EACrB,KAAK,iBAAiB,EACvB,MAAM,QAAQ,CAAC;AAChB,OAAO,EAAe,KAAK,kBAAkB,EAAE,MAAM,oBAAoB,CAAC;AAE1E,MAAM,CAAC,OAAO,OAAO,oBAAqB,SAAQ,SAAS;IACzD,OAAO,CAAC,WAAW,CAAc;gBAG/B,IAAI,GAAE,kBAAuB,EAC7B,aAAa,GAAE,IAAI,CACjB,gBAAgB,EAChB,oBAAoB,GAAG,oBAAoB,CACvC;IAkBR;;;;;;OAMG;IACM,UAAU,CAEjB,KAAK,EAAE,GAAG,EACV,QAAQ,EAAE,cAAc,EACxB,IAAI,EAAE,iBAAiB,GACtB,IAAI;IASE,MAAM,CAAC,QAAQ,EAAE,CAAC,KAAK,CAAC,EAAE,KAAK,GAAG,IAAI,KAAK,IAAI,GAAG,IAAI;CAQhE"}
45
dev/env/node_modules/@streamparser/json-node/dist/mjs/tokenparser.js
generated
vendored
Executable file
@@ -0,0 +1,45 @@
import { Transform, } from "stream";
import { TokenParser } from "@streamparser/json";
export default class TokenParserTransform extends Transform {
    constructor(opts = {}, transformOpts = {}) {
        super(Object.assign(Object.assign({}, transformOpts), { writableObjectMode: true, readableObjectMode: true }));
        this.tokenParser = new TokenParser(opts);
        this.tokenParser.onValue = (parsedTokenInfo) => this.push(parsedTokenInfo);
        this.tokenParser.onError = (err) => {
            throw err;
        };
        this.tokenParser.onEnd = () => {
            if (!this.writableEnded)
                this.end();
        };
    }
    /**
     * Main function that sends data to the parser to be processed.
     *
     * @param {Buffer} chunk Incoming data
     * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
     * @param {Function} done Called when the processing of the supplied chunk is done
     */
    _transform(
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    chunk, encoding, done) {
        try {
            this.tokenParser.write(chunk);
            done();
        }
        catch (err) {
            done(err);
        }
    }
    _final(callback) {
        try {
            if (!this.tokenParser.isEnded)
                this.tokenParser.end();
            callback();
        }
        catch (err) {
            callback(err);
        }
    }
}
//# sourceMappingURL=tokenparser.js.map
|
||||
1
dev/env/node_modules/@streamparser/json-node/dist/mjs/tokenparser.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenparser.js","sourceRoot":"","sources":["../../src/tokenparser.ts"],"names":[],"mappings":"AAAA,OAAO,EACL,SAAS,GAGV,MAAM,QAAQ,CAAC;AAChB,OAAO,EAAE,WAAW,EAA2B,MAAM,oBAAoB,CAAC;AAE1E,MAAM,CAAC,OAAO,OAAO,oBAAqB,SAAQ,SAAS;IAGzD,YACE,OAA2B,EAAE,EAC7B,gBAGI,EAAE;QAEN,KAAK,iCACA,aAAa,KAChB,kBAAkB,EAAE,IAAI,EACxB,kBAAkB,EAAE,IAAI,IACxB,CAAC;QACH,IAAI,CAAC,WAAW,GAAG,IAAI,WAAW,CAAC,IAAI,CAAC,CAAC;QAEzC,IAAI,CAAC,WAAW,CAAC,OAAO,GAAG,CAAC,eAAe,EAAE,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC;QAC3E,IAAI,CAAC,WAAW,CAAC,OAAO,GAAG,CAAC,GAAG,EAAE,EAAE;YACjC,MAAM,GAAG,CAAC;QACZ,CAAC,CAAC;QACF,IAAI,CAAC,WAAW,CAAC,KAAK,GAAG,GAAG,EAAE;YAC5B,IAAI,CAAC,IAAI,CAAC,aAAa;gBAAE,IAAI,CAAC,GAAG,EAAE,CAAC;QACtC,CAAC,CAAC;IACJ,CAAC;IAED;;;;;;OAMG;IACM,UAAU;IACjB,8DAA8D;IAC9D,KAAU,EACV,QAAwB,EACxB,IAAuB;QAEvB,IAAI;YACF,IAAI,CAAC,WAAW,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC;YAC9B,IAAI,EAAE,CAAC;SACR;QAAC,OAAO,GAAY,EAAE;YACrB,IAAI,CAAC,GAAY,CAAC,CAAC;SACpB;IACH,CAAC;IAEQ,MAAM,CAAC,QAAwC;QACtD,IAAI;YACF,IAAI,CAAC,IAAI,CAAC,WAAW,CAAC,OAAO;gBAAE,IAAI,CAAC,WAAW,CAAC,GAAG,EAAE,CAAC;YACtD,QAAQ,EAAE,CAAC;SACZ;QAAC,OAAO,GAAY,EAAE;YACrB,QAAQ,CAAC,GAAY,CAAC,CAAC;SACxB;IACH,CAAC;CACF"}
18
dev/env/node_modules/@streamparser/json-node/jest.config.ts
generated
vendored
Executable file
@@ -0,0 +1,18 @@
import type { JestConfigWithTsJest } from 'ts-jest';

const jestConfig: JestConfigWithTsJest = {
  extensionsToTreatAsEsm: ['.ts'],
  moduleNameMapper: { '^(\\.{1,2}/.*)\\.js$': '$1' },
  transform: {
    '^.+\\.m?[t]sx?$': ['ts-jest', {
      isolatedModules: true,
      useESM: true
    }],
  },
  testEnvironment: 'node',
  testMatch: ['<rootDir>/**/test/*.ts', '<rootDir>/**/test/types/*.ts'],
  collectCoverageFrom: ['src/**'],
  setupFilesAfterEnv: ['<rootDir>/test/utils/setup.ts']
}

export = jestConfig;
50
dev/env/node_modules/@streamparser/json-node/package.json
generated
vendored
Executable file
@@ -0,0 +1,50 @@
{
  "name": "@streamparser/json-node",
  "description": "Streaming JSON parser in Javascript for Node.js, Deno and the browser",
  "version": "0.0.22",
  "main": "./dist/mjs/index.js",
  "module": "./dist/mjs/index.js",
  "browser": "./dist/mjs/index.js",
  "types": "./dist/mjs/index.d.ts",
  "type": "module",
  "exports": {
    "./*": {
      "import": "./dist/mjs/*",
      "require": "./dist/cjs/*"
    },
    ".": {
      "import": "./dist/mjs/index.js",
      "require": "./dist/cjs/index.js"
    }
  },
  "author": "Juanjo Diaz <juanjo.diazmo@gmail.com>",
  "repository": {
    "type": "git",
    "url": "https://github.com/juanjoDiaz/streamparser-json.git"
  },
  "homepage": "https://github.com/juanjoDiaz/jsonparse2#readme",
  "bugs": "https://github.com/juanjoDiaz/streamparser-json/issues",
  "scripts": {
    "lint": "eslint src test --ext .js,.ts,.json",
    "lint:fix": "npm run lint -- --fix",
    "build:cjs": "tsc --module commonjs --verbatimModuleSyntax false --outDir ./dist/cjs && node ../../build-cjs.js whatwg",
    "build:mjs": "tsc --module esnext --verbatimModuleSyntax --outDir ./dist/mjs",
    "build:deno": "node ../../build.deno.js . ./dist/deno",
    "build": "npm run build:mjs && npm run build:cjs && npm run build:deno",
    "prepublishOnly": "npm run build",
    "pretest": "npm run build",
    "test": "node test/CommonJS.cjs && jest test",
    "test-with-coverage": "npm test -- --coverage"
  },
  "license": "MIT",
  "tags": [
    "json",
    "stream"
  ],
  "dependencies": {
    "@streamparser/json": "^0.0.22"
  },
  "devDependencies": {
    "@types/node": "^22.9.0"
  }
}
13
dev/env/node_modules/@streamparser/json-node/src/index.ts
generated
vendored
Executable file
@@ -0,0 +1,13 @@
export { default as JSONParser } from "./jsonparser.js";
export { default as Tokenizer } from "./tokenizer.js";
export { default as TokenParser } from "./tokenparser.js";

export {
  utf8,
  JsonTypes,
  type ParsedTokenInfo,
  type ParsedElementInfo,
  TokenParserMode,
  type StackElement,
  TokenType,
} from "@streamparser/json";
63
dev/env/node_modules/@streamparser/json-node/src/jsonparser.ts
generated
vendored
Executable file
@@ -0,0 +1,63 @@
import {
  Transform,
  type TransformOptions,
  type TransformCallback,
} from "stream";
import { JSONParser, type JSONParserOptions } from "@streamparser/json";

export default class JSONParserTransform extends Transform {
  private jsonParser: JSONParser;

  constructor(
    opts: JSONParserOptions = {},
    transformOpts: Omit<
      TransformOptions,
      "readableObjectMode" | "writableObjectMode"
    > = {},
  ) {
    super({
      ...transformOpts,
      writableObjectMode: false,
      readableObjectMode: true,
    });
    this.jsonParser = new JSONParser(opts);

    this.jsonParser.onValue = (value) => this.push(value);
    this.jsonParser.onError = (err) => {
      throw err;
    };
    this.jsonParser.onEnd = () => {
      if (!this.writableEnded) this.end();
    };
  }

  /**
   * Main function that sends data to the parser to be processed.
   *
   * @param {Buffer} chunk Incoming data
   * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
   * @param {Function} done Called when the processing of the supplied chunk is done
   */
  override _transform(
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    chunk: any,
    encoding: BufferEncoding,
    done: TransformCallback,
  ): void {
    try {
      this.jsonParser.write(chunk);
      done();
    } catch (err: unknown) {
      done(err as Error);
    }
  }

  override _final(callback: (error?: Error | null) => void): void {
    try {
      if (!this.jsonParser.isEnded) this.jsonParser.end();
      callback();
    } catch (err: unknown) {
      callback(err as Error);
    }
  }
}
65
dev/env/node_modules/@streamparser/json-node/src/tokenizer.ts
generated
vendored
Executable file
@@ -0,0 +1,65 @@
import {
  Transform,
  type TransformOptions,
  type TransformCallback,
} from "stream";
import Tokenizer, {
  type TokenizerOptions,
} from "@streamparser/json/tokenizer.js";

export default class TokenizerTransform extends Transform {
  private tokenizer: Tokenizer;

  constructor(
    opts: TokenizerOptions = {},
    transformOpts: Omit<
      TransformOptions,
      "readableObjectMode" | "writableObjectMode"
    > = {},
  ) {
    super({
      ...transformOpts,
      writableObjectMode: true,
      readableObjectMode: true,
    });
    this.tokenizer = new Tokenizer(opts);

    this.tokenizer.onToken = (parsedTokenInfo) => this.push(parsedTokenInfo);
    this.tokenizer.onError = (err) => {
      throw err;
    };
    this.tokenizer.onEnd = () => {
      if (!this.writableEnded) this.end();
    };
  }

  /**
   * Main function that sends data to the parser to be processed.
   *
   * @param {Buffer} chunk Incoming data
   * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
   * @param {Function} done Called when the processing of the supplied chunk is done
   */
  override _transform(
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    chunk: any,
    encoding: BufferEncoding,
    done: TransformCallback,
  ): void {
    try {
      this.tokenizer.write(chunk);
      done();
    } catch (err: unknown) {
      done(err as Error);
    }
  }

  override _final(callback: (error?: Error | null) => void): void {
    try {
      if (!this.tokenizer.isEnded) this.tokenizer.end();
      callback();
    } catch (err: unknown) {
      callback(err as Error);
    }
  }
}
63
dev/env/node_modules/@streamparser/json-node/src/tokenparser.ts
generated
vendored
Executable file
@@ -0,0 +1,63 @@
import {
  Transform,
  type TransformOptions,
  type TransformCallback,
} from "stream";
import { TokenParser, type TokenParserOptions } from "@streamparser/json";

export default class TokenParserTransform extends Transform {
  private tokenParser: TokenParser;

  constructor(
    opts: TokenParserOptions = {},
    transformOpts: Omit<
      TransformOptions,
      "readableObjectMode" | "writableObjectMode"
    > = {},
  ) {
    super({
      ...transformOpts,
      writableObjectMode: true,
      readableObjectMode: true,
    });
    this.tokenParser = new TokenParser(opts);

    this.tokenParser.onValue = (parsedTokenInfo) => this.push(parsedTokenInfo);
    this.tokenParser.onError = (err) => {
      throw err;
    };
    this.tokenParser.onEnd = () => {
      if (!this.writableEnded) this.end();
    };
  }

  /**
   * Main function that sends data to the parser to be processed.
   *
   * @param {Buffer} chunk Incoming data
   * @param {String} encoding Encoding of the incoming data. Defaults to 'utf8'
   * @param {Function} done Called when the processing of the supplied chunk is done
   */
  override _transform(
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    chunk: any,
    encoding: BufferEncoding,
    done: TransformCallback,
  ): void {
    try {
      this.tokenParser.write(chunk);
      done();
    } catch (err: unknown) {
      done(err as Error);
    }
  }

  override _final(callback: (error?: Error | null) => void): void {
    try {
      if (!this.tokenParser.isEnded) this.tokenParser.end();
      callback();
    } catch (err: unknown) {
      callback(err as Error);
    }
  }
}
1
dev/env/node_modules/@streamparser/json-node/test/CommonJS.cjs
generated
vendored
Executable file
@@ -0,0 +1 @@
require('@streamparser/json-whatwg');
12
dev/env/node_modules/@streamparser/json-node/test/bom.ts
generated
vendored
Executable file
@@ -0,0 +1,12 @@
import JSONParser from "../src/jsonparser.js";
import { runJSONParserTest } from "./utils/testRunner.js";

describe("BOM", () => {
  test("should support UTF-8 BOM", () => {
    runJSONParserTest(
      new JSONParser(),
      new Uint8Array([0xef, 0xbb, 0xbf, 0x31]),
      ({ value }) => expect(value).toBe(1),
    );
  });
});
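The BOM test above feeds the parser the three UTF-8 byte-order-mark bytes followed by the digit `1` and expects the parsed value `1`, i.e. the BOM must be skipped. The same bytes can be checked with Node's built-in `TextDecoder`, which strips a leading BOM by default (a sketch of the decoding step only, not of this package's internals):

```javascript
// The UTF-8 BOM (0xEF 0xBB 0xBF) followed by "1" (0x31) -- the exact
// input used in the bom.ts test above.
const bytes = new Uint8Array([0xef, 0xbb, 0xbf, 0x31]);

// TextDecoder removes a leading BOM unless { ignoreBOM: true } is set,
// so only the text "1" remains after decoding.
const text = new TextDecoder("utf-8").decode(bytes);
const value = JSON.parse(text);
```

A streaming parser has to implement the same skip itself, since the BOM bytes may even arrive in a separate chunk from the document body.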
647
dev/env/node_modules/@streamparser/json-node/test/emitPartial.ts
generated
vendored
Executable file
@@ -0,0 +1,647 @@
import TokenType from "@streamparser/json/utils/types/tokenType.js";
import JSONParser from "../src/jsonparser.js";
import Tokenizer from "../src/tokenizer.js";
import {
  TestData,
  runJSONParserTest,
  runTokenizerTest,
} from "./utils/testRunner.js";

describe("Emit Partial", () => {
  describe("Tokenizer emit partial tokens", () => {
    const emitPartialTokenTestData: TestData[] = [
      {
        value: ["tr", "ue"],
        expected: [
          { token: TokenType.TRUE, value: true, partial: true },
          { token: TokenType.TRUE, value: true, partial: false },
        ],
      },
      {
        value: ["t", "ru", "e"],
        expected: [
          { token: TokenType.TRUE, value: true, partial: true },
          { token: TokenType.TRUE, value: true, partial: true },
          { token: TokenType.TRUE, value: true, partial: false },
        ],
      },
      {
        value: ["f", "al", "se"],
        expected: [
          { token: TokenType.FALSE, value: false, partial: true },
          { token: TokenType.FALSE, value: false, partial: true },
          { token: TokenType.FALSE, value: false, partial: false },
        ],
      },
      {
        value: ["fal", "se"],
        expected: [
          { token: TokenType.FALSE, value: false, partial: true },
          { token: TokenType.FALSE, value: false, partial: false },
        ],
      },
      {
        value: ["0", ".", "123"],
        expected: [
          { token: TokenType.NUMBER, value: 0, partial: true },
          { token: TokenType.NUMBER, value: 0.123, partial: true },
          { token: TokenType.NUMBER, value: 0.123, partial: false },
        ],
      },
      {
        value: ["n", "u", "l", "l"],
        expected: [
          { token: TokenType.NULL, value: null, partial: true },
          { token: TokenType.NULL, value: null, partial: true },
          { token: TokenType.NULL, value: null, partial: true },
          { token: TokenType.NULL, value: null, partial: false },
        ],
      },
      {
        value: ["n", "u", "l", "l"],
        expected: [
          { token: TokenType.NULL, value: null, partial: true },
          { token: TokenType.NULL, value: null, partial: true },
          { token: TokenType.NULL, value: null, partial: true },
          { token: TokenType.NULL, value: null, partial: false },
        ],
      },
      {
        value: "{",
        expected: [{ token: TokenType.LEFT_BRACE, value: "{", partial: false }],
      },
      {
        value: ['{ "fo', "o", '"', ': "', '"'],
        expected: [
          { token: TokenType.LEFT_BRACE, value: "{", partial: false },
          { token: TokenType.STRING, value: "fo", partial: true },
          { token: TokenType.STRING, value: "foo", partial: true },
          { token: TokenType.STRING, value: "foo", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.STRING, value: "", partial: true },
          { token: TokenType.STRING, value: "", partial: false },
        ],
      },
      {
        value: ['{ "foo": "ba', "r", '"'],
        expected: [
          { token: TokenType.LEFT_BRACE, value: "{", partial: false },
          { token: TokenType.STRING, value: "foo", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.STRING, value: "ba", partial: true },
          { token: TokenType.STRING, value: "bar", partial: true },
          { token: TokenType.STRING, value: "bar", partial: false },
        ],
      },
      {
        value: ['{ "foo": "bar"', "}"],
        expected: [
          { token: TokenType.LEFT_BRACE, value: "{", partial: false },
          { token: TokenType.STRING, value: "foo", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.STRING, value: "bar", partial: false },
          { token: TokenType.RIGHT_BRACE, value: "}", partial: false },
        ],
      },
      {
        value: '{ "foo": "bar" }',
        expected: [
          { token: TokenType.LEFT_BRACE, value: "{", partial: false },
          { token: TokenType.STRING, value: "foo", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.STRING, value: "bar", partial: false },
          { token: TokenType.RIGHT_BRACE, value: "}", partial: false },
        ],
      },
      {
        value: [
          '{ "foo": "bar", "ba',
          "z",
          '": [',
          '{ "foo": "bar", "baz": [',
          '{ "foo": "bar", "baz": [1',
          "2",
          "3, ",
        ],
        expected: [
          { token: TokenType.LEFT_BRACE, value: "{", partial: false },
          { token: TokenType.STRING, value: "foo", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.STRING, value: "bar", partial: false },
          { token: TokenType.COMMA, value: ",", partial: false },
          { token: TokenType.STRING, value: "ba", partial: true },
          { token: TokenType.STRING, value: "baz", partial: true },
          { token: TokenType.STRING, value: "baz", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.LEFT_BRACKET, value: "[", partial: false },
          { token: TokenType.LEFT_BRACE, value: "{", partial: false },
          { token: TokenType.STRING, value: "foo", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.STRING, value: "bar", partial: false },
          { token: TokenType.COMMA, value: ",", partial: false },
          { token: TokenType.STRING, value: "baz", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.LEFT_BRACKET, value: "[", partial: false },
          { token: TokenType.LEFT_BRACE, value: "{", partial: false },
          { token: TokenType.STRING, value: "foo", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.STRING, value: "bar", partial: false },
          { token: TokenType.COMMA, value: ",", partial: false },
          { token: TokenType.STRING, value: "baz", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.LEFT_BRACKET, value: "[", partial: false },
          { token: TokenType.NUMBER, value: 1, partial: true },
          { token: TokenType.NUMBER, value: 12, partial: true },
          { token: TokenType.NUMBER, value: 123, partial: false },
          { token: TokenType.COMMA, value: ",", partial: false },
        ],
      },
      {
        value: '{ "foo": "bar", "baz": [1]',
        expected: [
          { token: TokenType.LEFT_BRACE, value: "{", partial: false },
          { token: TokenType.STRING, value: "foo", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.STRING, value: "bar", partial: false },
          { token: TokenType.COMMA, value: ",", partial: false },
          { token: TokenType.STRING, value: "baz", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.LEFT_BRACKET, value: "[", partial: false },
          { token: TokenType.NUMBER, value: 1, partial: false },
          { token: TokenType.RIGHT_BRACKET, value: "]", partial: false },
        ],
      },
      {
        value: ['{ "foo": "bar", ', ' "baz": [1,'],
        expected: [
          { token: TokenType.LEFT_BRACE, value: "{", partial: false },
          { token: TokenType.STRING, value: "foo", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.STRING, value: "bar", partial: false },
          { token: TokenType.COMMA, value: ",", partial: false },
          { token: TokenType.STRING, value: "baz", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.LEFT_BRACKET, value: "[", partial: false },
          { token: TokenType.NUMBER, value: 1, partial: false },
          { token: TokenType.COMMA, value: ",", partial: false },
        ],
      },
      {
        value: ['{ "foo": "bar", "baz": [1,2', "3, 4", "5", "6] }"],
        expected: [
          {
            type: "complete",
            token: TokenType.LEFT_BRACE,
            value: "{",
            partial: false,
          },
          {
            type: "complete",
            token: TokenType.STRING,
            value: "foo",
            partial: false,
          },
          {
            type: "complete",
            token: TokenType.COLON,
            value: ":",
            partial: false,
          },
          {
            type: "complete",
            token: TokenType.STRING,
            value: "bar",
            partial: false,
          },
          {
            type: "complete",
            token: TokenType.COMMA,
            value: ",",
            partial: false,
          },
          {
            type: "complete",
            token: TokenType.STRING,
            value: "baz",
            partial: false,
          },
          {
            type: "complete",
            token: TokenType.COLON,
            value: ":",
            partial: false,
          },
          {
            type: "complete",
            token: TokenType.LEFT_BRACKET,
            value: "[",
            partial: false,
          },
          {
            type: "complete",
            token: TokenType.NUMBER,
            value: 1,
            partial: false,
          },
          {
            type: "complete",
            token: TokenType.COMMA,
            value: ",",
            partial: false,
          },
          { token: TokenType.NUMBER, value: 2, partial: true },
          {
            type: "complete",
            token: TokenType.NUMBER,
            value: 23,
            partial: false,
          },
          {
            type: "complete",
            token: TokenType.COMMA,
            value: ",",
            partial: false,
          },
          { token: TokenType.NUMBER, value: 4, partial: true },
          { token: TokenType.NUMBER, value: 45, partial: true },
          {
            type: "complete",
            token: TokenType.NUMBER,
            value: 456,
            partial: false,
          },
          {
            type: "complete",
            token: TokenType.RIGHT_BRACKET,
            value: "]",
            partial: false,
          },
          {
            type: "complete",
            token: TokenType.RIGHT_BRACE,
            value: "}",
            partial: false,
          },
        ],
      },
      {
        value: ['{ "foo": "bar", "baz"', ": [{"],
        expected: [
          { token: TokenType.LEFT_BRACE, value: "{", partial: false },
          { token: TokenType.STRING, value: "foo", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.STRING, value: "bar", partial: false },
          { token: TokenType.COMMA, value: ",", partial: false },
          { token: TokenType.STRING, value: "baz", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.LEFT_BRACKET, value: "[", partial: false },
          { token: TokenType.LEFT_BRACE, value: "{", partial: false },
        ],
      },
      {
        value: ['{ "foo": "bar", "baz": [{ "a', '"'],
        expected: [
          { token: TokenType.LEFT_BRACE, value: "{", partial: false },
          { token: TokenType.STRING, value: "foo", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.STRING, value: "bar", partial: false },
          { token: TokenType.COMMA, value: ",", partial: false },
          { token: TokenType.STRING, value: "baz", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.LEFT_BRACKET, value: "[", partial: false },
          { token: TokenType.LEFT_BRACE, value: "{", partial: false },
          { token: TokenType.STRING, value: "a", partial: true },
          { token: TokenType.STRING, value: "a", partial: false },
        ],
      },
      {
        value: ['{ "foo": "bar", "baz": [{ "a": "b', '"'],
        expected: [
          { token: TokenType.LEFT_BRACE, value: "{", partial: false },
          { token: TokenType.STRING, value: "foo", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.STRING, value: "bar", partial: false },
          { token: TokenType.COMMA, value: ",", partial: false },
          { token: TokenType.STRING, value: "baz", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.LEFT_BRACKET, value: "[", partial: false },
          { token: TokenType.LEFT_BRACE, value: "{", partial: false },
          { token: TokenType.STRING, value: "a", partial: false },
          { token: TokenType.COLON, value: ":", partial: false },
          { token: TokenType.STRING, value: "b", partial: true },
          { token: TokenType.STRING, value: "b", partial: false },
        ],
      },
    ];

    emitPartialTokenTestData.forEach(({ value, expected }) => {
      test(`Tokenizer emit partial tokens: ${value}`, async () => {
        let i = 0;
        await runTokenizerTest(
          new Tokenizer({ emitPartialTokens: true }),
          value,
          ({ token, value, partial }) => {
            const expectedData = expected[i];
            expect(token).toEqual(expectedData.token);
            expect(value).toEqual(expectedData.value);
            expect(partial ?? false).toEqual(expectedData.partial);
            i += 1;
          },
        );
        expect(i).toEqual(expected.length);
      });
    });
  });

  describe("TokenParser emit partial values", () => {
    const emitPartialValuesTestData: TestData[] = [
      {
        value: ['"a', "bc", '"'],
        expected: [
          { value: "a", key: undefined, parent: undefined, partial: true },
          { value: "abc", key: undefined, parent: undefined, partial: true },
          { value: "abc", key: undefined, parent: undefined, partial: false },
        ],
      },
      {
        value: ["12", ".34"],
        expected: [
          { value: 12, key: undefined, parent: undefined, partial: true },
          { value: 12.34, key: undefined, parent: undefined, partial: true },
          { value: 12.34, key: undefined, parent: undefined, partial: false },
        ],
      },
      {
        value: ["[", "]"],
        expected: [
          { value: undefined, key: 0, parent: [], partial: true },
          { value: [], key: undefined, parent: undefined, partial: false },
        ],
      },
      {
        value: ["[", '"a', "bc", '"', ",", '"def"', "]"],
        expected: [
          { value: undefined, key: 0, parent: [], partial: true },
          { value: "a", key: 0, parent: [], partial: true },
          { value: "abc", key: 0, parent: [], partial: true },
          { value: "abc", key: 0, parent: ["abc"], partial: false },
          { value: "def", key: 1, parent: ["abc", "def"], partial: false },
          {
            value: ["abc", "def"],
            key: undefined,
            parent: undefined,
            partial: false,
          },
        ],
      },
      {
        value: [
          "{",
          '"a',
          "bc",
          '"',
          ":",
          '"def"',
          ",",
          '"ghi":',
          '"jkl"',
          "}",
        ],
        expected: [
          { value: undefined, key: undefined, parent: {}, partial: true },
          { value: undefined, key: "a", parent: {}, partial: true },
          { value: undefined, key: "abc", parent: {}, partial: true },
          { value: undefined, key: "abc", parent: {}, partial: true },
          { value: "def", key: "abc", parent: { abc: "def" }, partial: false },
          {
            value: undefined,
            key: "ghi",
            parent: { abc: "def" },
            partial: true,
          },
          {
            value: "jkl",
            key: "ghi",
            parent: { abc: "def", ghi: "jkl" },
            partial: false,
          },
          {
            value: { abc: "def", ghi: "jkl" },
            key: undefined,
            parent: undefined,
            partial: false,
          },
        ],
      },
      {
        value: [
          '{ "foo"',
          ": ",
          '{ "foo1": "ba',
          "r",
          '" , "baz',
          '": [',
          '{ "foo2": "bar2", "baz2": [',
          '{ "foo3": "bar3", "baz3": [1',
          "2",
          "3, ",
          "3, 4",
          "5",
          "6] }",
          "] }] }}",
        ],
        expected: [
          { value: undefined, key: undefined, parent: {}, partial: true },
          { value: undefined, key: "foo", parent: {}, partial: true },
          { value: undefined, key: undefined, parent: {}, partial: true },
          { value: undefined, key: "foo1", parent: {}, partial: true },
          { value: "ba", key: "foo1", parent: {}, partial: true },
          { value: "bar", key: "foo1", parent: {}, partial: true },
          {
            value: "bar",
            key: "foo1",
            parent: { foo1: "bar" },
            partial: false,
          },
          {
            value: undefined,
            key: "baz",
            parent: { foo1: "bar" },
            partial: true,
          },
          {
            value: undefined,
            key: "baz",
            parent: { foo1: "bar" },
            partial: true,
          },
          { value: undefined, key: 0, parent: [], partial: true },
          { value: undefined, key: undefined, parent: {}, partial: true },
          { value: undefined, key: "foo2", parent: {}, partial: true },
          {
            value: "bar2",
            key: "foo2",
            parent: { foo2: "bar2" },
            partial: false,
          },
          {
            value: undefined,
            key: "baz2",
            parent: { foo2: "bar2" },
            partial: true,
          },
          { value: undefined, key: 0, parent: [], partial: true },
          { value: undefined, key: undefined, parent: {}, partial: true },
          { value: undefined, key: "foo3", parent: {}, partial: true },
          {
            value: "bar3",
            key: "foo3",
            parent: { foo3: "bar3" },
            partial: false,
          },
          {
            value: undefined,
            key: "baz3",
            parent: { foo3: "bar3" },
            partial: true,
          },
          { value: undefined, key: 0, parent: [], partial: true },
          { value: 1, key: 0, parent: [], partial: true },
          { value: 12, key: 0, parent: [], partial: true },
          { value: 123, key: 0, parent: [123], partial: false },
          { value: 3, key: 1, parent: [123, 3], partial: false },
          { value: 4, key: 2, parent: [123, 3], partial: true },
          { value: 45, key: 2, parent: [123, 3], partial: true },
          { value: 456, key: 2, parent: [123, 3, 456], partial: false },
          {
            value: [123, 3, 456],
            key: "baz3",
            parent: { foo3: "bar3", baz3: [123, 3, 456] },
            partial: false,
          },
          {
            value: { foo3: "bar3", baz3: [123, 3, 456] },
            key: 0,
            parent: [{ foo3: "bar3", baz3: [123, 3, 456] }],
            partial: false,
          },
          {
            value: [{ foo3: "bar3", baz3: [123, 3, 456] }],
            key: "baz2",
            parent: {
              foo2: "bar2",
              baz2: [{ foo3: "bar3", baz3: [123, 3, 456] }],
            },
            partial: false,
          },
          {
            value: {
              foo2: "bar2",
              baz2: [{ foo3: "bar3", baz3: [123, 3, 456] }],
            },
            key: 0,
            parent: [
              { foo2: "bar2", baz2: [{ foo3: "bar3", baz3: [123, 3, 456] }] },
            ],
            partial: false,
          },
          {
            value: [
              { foo2: "bar2", baz2: [{ foo3: "bar3", baz3: [123, 3, 456] }] },
            ],
            key: "baz",
            parent: {
              foo1: "bar",
              baz: [
                { foo2: "bar2", baz2: [{ foo3: "bar3", baz3: [123, 3, 456] }] },
              ],
            },
            partial: false,
          },
          {
            value: {
              foo1: "bar",
              baz: [
                { foo2: "bar2", baz2: [{ foo3: "bar3", baz3: [123, 3, 456] }] },
              ],
            },
            key: "foo",
            parent: {
              foo: {
                foo1: "bar",
                baz: [
                  {
                    foo2: "bar2",
                    baz2: [{ foo3: "bar3", baz3: [123, 3, 456] }],
                  },
                ],
              },
            },
            partial: false,
          },
          {
            value: {
              foo: {
                foo1: "bar",
                baz: [
                  {
                    foo2: "bar2",
                    baz2: [{ foo3: "bar3", baz3: [123, 3, 456] }],
                  },
                ],
              },
            },
            key: undefined,
            parent: undefined,
            partial: false,
          },
        ],
      },
    ];

    emitPartialValuesTestData.forEach(({ value, expected }) => {
      test(`TokenParser emit partial values: ${value}`, async () => {
        let i = 0;
        await runJSONParserTest(
          new JSONParser({ emitPartialTokens: true, emitPartialValues: true }),
          value,
          ({ value, key, parent, partial }) => {
            const expectedData = expected[i];
            expect(value).toEqual(expectedData.value);
            expect(key).toEqual(expectedData.key);
            expect(parent).toEqual(expectedData.parent);
            expect(partial ?? false).toEqual(expectedData.partial);
            i += 1;
          },
        );
        expect(i).toEqual(expected.length);
      });
    });
  });

  test("TokenParser emit partial values only if matching paths when paths is present", async () => {
|
||||
const value = ['{ "a"', ": 1,", '"b":', '{ "c":', "1 } }"];
|
||||
const expected = [
|
||||
{ value: undefined, key: "c", parent: {}, partial: true },
|
||||
{ value: 1, key: "c", parent: { c: 1 }, partial: false },
|
||||
];
|
||||
let i = 0;
|
||||
await runJSONParserTest(
|
||||
new JSONParser({
|
||||
paths: ["$.b.c"],
|
||||
emitPartialTokens: true,
|
||||
emitPartialValues: true,
|
||||
}),
|
||||
value,
|
||||
({ value, key, parent, partial }) => {
|
||||
const expectedData = expected[i];
|
||||
expect(value).toEqual(expectedData.value);
|
||||
expect(key).toEqual(expectedData.key);
|
||||
expect(parent).toEqual(expectedData.parent);
|
||||
expect(partial ?? false).toEqual(expectedData.partial);
|
||||
i += 1;
|
||||
},
|
||||
);
|
||||
expect(i).toEqual(expected.length);
|
||||
});
|
||||
});
|
||||
85
dev/env/node_modules/@streamparser/json-node/test/end.ts
generated
vendored
Executable file
@@ -0,0 +1,85 @@
import { runJSONParserTest } from "./utils/testRunner.js";
import JSONParser from "../src/jsonparser.js";

describe("end", () => {
  test("should fail if writing after ending", async () => {
    const p = new JSONParser({ separator: "" });

    try {
      await runJSONParserTest(p, ['"test"', '"test"']);
      fail("Expected to fail!");
    } catch {
      // Expected error
    }
  });

  // const autoEndValues = ["2 2", "2.33456{}", "{}{}{}"];

  // autoEndValues.forEach((value) => {
  //   test(`should auto-end after emiting one object: ${value}`, async () => {
  //     const p = new JSONParser();

  //     try {
  //       await runJSONParserTest(p, [value]);
  //       fail(`Expected to fail on value "${value}"`);
  //     } catch (e) {
  //       // Expected error
  //     }
  //   });
  // });

  // const numberValues = [
  //   "0",
  //   "2",
  //   "2.33456",
  //   "2.33456e+1",
  //   "-2",
  //   "-2.33456",
  //   "-2.33456e+1",
  // ];

  // numberValues.forEach((numberValue) => {
  //   test(`should emit numbers if ending on a valid number: ${numberValue}`, async () => {
  //     const p = new JSONParser({ separator: "" });

  //     await runJSONParserTest(p, [numberValue], ({ value }) =>
  //       expect(value).toEqual(JSON.parse(numberValue))
  //     );
  //   });
  // });

  // const endingFailingValues = [
  //   "2.",
  //   "2.33456e",
  //   "2.33456e+",
  //   '"asdfasd',
  //   "tru",
  //   '"fa',
  //   '"nul',
  //   "{",
  //   "[",
  //   '{ "a":',
  //   '{ "a": { "b": 1, ',
  //   '{ "a": { "b": 1, "c": 2, "d": 3, "e": 4 }',
  // ];

  // endingFailingValues.forEach((value) => {
  //   test(`should fail if ending in the middle of parsing: ${value}`, async () => {
  //     const p = new JSONParser();

  //     try {
  //       await runJSONParserTest(p, [value]);
  //       fail(`Expected to fail on value "${value}"`);
  //     } catch (e) {
  //       // Expected error
  //     }
  //   });
  // });

  // test("should not fail if ending waiting for a separator", async () => {
  //   const separator = "\n";
  //   const p = new JSONParser({ separator });

  //   await runJSONParserTest(p, ["1", separator, "2"]);
  // });
});
32
dev/env/node_modules/@streamparser/json-node/test/inputs.ts
generated
vendored
Executable file
@@ -0,0 +1,32 @@
import { runJSONParserTest, type TestData } from "./utils/testRunner.js";
import JSONParser from "../src/jsonparser.js";
import { charset } from "@streamparser/json/utils/utf-8.js";

const quote = String.fromCharCode(charset.QUOTATION_MARK);

describe("inputs", () => {
  const testData: TestData[] = [
    {
      value: "test",
      expected: ["test"],
    },
    {
      value: new Uint8Array([116, 101, 115, 116]),
      expected: ["test"],
    },
    {
      value: Buffer.from([116, 101, 115, 116]),
      expected: ["test"],
    },
  ];

  testData.forEach(({ value, expected: [expected] }) => {
    test(`write accept ${value}`, async () => {
      await runJSONParserTest(
        new JSONParser(),
        [quote, value, quote],
        ({ value }) => expect(value).toEqual(expected),
      );
    });
  });
});
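The inputs tests above feed the parser the same document as a string, a `Uint8Array`, and a `Buffer`. A minimal sketch of how such heterogeneous chunks can be normalized to bytes before tokenizing (an illustration only, not the library's internals; `toUint8Array` and `concatChunks` are hypothetical helper names):

```typescript
// Buffer is a Uint8Array subclass, so two cases cover all three input types.
type Chunk = string | Uint8Array;

// Normalize one chunk: UTF-8 encode strings, pass byte arrays through.
function toUint8Array(chunk: Chunk): Uint8Array {
  if (typeof chunk === "string") {
    return new TextEncoder().encode(chunk);
  }
  return chunk;
}

// Concatenate mixed chunks into a single byte buffer.
function concatChunks(chunks: Chunk[]): Uint8Array {
  const parts = chunks.map(toUint8Array);
  const total = parts.reduce((n, p) => n + p.length, 0);
  const out = new Uint8Array(total);
  let offset = 0;
  for (const p of parts) {
    out.set(p, offset);
    offset += p.length;
  }
  return out;
}
```

This mirrors why the tests can wrap any of the three chunk types in `quote` characters and expect identical results.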
41
dev/env/node_modules/@streamparser/json-node/test/keepStack.ts
generated
vendored
Executable file
@@ -0,0 +1,41 @@
import JSONParser from "../src/jsonparser.js";
import { runJSONParserTest } from "./utils/testRunner.js";

describe("keepStack", () => {
  const testData = [
    {
      value: '{ "a": { "b": 1, "c": 2, "d": 3, "e": 4 } }',
      paths: ["$"],
    },
    {
      value: '{ "a": { "b": 1, "c": 2, "d": 3, "e": 4 } }',
      paths: ["$.a.*"],
    },
    {
      value: '{ "a": { "b": 1, "c": 2, "d": 3, "e": 4 } }',
      paths: ["$.a.e"],
    },
    {
      value: '{ "a": { "b": [1,2,3,4,5,6] } }',
      paths: ["$.a.b.*"],
      expected: 6,
    },
    {
      value: '[{ "a": 1 }, { "a": 2 }, { "a": 3 }]',
      paths: ["$.*"],
    },
  ];

  testData.forEach(({ value, paths }) => {
    test(`should keep parent empty if keepStack === false (${value} - ${paths})`, async () => {
      await runJSONParserTest(
        new JSONParser({ paths, keepStack: false }),
        [value],
        ({ parent }) => {
          if (parent === undefined) return;
          expect(Object.keys(parent).length).toEqual(0);
        },
      );
    });
  });
});
69
dev/env/node_modules/@streamparser/json-node/test/offset.ts
generated
vendored
Executable file
@@ -0,0 +1,69 @@
import { runTokenizerTest } from "./utils/testRunner.js";
import { Tokenizer } from "../src/index.js";
import TokenType from "@streamparser/json/utils/types/tokenType.js";

const input1 = '{\n  "string": "value",\n  "number": 3,\n  "object"';
const input2 = ': {\n  "key": "vд"\n  },\n  "array": [\n  -1,\n  12\n  ]\n  ';
const input3 = '"null": null, "true": true, "false": false, "frac": 3.14,';
const input4 = '"escape": "\\"\\u00e1" }';

const offsets = [
  [0, TokenType.LEFT_BRACE],
  [4, TokenType.STRING],
  [12, TokenType.COLON],
  [14, TokenType.STRING],
  [21, TokenType.COMMA],
  [25, TokenType.STRING],
  [33, TokenType.COLON],
  [35, TokenType.NUMBER],
  [36, TokenType.COMMA],
  [40, TokenType.STRING],
  [48, TokenType.COLON],
  [50, TokenType.LEFT_BRACE],
  [54, TokenType.STRING],
  [59, TokenType.COLON],
  [61, TokenType.STRING],
  [69, TokenType.RIGHT_BRACE],
  [70, TokenType.COMMA],
  [74, TokenType.STRING],
  [81, TokenType.COLON],
  [83, TokenType.LEFT_BRACKET],
  [87, TokenType.NUMBER],
  [89, TokenType.COMMA],
  [93, TokenType.NUMBER],
  [98, TokenType.RIGHT_BRACKET],
  [102, TokenType.STRING],
  [108, TokenType.COLON],
  [110, TokenType.NULL],
  [114, TokenType.COMMA],
  [116, TokenType.STRING],
  [122, TokenType.COLON],
  [124, TokenType.TRUE],
  [128, TokenType.COMMA],
  [130, TokenType.STRING],
  [137, TokenType.COLON],
  [139, TokenType.FALSE],
  [144, TokenType.COMMA],
  [146, TokenType.STRING],
  [152, TokenType.COLON],
  [154, TokenType.NUMBER],
  [158, TokenType.COMMA],
  [159, TokenType.STRING],
  [167, TokenType.COLON],
  [169, TokenType.STRING],
  [180, TokenType.RIGHT_BRACE],
];

test("offset", async () => {
  let i = 0;

  await runTokenizerTest(
    new Tokenizer(),
    [input1, input2, input3, input4],
    ({ token, offset }) => {
      expect(offset).toEqual(offsets[i][0]);
      expect(token).toEqual(offsets[i][1]);
      i += 1;
    },
  );
});
123
dev/env/node_modules/@streamparser/json-node/test/performance.ts
generated
vendored
Executable file
@@ -0,0 +1,123 @@
// Commented out due to timing
test("", () => {
  /* Do nothing */
});
// import { runJSONParserTest } from "./utils/testRunner.js";
// import JSONParser from "../src/jsonparser.js";
// import { charset } from "@streamparser/json/utils/utf-8.js";

// const quote = String.fromCharCode(charset.QUOTATION_MARK);

// const oneKB = 1024;
// const oneMB = 1024 * oneKB;
// const twoHundredMB = 200 * oneMB;
// const kbsIn200MBs = twoHundredMB / oneKB;

// describe("performance", () => {
//   describe("buffered parsing", () => {
//     test("can handle large strings without running out of memory", async () => {
//       await runJSONParserTest(
//         new JSONParser({ stringBufferSize: 64 * 1024 }),
//         function* () {
//           const chunk = new Uint8Array(oneKB).fill(charset.LATIN_SMALL_LETTER_A);
//           yield quote;
//           for (let index = 0; index < kbsIn200MBs; index++) {
//             yield chunk;
//           }
//           yield quote;
//         },
//         ({ value }) => expect((value as string).length).toEqual(twoHundredMB)
//       );
//     });

//     test("can handle large numbers without running out of memory", async () => {
//       const jsonParser = new JSONParser({ numberBufferSize: 64 * 1024 });
//       await runJSONParserTest(
//         jsonParser,
//         function* () {
//           const chunk = new Uint8Array(oneKB).fill(charset.DIGIT_ONE);
//           yield "1.";
//           for (let index = 0; index < kbsIn200MBs; index++) {
//             yield chunk;
//           }
//         },
//         ({ value }) => expect(value).toEqual(1.1111111111111112)
//       );
//       jsonParser.end();
//     });
//   });

//   test(`should keep memory stable if keepStack === false on array`, async () => {
//     const chunk = new Uint8Array(oneKB).fill(charset.LATIN_SMALL_LETTER_A);
//     chunk[0] = charset.QUOTATION_MARK;
//     chunk[chunk.length - 1] = charset.QUOTATION_MARK;
//     const commaChunk = new Uint8Array([charset.COMMA]);

//     const intialMemoryUsage = process.memoryUsage().heapUsed;
//     const thirtyMBs = 20 * 1024 * 1024;
//     let valuesLeft = kbsIn200MBs;

//     await runJSONParserTest(
//       new JSONParser({
//         paths: ["$.*"],
//         keepStack: false,
//         stringBufferSize: oneKB,
//       }),
//       function* () {
//         yield new Uint8Array([charset.LEFT_SQUARE_BRACKET]);
//         // decreasing so the number doesn't need to be reallocated
//         for (let index = kbsIn200MBs; index > 0; index--) {
//           yield chunk;
//           yield commaChunk;
//         }
//         yield chunk;
//         yield new Uint8Array([charset.RIGHT_SQUARE_BRACKET]);
//       },
//       () => {
//         if (valuesLeft-- % oneKB !== 0) return;

//         const actualMemoryUsage = process.memoryUsage().heapUsed;
//         expect(actualMemoryUsage - intialMemoryUsage < thirtyMBs).toBeTruthy();
//       }
//     );
//   });

//   test(`should keep memory stable if keepStack === false on object`, async () => {
//     const chunk = new Uint8Array(oneKB).fill(charset.LATIN_SMALL_LETTER_A);
//     chunk[0] = charset.QUOTATION_MARK;
//     chunk[1] = charset.LATIN_SMALL_LETTER_A;
//     chunk[2] = charset.QUOTATION_MARK;
//     chunk[3] = charset.COLON;
//     chunk[4] = charset.QUOTATION_MARK;
//     chunk[chunk.length - 1] = charset.QUOTATION_MARK;
//     const commaChunk = new Uint8Array([charset.COMMA]);

//     const intialMemoryUsage = process.memoryUsage().heapUsed;
//     const thirtyMBs = 20 * 1024 * 1024;
//     let valuesLeft = kbsIn200MBs;

//     await runJSONParserTest(
//       new JSONParser({
//         paths: ["$.*"],
//         keepStack: false,
//         stringBufferSize: oneKB,
//       }),
//       function* () {
//         yield new Uint8Array([charset.LEFT_CURLY_BRACKET]);
//         // decreasing so the number doesn't need to be reallocated
//         for (let index = kbsIn200MBs; index > 0; index--) {
//           yield chunk;
//           yield commaChunk;
//         }
//         yield chunk;
//         yield new Uint8Array([charset.RIGHT_CURLY_BRACKET]);
//       },
//       () => {
//         if (valuesLeft-- % oneKB !== 0) return;

//         const actualMemoryUsage = process.memoryUsage().heapUsed;
//         expect(actualMemoryUsage - intialMemoryUsage < thirtyMBs).toBeTruthy();
//       }
//     );
//   });
// });
92
dev/env/node_modules/@streamparser/json-node/test/selectors.ts
generated
vendored
Executable file
@@ -0,0 +1,92 @@
import { runJSONParserTest, type TestData } from "./utils/testRunner.js";
import JSONParser from "../src/jsonparser.js";

describe("selectors", () => {
  const testData: TestData[] = [
    { value: "[0,1,-1]", paths: ["$"], expected: [[0, 1, -1]] },
    { value: "[0,1,-1]", paths: ["$.*"], expected: [0, 1, -1] },
    { value: "[0,1,-1]", expected: [0, 1, -1, [0, 1, -1]] },
    { value: "[0,1,-1]", paths: ["$*"], expected: [0, 1, -1, [0, 1, -1]] },
    {
      value: "[0,1,[-1, 2]]",
      paths: ["$", "$.*"],
      expected: [0, 1, [-1, 2], [0, 1, [-1, 2]]],
    },
    { value: "[0,1,-1]", paths: ["$.1"], expected: [1] },
    {
      value: '{ "a": { "b": 1, "c": 2 } }',
      paths: ["$.a.*"],
      expected: [1, 2],
    },
    { value: '{ "a": { "b": 1, "c": 2 } }', paths: ["$.a.c"], expected: [2] },
    {
      value: '{ "a": { "b": [1,2], "c": [3, 4] } }',
      paths: ["$.a.*.*"],
      expected: [1, 2, 3, 4],
    },
    {
      value: '{ "a": { "b": [1,2], "c": [3, 4] } }',
      paths: ["$.a.*.1"],
      expected: [2, 4],
    },
    {
      value: '{ "a": { "b": [1,2], "c": [3, 4] } }',
      paths: ["$.a.c.*"],
      expected: [3, 4],
    },
    {
      value: '{ "a": { "b": [1,2], "c": [3, 4] } }',
      paths: ["$.a.c.1"],
      expected: [4],
    },
    {
      value: '{ "a": [ {"b": 1}, {"c": 2} ] }',
      paths: ["$.a.0.b"],
      expected: [1],
    },
  ];

  testData.forEach(({ value, paths, expected }) => {
    test(`Using selector ${paths} should emit only selected values`, async () => {
      let i = 0;

      await runJSONParserTest(
        new JSONParser({ paths }),
        [value],
        ({ value }) => {
          expect(value).toEqual(expected[i]);
          i += 1;
        },
      );

      expect(i).toEqual(expected.length);
    });
  });

  const invalidTestData = [
    {
      paths: ["*"],
      expectedError: 'Invalid selector "*". Should start with "$".',
    },
    {
      paths: [".*"],
      expectedError: 'Invalid selector ".*". Should start with "$".',
    },
    {
      paths: ["$..*"],
      expectedError: 'Invalid selector "$..*". ".." syntax not supported.',
    },
  ];

  invalidTestData.forEach(({ paths, expectedError }) => {
    test(`fail on invalid selector ${paths}`, () => {
      try {
        new JSONParser({ paths });
        fail("Error expected on invalid selector");
      } catch (err: unknown) {
        expect(err).toBeInstanceOf(Error);
        expect((err as Error).message).toEqual(expectedError);
      }
    });
  });
});
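The selector tests above exercise paths such as `$.a.*` and `$.a.0.b`, where `$` anchors at the root and `*` matches any single key. A hedged sketch of those matching semantics as implied by the tests (`matchesPath` is an illustrative helper, not the library's implementation, and it ignores the `$*` recursive form):

```typescript
// Match a "$.a.*"-style selector against the stack of keys leading to a value.
// Each dot-separated segment must equal the key at that depth, or be "*".
function matchesPath(selector: string, keys: (string | number)[]): boolean {
  const segments = selector.split(".").slice(1); // drop the leading "$"
  if (segments.length !== keys.length) return false;
  return segments.every((seg, i) => seg === "*" || seg === String(keys[i]));
}
```

Under this model, `$.a.*` selects the direct children of `"a"` (matching the `expected: [1, 2]` case), while `$` alone matches only the root value.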
119
dev/env/node_modules/@streamparser/json-node/test/separator.ts
generated
vendored
Executable file
@@ -0,0 +1,119 @@
import {
  runJSONParserTest,
  runTokenParserTest,
  TestData,
} from "./utils/testRunner.js";
import JSONParser from "../src/jsonparser.js";
import TokenParser from "../src/tokenparser.js";
import TokenType from "@streamparser/json/utils/types/tokenType.js";

describe("separator", () => {
  const testData: TestData[] = [
    { value: "true", expected: [true] },
    { value: "false", expected: [false] },
    { value: "null", expected: [null] },
    { value: '"string"', expected: ["string"] },
    { value: "[1,2,3]", expected: [1, 2, 3, [1, 2, 3]] },
    {
      value: '{ "a": 0, "b": 1, "c": -1 }',
      expected: [0, 1, -1, { a: 0, b: 1, c: -1 }],
    },
  ];

  const expected = testData
    .map(({ expected }) => expected)
    .reduce((acc, val) => [...acc, ...val], []);

  const separators = ["", "\n", "\t\n", "abc", "SEPARATOR"];
  separators.forEach((separator) => {
    test(`separator: "${separator}"`, async () => {
      let i = 0;

      await runJSONParserTest(
        new JSONParser({ separator }),
        testData.flatMap(({ value }) => [value, separator]),
        ({ value }) => {
          expect(value).toEqual(expected[i]);
          i += 1;
        },
      );
    });
  });

  test("support multiple whitespace separators", async () => {
    let i = 0;
    const value = "1 2\t3\n4\n\r5 \n6\n\n7\n\r\n\r8 \t\n\n\r9";
    const expected = [1, 2, 3, 4, 5, 6, 7, 8, 9];
    const separator = "";

    await runJSONParserTest(
      new JSONParser({ separator }),
      value,
      ({ value }) => {
        expect(value).toEqual(expected[i]);
        i += 1;
      },
    );
  });

  test(`separator: fail on invalid value`, async () => {
    try {
      await runJSONParserTest(new JSONParser({ separator: "abc" }), ["abe"]);
    } catch (err: unknown) {
      expect(err).toBeInstanceOf(Error);
      expect((err as Error).message).toEqual(
        'Unexpected "e" at position "2" in state SEPARATOR',
      );
    }
  });

  test(`fail on invalid token type`, async () => {
    try {
      await runTokenParserTest(new TokenParser({ separator: "\n" }), [
        { token: TokenType.TRUE, value: true },
        { token: TokenType.TRUE, value: true },
      ]);
      fail("Error expected on invalid selector");
    } catch (err: unknown) {
      expect(err).toBeInstanceOf(Error);
      expect((err as Error).message).toEqual(
        "Unexpected TRUE (true) in state SEPARATOR",
      );
    }
  });

  test("fail on invalid value passed to TokenParser", async () => {
    try {
      await runTokenParserTest(new TokenParser({ separator: "\n" }), [
        { token: TokenType.TRUE, value: true },
        { token: TokenType.SEPARATOR, value: "\r\n" },
      ]);
      fail("Error expected on invalid selector");
    } catch (err: unknown) {
      expect(err).toBeInstanceOf(Error);
      expect((err as Error).message).toEqual(
        'Unexpected SEPARATOR ("\\r\\n") in state SEPARATOR',
      );
    }
  });

  test("not fail when whitespaces match separator", async () => {
    let i = 0;
    const value = `{
      "a": 0,
      "b": 1,
      "c": -1
    }`;
    const expected = [0, 1, -1, { a: 0, b: 1, c: -1 }];
    const separator = "\n";

    await runJSONParserTest(
      new JSONParser({ separator }),
      value,
      ({ value }) => {
        expect(value).toEqual(expected[i]);
        i += 1;
      },
    );
  });
});
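The separator tests above stream several JSON documents delimited by a configurable separator string. A simplified, buffering sketch of that delimiting behavior (an illustration only: it parses each complete document with `JSON.parse`, requires a non-empty separator, and does not cover the library's incremental parsing or its empty-separator "any whitespace" mode):

```typescript
// Split a stream of text chunks on a separator and parse each complete
// document. The tail after the last separator is parsed when input ends.
function* parseSeparated(
  chunks: Iterable<string>,
  separator: string,
): Generator<unknown> {
  if (separator.length === 0) {
    throw new Error("this sketch requires a non-empty separator");
  }
  let buffer = "";
  for (const chunk of chunks) {
    buffer += chunk;
    let idx: number;
    while ((idx = buffer.indexOf(separator)) !== -1) {
      const doc = buffer.slice(0, idx).trim();
      buffer = buffer.slice(idx + separator.length);
      if (doc.length > 0) yield JSON.parse(doc);
    }
  }
  const rest = buffer.trim();
  if (rest.length > 0) yield JSON.parse(rest);
}
```

Note how a document may span chunk boundaries, which is why the tests interleave values and separators as independent writes.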
109
dev/env/node_modules/@streamparser/json-node/test/types/arrays.ts
generated
vendored
Executable file
@@ -0,0 +1,109 @@
import { runJSONParserTest, type TestData } from "../utils/testRunner.js";
import JSONParser from "../../src/jsonparser.js";

describe("arrays", () => {
  const testData: TestData[] = [
    { value: "[]", expected: [[[], []]] },
    {
      value: "[0,1,-1]",
      expected: [
        [[0], 0],
        [[1], 1],
        [[2], -1],
        [[], [0, 1, -1]],
      ],
    },
    {
      value: "[1.0,1.1,-1.1,-1.0]",
      expected: [
        [[0], 1],
        [[1], 1.1],
        [[2], -1.1],
        [[3], -1],
        [[], [1, 1.1, -1.1, -1]],
      ],
    },
    {
      value: "[-1]",
      expected: [
        [[0], -1],
        [[], [-1]],
      ],
    },
    {
      value: "[-0.1]",
      expected: [
        [[0], -0.1],
        [[], [-0.1]],
      ],
    },
    {
      value: "[6.02e23, 6.02e+23, 6.02e-23, 0e23]",
      expected: [
        [[0], 6.02e23],
        [[1], 6.02e23],
        [[2], 6.02e-23],
        [[3], 0e23],
        [[], [6.02e23, 6.02e23, 6.02e-23, 0e23]],
      ],
    },
    {
      value: "[7161093205057351174]",
      expected: [
        [[0], 7161093205057352000],
        [[], [7161093205057352000]],
      ],
    },
  ];

  testData.forEach(({ value, expected }) => {
    test(value as string, async () => {
      let i = 0;

      await runJSONParserTest(
        new JSONParser(),
        [value],
        ({ value, key, stack }) => {
          const keys = stack
            .slice(1)
            .map((item) => item.key)
            .concat(key !== undefined ? key : []);

          expect([keys, value]).toEqual(expected[i]);
          i += 1;
        },
      );
    });

    test("chuncked", async () => {
      let i = 0;

      await runJSONParserTest(
        new JSONParser(),
        (value as string).split(""),
        ({ value, key, stack }) => {
          const keys = stack
            .slice(1)
            .map((item) => item.key)
            .concat(key !== undefined ? key : []);

          expect([keys, value]).toEqual(expected[i]);
          i += 1;
        },
      );
    });
  });

  const invalidValues = ["[,", "[1, eer]", "[1,]", "[1;", "[1}"];

  invalidValues.forEach((value) => {
    test(`fail on invalid values: ${value}`, async () => {
      try {
        await runJSONParserTest(new JSONParser(), [value]);
        fail(`Expected to fail on value "${value}"`);
      } catch {
        // Expected error
      }
    });
  });
});
45
dev/env/node_modules/@streamparser/json-node/test/types/booleans.ts
generated
vendored
Executable file
@@ -0,0 +1,45 @@
import { runJSONParserTest } from "../utils/testRunner.js";
import JSONParser from "../../src/jsonparser.js";

describe("boolean", () => {
  const values = ["true", "false"];

  values.forEach((stringValue) => {
    test(stringValue, async () => {
      await runJSONParserTest(new JSONParser(), [stringValue], ({ value }) => {
        expect(value).toEqual(JSON.parse(stringValue));
      });
    });

    test(`${stringValue} (chuncked)`, async () => {
      await runJSONParserTest(
        new JSONParser(),
        (stringValue as string).split(""),
        ({ value }) => {
          expect(value).toEqual(JSON.parse(stringValue));
        },
      );
    });
  });

  const invalidValues = [
    "tRue",
    "trUe",
    "truE",
    "fAlse",
    "faLse",
    "falSe",
    "falsE",
  ];

  invalidValues.forEach((value) => {
    test("fail on invalid values", async () => {
      try {
        await runJSONParserTest(new JSONParser(), [value]);
        fail(`Expected to fail on value "${value}"`);
      } catch {
        // Expected error
      }
    });
  });
});
37
dev/env/node_modules/@streamparser/json-node/test/types/null.ts
generated
vendored
Executable file
@@ -0,0 +1,37 @@
import { runJSONParserTest } from "../utils/testRunner.js";
import JSONParser from "../../src/jsonparser.js";

describe("null", () => {
  const values = ["null"];

  values.forEach((stringValue) => {
    test(stringValue, async () => {
      await runJSONParserTest(new JSONParser(), [stringValue], ({ value }) => {
        expect(value).toEqual(JSON.parse(stringValue));
      });
    });

    test(`${stringValue} (chuncked)`, async () => {
      await runJSONParserTest(
        new JSONParser(),
        (stringValue as string).split(""),
        ({ value }) => {
          expect(value).toEqual(JSON.parse(stringValue));
        },
      );
    });
  });

  const invalidValues = ["nUll", "nuLl", "nulL"];

  invalidValues.forEach((value) => {
    test("fail on invalid values", async () => {
      try {
        await runJSONParserTest(new JSONParser(), [value]);
        fail(`Expected to fail on value "${value}"`);
      } catch {
        // Expected error
      }
    });
  });
});
101
dev/env/node_modules/@streamparser/json-node/test/types/numbers.ts
generated
vendored
Executable file
@@ -0,0 +1,101 @@
import { runJSONParserTest } from "../utils/testRunner.js";
import JSONParser from "../../src/jsonparser.js";

describe("number", () => {
  const values = [
    "0",
    "0e1",
    "0e+1",
    "0e-1",
    "0.123",
    "0.123e00",
    "0.123e+1",
    "0.123e-1",
    "0.123E00",
    "0.123E+1",
    "0.123E-1",
    "-0",
    "-0e1",
    "-0e+1",
    "-0e-1",
    "-0.123",
    "-0.123e00",
    "-0.123e+1",
    "-0.123e-1",
    "-0.123E00",
    "-0.123E+1",
    "-0.123E-1",
    "-123",
    "-123e1",
    "-123e+1",
    "-123e-1",
    "-123.123",
    "-123.123e00",
    "-123.123e+1",
    "-123.123e-1",
    "-123.123E00",
    "-123.123E+1",
    "-123.123E-1",
    "123",
    "123e1",
    "123e+1",
    "123e-1",
    "123.123",
    "123.123e00",
    "123.123e+1",
    "123.123e-1",
    "123.123E00",
    "123.123E+1",
    "123.123E-1",
    "7161093205057351174",
    "21e999",
  ];

  const bufferSizes = [0, 1, 64 * 1024];

  bufferSizes.forEach((numberBufferSize) => {
    values.forEach((stringValue) => {
      test(`${stringValue} (bufferSize ${numberBufferSize})`, async () => {
        await runJSONParserTest(
          new JSONParser({ numberBufferSize }),
          [stringValue],
          ({ value }) => {
            expect(value).toEqual(JSON.parse(stringValue));
          },
        );
      });

      test(`${stringValue} (chunked, bufferSize ${numberBufferSize})`, async () => {
        await runJSONParserTest(
          new JSONParser({ numberBufferSize }),
          (stringValue as string).split(""),
          ({ value }) => {
            expect(value).toEqual(JSON.parse(stringValue));
          },
        );
      });
    });
  });

  const invalidValues = [
    "-a",
    "-e",
    "1a",
    "1.a",
    "1.e",
    "1.-",
    "1.0ea",
    "1.0e1.2",
  ];

  invalidValues.forEach((value) => {
    test("fail on invalid values", async () => {
      try {
        await runJSONParserTest(new JSONParser(), [value]);
        fail(`Expected to fail on value "${value}"`);
      } catch {
        // Expected error
      }
    });
  });
});
134
dev/env/node_modules/@streamparser/json-node/test/types/objects.ts
generated
vendored
Executable file
@@ -0,0 +1,134 @@
import { runJSONParserTest, type TestData } from "../utils/testRunner.js";
import { readFileSync } from "fs";
import JSONParser from "../../src/jsonparser.js";

describe("objects", () => {
  const testData: TestData[] = [
    { value: "{}", expected: [[[], {}]] },
    {
      value: '{ "a": 0, "b": 1, "c": -1 }',
      expected: [
        [["a"], 0],
        [["b"], 1],
        [["c"], -1],
        [[], { a: 0, b: 1, c: -1 }],
      ],
    },
    {
      value: '{ "a": 1.0, "b": 1.1, "c": -1.1, "d": -1.0 }',
      expected: [
        [["a"], 1],
        [["b"], 1.1],
        [["c"], -1.1],
        [["d"], -1],
        [[], { a: 1, b: 1.1, c: -1.1, d: -1 }],
      ],
    },
    {
      value: '{ "e": -1 }',
      expected: [
        [["e"], -1],
        [[], { e: -1 }],
      ],
    },
    {
      value: '{ "f": -0.1 }',
      expected: [
        [["f"], -0.1],
        [[], { f: -0.1 }],
      ],
    },
    {
      value: '{ "a": 6.02e23, "b": 6.02e+23, "c": 6.02e-23, "d": 0e23 }',
      expected: [
        [["a"], 6.02e23],
        [["b"], 6.02e23],
        [["c"], 6.02e-23],
        [["d"], 0e23],
        [[], { a: 6.02e23, b: 6.02e23, c: 6.02e-23, d: 0e23 }],
      ],
    },
    {
      value: '{ "a": 7161093205057351174 }',
      expected: [
        [["a"], 7161093205057352000],
        [[], { a: 7161093205057352000 }],
      ],
    },
  ];

  testData.forEach(({ value, expected }) => {
    test(value as string, async () => {
      let i = 0;

      await runJSONParserTest(
        new JSONParser(),
        [value],
        ({ value, key, stack }) => {
          const keys = stack
            .slice(1)
            .map((item) => item.key)
            .concat(key !== undefined ? key : []);

          expect([keys, value]).toEqual(expected[i]);
          i += 1;
        },
      );
    });

    test("chuncked", async () => {
      let i = 0;

      await runJSONParserTest(
        new JSONParser(),
        (value as string).split(""),
        ({ value, key, stack }) => {
          const keys = stack
            .slice(1)
            .map((item) => item.key)
            .concat(key !== undefined ? key : []);

          expect([keys, value]).toEqual(expected[i]);
          i += 1;
        },
      );
    });
  });

  test("complex object", async () => {
    const stringifiedJson = readFileSync(
      `${__dirname}/../../../../samplejson/basic.json`,
    ).toString();

    await runJSONParserTest(
      new JSONParser(),
      [stringifiedJson],
      ({ value, stack }) => {
        if (stack.length === 0) {
          expect(value).toEqual(JSON.parse(stringifiedJson));
        }
      },
    );
  });

  const invalidValues = [
    "{,",
    '{"test": eer[ }',
    "{ test: 1 }",
    '{ "test": 1 ;',
    '{ "test": 1 ]',
    '{ "test": 1, }',
    '{ "test", }',
  ];

  invalidValues.forEach((value) => {
    test(`fail on invalid values: ${value}`, async () => {
      try {
        await runJSONParserTest(new JSONParser(), [value]);
        fail(`Expected to fail on value "${value}"`);
      } catch {
        // Expected error
      }
    });
  });
});
184
dev/env/node_modules/@streamparser/json-node/test/types/strings.ts
generated
vendored
Executable file
184
dev/env/node_modules/@streamparser/json-node/test/types/strings.ts
generated
vendored
Executable file
@@ -0,0 +1,184 @@
import { runJSONParserTest } from "../utils/testRunner.js";
import JSONParser from "../../src/jsonparser.js";
import { charset } from "@streamparser/json/utils/utf-8.js";

const quote = String.fromCharCode(charset.QUOTATION_MARK);

describe("string", () => {
  const values = [
    "Hello world!",
    '\\r\\n\\f\\t\\\\\\/\\"',
    "\\u039b\\u03ac\\u03bc\\u03b2\\u03b4\\u03b1",
    "☃",
    "├──",
    "snow: ☃!",
    "õ",
  ];

  const bufferSizes = [0, 1, 64 * 1024];

  bufferSizes.forEach((stringBufferSize) => {
    values.forEach((stringValue) => {
      test(`${stringValue} (bufferSize ${stringBufferSize})`, async () => {
        await runJSONParserTest(
          new JSONParser({ stringBufferSize }),
          [quote, stringValue, quote],
          ({ value }) => expect(value).toEqual(JSON.parse(`"${stringValue}"`)),
        );
      });

      test(`${stringValue} (chunked, bufferSize ${stringBufferSize})`, async () => {
        await runJSONParserTest(
          new JSONParser({ stringBufferSize }),
          [quote, ...(stringValue as string).split(""), quote],
          ({ value }) => expect(value).toEqual(JSON.parse(`"${stringValue}"`)),
        );
      });
    });

    describe("multibyte characters", () => {
      test("2 byte utf8 'De' character: д", async () => {
        await runJSONParserTest(
          new JSONParser({ stringBufferSize }),
          [quote, new Uint8Array([0xd0, 0xb4]), quote],
          ({ value }) => expect(value).toEqual("д"),
        );
      });

      test("3 byte utf8 'Han' character: 我", async () => {
        await runJSONParserTest(
          new JSONParser({ stringBufferSize }),
          [quote, new Uint8Array([0xe6, 0x88, 0x91]), quote],
          ({ value }) => expect(value).toEqual("我"),
        );
      });

      test("4 byte utf8 character (unicode scalar U+2070E): 𠜎", async () => {
        await runJSONParserTest(
          new JSONParser({ stringBufferSize }),
          [quote, new Uint8Array([0xf0, 0xa0, 0x9c, 0x8e]), quote],
          ({ value }) => expect(value).toEqual("𠜎"),
        );
      });

      describe("chunking", () => {
        test("2 byte utf8 'De' character chunked between 1st and 2nd byte: д", async () => {
          await runJSONParserTest(
            new JSONParser({ stringBufferSize }),
            [quote, new Uint8Array([0xd0]), new Uint8Array([0xb4]), quote],
            ({ value }) => expect(value).toEqual("д"),
          );
        });

        test("3 byte utf8 'Han' character chunked between 2nd and 3rd byte: 我", async () => {
          await runJSONParserTest(
            new JSONParser({ stringBufferSize }),
            [
              quote,
              new Uint8Array([0xe6, 0x88]),
              new Uint8Array([0x91]),
              quote,
            ],
            ({ value }) => expect(value).toEqual("我"),
          );
        });

        test("4 byte utf8 character (unicode scalar U+2070E) chunked between 2nd and 3rd byte: 𠜎", async () => {
          await runJSONParserTest(
            new JSONParser({ stringBufferSize }),
            [
              quote,
              new Uint8Array([0xf0, 0xa0]),
              new Uint8Array([0x9c, 0x8e]),
              quote,
            ],
            ({ value }) => expect(value).toEqual("𠜎"),
          );
        });

        test("1-4 byte utf8 character string chunked at random bytes: Aж文𠜱B", async () => {
          const eclectic_buffer = new Uint8Array([
            0x41, // A
            0xd0, 0xb6, // ж
            0xe6, 0x96, 0x87, // 文
            0xf0, 0xa0, 0x9c, 0xb1, // 𠜱
            0x42, // B
          ]);

          for (let i = 0; i < 11; i++) {
            const firstBuffer = eclectic_buffer.slice(0, i);
            const secondBuffer = eclectic_buffer.slice(i);
            await runJSONParserTest(
              new JSONParser({ stringBufferSize }),
              [quote, firstBuffer, secondBuffer, quote],
              ({ value }) => expect(value).toEqual("Aж文𠜱B"),
            );
          }
        });
      });

      describe("surrogate", () => {
        test("parse surrogate pair", async () => {
          await runJSONParserTest(
            new JSONParser({ stringBufferSize }),
            [quote, "\\uD83D\\uDE0B", quote],
            ({ value }) => expect(value).toEqual("😋"),
          );
        });

        test("surrogate pair (chunked)", async () => {
          await runJSONParserTest(
            new JSONParser({ stringBufferSize }),
            [quote, "\\uD83D", "\\uDE0B", quote],
            ({ value }) => expect(value).toEqual("😋"),
          );
        });

        test("not error on broken surrogate pair", async () => {
          await runJSONParserTest(
            new JSONParser({ stringBufferSize }),
            [quote, "\\uD83D\\uEFFF", quote],
            ({ value }) => expect(value).toEqual("�"),
          );
        });
      });
    });
  });

  test("should flush the buffer if there is not space for incoming data", async () => {
    await runJSONParserTest(
      new JSONParser({ stringBufferSize: 1 }),
      [quote, "aaaa", "𠜎", quote],
      ({ value }) => expect(value).toEqual("aaaa𠜎"),
    );
  });

  const invalidValues = [
    '"\n"',
    '"\\j"',
    '"\\ua"',
    '"\\u1*"',
    '"\\u12*"',
    "\\u123*",
    '"\0"',
    '"\\uG"',
    '"\\u000G"',
  ];

  invalidValues.forEach((value) => {
    test(`fail on invalid values ${value}`, async () => {
      try {
        await runJSONParserTest(new JSONParser(), value);
        fail(`Expected to fail on value "${value}"`);
      } catch {
        // Expected error
      }
    });
  });
});
11
dev/env/node_modules/@streamparser/json-node/test/utils/setup.ts
generated
vendored
Executable file
@@ -0,0 +1,11 @@
import {
  TransformStream as NodeTransformStream,
  ReadableStream as NodeReadableStream,
} from "node:stream/web";

if (!global.TransformStream) {
  // @ts-expect-error Overriding TransformStream for Node 16
  global.TransformStream = NodeTransformStream;
  // @ts-expect-error Overriding ReadableStream for Node 16
  global.ReadableStream = NodeReadableStream;
}
90
dev/env/node_modules/@streamparser/json-node/test/utils/testRunner.ts
generated
vendored
Executable file
@@ -0,0 +1,90 @@
import { Readable } from "stream";
import JSONParser from "../../src/jsonparser.js";
import Tokenizer from "../../src/tokenizer.js";
import TokenParser from "../../src/tokenparser.js";
import type { ParsedTokenInfo } from "@streamparser/json/utils/types/parsedTokenInfo.js";
import type { ParsedElementInfo } from "@streamparser/json/utils/types/parsedElementInfo.js";

export type TestData = {
  value: string | string[] | Iterable<number>;
  paths?: string[];
  expected: unknown[];
};

type ParseableData = string | Iterable<number>;
type InputData<T> = T | T[] | (() => Generator<T>);

function iterableData<T>(data: InputData<T>): Iterable<T> {
  if (typeof data === "function") return (data as () => Generator<T>)();
  if (Array.isArray(data)) return data;
  return [data];
}

export async function runJSONParserTest(
  jsonparser: JSONParser,
  data: InputData<ParseableData>,
  onValue: (parsedElementInfo: ParsedElementInfo) => void = () => {
    /* Do nothing */
  },
) {
  return new Promise((resolve, reject) => {
    const input = new Readable();
    input._read = () => {
      /* Do nothing */
    };
    jsonparser.on("data", onValue);
    jsonparser.on("error", reject);
    jsonparser.on("end", resolve);
    input.pipe(jsonparser);
    for (const value of iterableData(data)) {
      input.push(value);
    }
    input.push(null);
  });
}

export async function runTokenizerTest(
  tokenizer: Tokenizer,
  data: InputData<ParseableData>,
  onToken: (parsedElementInfo: ParsedTokenInfo) => void = () => {
    /* Do nothing */
  },
) {
  return new Promise((resolve, reject) => {
    const input = new Readable({ objectMode: true });
    input._read = () => {
      /* Do nothing */
    };
    tokenizer.on("data", onToken);
    tokenizer.on("error", reject);
    tokenizer.on("end", resolve);
    input.pipe(tokenizer);
    for (const value of iterableData(data)) {
      input.push(value);
    }
    input.push(null);
  });
}

export async function runTokenParserTest(
  tokenParser: TokenParser,
  data: InputData<Omit<ParsedTokenInfo, "offset">>,
  onValue: (parsedElementInfo: ParsedElementInfo) => void = () => {
    /* Do nothing */
  },
) {
  return new Promise((resolve, reject) => {
    const input = new Readable({ objectMode: true });
    input._read = () => {
      /* Do nothing */
    };
    tokenParser.on("data", onValue);
    tokenParser.on("error", reject);
    tokenParser.on("end", resolve);
    input.pipe(tokenParser);
    for (const value of iterableData(data)) {
      input.push(value);
    }
    input.push(null);
  });
}
6
dev/env/node_modules/@streamparser/json-node/tsconfig.json
generated
vendored
Executable file
@@ -0,0 +1,6 @@
{
  "extends": "../../tsconfig.json",
  "include": [
    "src/**/*.ts"
  ],
}