refactor: move brother_node development artifact to dev/test-nodes subdirectory
Development Artifact Cleanup:

✅ BROTHER_NODE REORGANIZATION: Moved development test node to appropriate location
- dev/test-nodes/brother_node/: moved from the root directory for better organization
- Contains development configuration, test logs, and test chain data
- No impact on production systems; purely a development/testing artifact

✅ DEVELOPMENT ARTIFACTS IDENTIFIED:
- Chain ID: aitbc-brother-chain (test/development chain)
- Ports: 8010 (P2P) and 8011 (RPC), different from production
- Environment: .env file with test configuration
- Logs: rpc.log and node.log from a development testing session (March 15, 2026)

✅ ROOT DIRECTORY CLEANUP: Removed development clutter from the production directory
- brother_node/ moved to dev/test-nodes/brother_node/
- The root directory now contains only production-ready components
- Development artifacts are properly organized in the dev/ subdirectory

DIRECTORY STRUCTURE IMPROVEMENT:
📁 dev/test-nodes/: development and testing node configurations
🏗️ Root directory: clean production structure with only essential components
🧪 Development isolation: test environments separated from production

BENEFITS:
✅ Clean production directory: no development artifacts in root
✅ Better organization: development nodes grouped in the dev/ subdirectory
✅ Clear separation: production and development environments clearly distinguished
✅ Maintainability: easier to identify and manage development components

RESULT: Successfully moved the brother_node development artifact to the dev/test-nodes/ subdirectory, cleaning up the root directory while preserving the development testing environment for future use.
377
dev/env/node_modules/@streamparser/json/README.md
generated
vendored
Executable file
@@ -0,0 +1,377 @@
# @streamparser/json

[![npm version][npm-version-badge]][npm-badge-url]
[![npm monthly downloads][npm-downloads-badge]][npm-badge-url]
[![Build Status][build-status-badge]][build-status-url]
[![Coverage Status][coverage-status-badge]][coverage-status-url]

Fast dependency-free library to parse a JSON stream using utf-8 encoding in Node.js, Deno or any modern browser. Fully compliant with the JSON spec and `JSON.parse(...)`.

*tldr;*

```javascript
import { JSONParser } from '@streamparser/json';

const parser = new JSONParser();
parser.onValue = ({ value }) => { /* process data */ };

// Or passing the stream in several chunks
try {
  parser.write('{ "test": ["a"] }');
  // onValue will be called 3 times:
  // "a"
  // ["a"]
  // { test: ["a"] }
} catch (err) {
  console.log(err); // handle errors
}
```

## @streamparser/json ecosystem

There are multiple flavours of @streamparser:

* The **[@streamparser/json](https://www.npmjs.com/package/@streamparser/json)** package allows you to parse any JSON string or stream using pure JavaScript.
* The **[@streamparser/json-whatwg](https://www.npmjs.com/package/@streamparser/json-whatwg)** package wraps `@streamparser/json` in a WHATWG TransformStream.
* The **[@streamparser/json-node](https://www.npmjs.com/package/@streamparser/json-node)** package wraps `@streamparser/json` in a Node.js Transform stream.

## Dependencies / Polyfilling

@streamparser/json requires a few ES6 classes:

* [Uint8Array](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array)
* [TextEncoder](https://developer.mozilla.org/en-US/docs/Web/API/TextEncoder)
* [TextDecoder](https://developer.mozilla.org/en-US/docs/Web/API/TextDecoder)

If you are targeting browsers or systems in which these might be missing, you need to polyfill them.

## Components

### Tokenizer

A JSON-compliant tokenizer that parses a utf-8 stream into JSON tokens.

```javascript
import { Tokenizer } from '@streamparser/json';

const tokenizer = new Tokenizer(opts);
```

The available options are:

```javascript
{
  stringBufferSize: <number>, // set to 0 to disable buffering. The minimum valid value is 4.
  numberBufferSize: <number>, // set to 0 to disable buffering.
  separator: <string>, // separator between objects. For example, `\n` for NDJSON.
  emitPartialTokens: <boolean> // whether to emit tokens mid-parsing.
}
```

If buffer sizes are set to anything other than zero, instead of using a string to append the data as it comes in, the data is buffered using a TypedArray. A reasonable size could be `64 * 1024` (64 KB).

#### Buffering

When parsing strings or numbers, the parser needs to gather the data in memory until the whole value is ready.

Strings are immutable in JavaScript, so every string operation creates a new string. The V8 engine, behind Node, Deno and most modern browsers, performs many different types of optimizations. One of these optimizations is to over-allocate memory when it detects many string concatenations. This significantly increases memory consumption and can easily exhaust your memory when parsing JSON containing very large strings or numbers. For those cases, the parser can buffer the characters using a TypedArray. This requires encoding/decoding from/to the buffer into an actual string once the value is ready. This is done using the `TextEncoder` and `TextDecoder` APIs. Unfortunately, these APIs create significant overhead when the strings are small, so they should be used only when strictly necessary.

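The buffering trade-off described above can be illustrated with the platform `TextEncoder`/`TextDecoder` APIs directly. The following is a hypothetical sketch (not the library's internal code) of gathering utf-8 bytes in a pre-allocated `Uint8Array` and decoding to a string only once the whole value is ready:

```javascript
// Hypothetical sketch of TypedArray-based string buffering, in the spirit of
// what the Tokenizer does when stringBufferSize > 0. Not part of the library.
const encoder = new TextEncoder();
const decoder = new TextDecoder('utf-8');

function makeStringBuffer(size) {
  const buffer = new Uint8Array(size);
  let length = 0;
  return {
    append(chunk) {
      // encodeInto writes utf-8 bytes directly into the buffer,
      // avoiding intermediate string concatenations.
      const { written } = encoder.encodeInto(chunk, buffer.subarray(length));
      length += written;
    },
    finish() {
      // Decode once, only when the whole value has been gathered.
      return decoder.decode(buffer.subarray(0, length));
    },
  };
}

const buf = makeStringBuffer(64 * 1024);
buf.append('Hello ');
buf.append('world!');
console.log(buf.finish()); // "Hello world!"
```

The single decode at the end is what makes the TypedArray approach pay off for very large values, while the per-call encode overhead is what makes it a poor fit for small ones.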
#### Properties & Methods

* **write(data: string|typedArray|buffer)** pushes data into the tokenizer.
* **end()** closes the tokenizer so it cannot be used anymore. Throws an error if the tokenizer was in the middle of parsing.
* **isEnded** read-only boolean property indicating whether the tokenizer has ended or is still accepting data.
* **parseNumber(numberStr)** method used internally to parse numbers. By default, it is equivalent to `Number(numberStr)`, but users can override it to get some other behaviour.
* **onToken({ token: TokenType, value: any, offset: number })** no-op method that the user should override to follow the tokenization process.
* **onError(err: Error)** no-op method that the user can override to act on errors. If not set, the write method simply throws synchronously.
* **onEnd()** no-op method that the user can override to act when the tokenizer has ended.

```javascript
// You can override the overridable methods by creating your own class extending Tokenizer
class MyTokenizer extends Tokenizer {
  parseNumber(numberStr) {
    const number = super.parseNumber(numberStr);
    // If the number is too large to represent, just keep the string.
    return Number.isFinite(number) ? number : numberStr;
  }
  onToken({ token, value, offset }) {
    if (token === TokenType.NUMBER && typeof value === 'string') {
      super.onToken({ token: TokenType.STRING, value, offset });
    } else {
      super.onToken({ token, value, offset });
    }
  }
}

const myTokenizer = new MyTokenizer();

// or just overriding it
const tokenizer = new Tokenizer();
tokenizer.parseNumber = (numberStr) => { ... };
tokenizer.onToken = ({ token, value, offset }) => { ... };
```

### TokenParser

A token parser that processes JSON tokens as emitted by the `Tokenizer` and emits JSON values/objects.

```javascript
import { TokenParser } from '@streamparser/json';

const tokenParser = new TokenParser(opts);
```

The available options are:

```javascript
{
  paths: <string[]>,
  keepStack: <boolean>, // whether to keep all the properties in the stack
  separator: <string>, // separator between objects. For example, `\n` for NDJSON. If left empty or set to undefined, the token parser will end after parsing the first object. To parse multiple objects without any delimiter, just set it to the empty string `''`.
  emitPartialValues: <boolean>, // whether to emit values mid-parsing.
}
```

* paths: Array of paths to emit. Defaults to `undefined`, which emits everything. The paths are intended to support JSONPath, although at the time being it only supports the root object selector (`$`) and subproperty selectors including wildcards (`$.a`, `$.*`, `$.a.b`, `$.*.b`, etc).
* keepStack: Whether to keep full objects on the stack even if they won't be emitted. Defaults to `true`. When set to `false`, the parser does not preserve properties in the parent object when some descendant will be emitted. This means that the parent object passed to the `onValue` function will be empty, which doesn't reflect the actual data, but it's more memory-efficient.

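The supported selector subset can be thought of as a match between a path pattern and the key stack of the current value. The following is a hypothetical sketch of that matching rule, written for illustration only; it is not the library's implementation:

```javascript
// Hypothetical matcher for the supported JSONPath subset:
// '$', '$.a', '$.*', '$.a.b', '$.*.b', etc.
// Not the library's internal code; shown only to illustrate the semantics.
function pathMatches(pattern, keyStack) {
  const segments = pattern.split('.'); // '$.a.*' -> ['$', 'a', '*']
  if (segments[0] !== '$') throw new Error('paths must start with $');
  const selectors = segments.slice(1);
  // Depth must match exactly; '*' matches any single key.
  if (selectors.length !== keyStack.length) return false;
  return selectors.every((sel, i) => sel === '*' || sel === String(keyStack[i]));
}

console.log(pathMatches('$', []));        // true: the root value
console.log(pathMatches('$.a', ['a']));   // true
console.log(pathMatches('$.*', ['b']));   // true: wildcard
console.log(pathMatches('$.a.b', ['a'])); // false: depth mismatch
```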
#### Properties & Methods

* **write(token: TokenType, value: any)** pushes data into the token parser.
* **end()** closes the token parser so it cannot be used anymore. Throws an error if the token parser was in the middle of parsing.
* **isEnded** read-only boolean property indicating whether the token parser has ended or is still accepting data.
* **onValue(value: any)** no-op method that the user should override to get the parsed values.
* **onError(err: Error)** no-op method that the user should override to act on errors. If not set, the write method simply throws synchronously.
* **onEnd()** no-op method that the user should override to act when the token parser has ended.

```javascript
// You can override the overridable methods by creating your own class extending TokenParser
class MyTokenParser extends TokenParser {
  onValue(value) {
    // ...
  }
}

const myTokenParser = new MyTokenParser();

// or just overriding it
const tokenParser = new TokenParser();
tokenParser.onValue = (value) => { ... };
```

### JSONParser

A drop-in replacement for `JSON.parse` (with a few ~~breaking changes~~ improvements. See below.).

```javascript
import { JSONParser } from '@streamparser/json';

const parser = new JSONParser();
```

It takes the same options as the tokenizer.

This class is just for convenience. In reality, it simply connects the tokenizer and the token parser:

```javascript
const tokenizer = new Tokenizer(opts);
const tokenParser = new TokenParser();
tokenizer.onToken = tokenParser.write.bind(tokenParser);
tokenParser.onValue = (value) => { /* Process values */ };
```

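The same two-stage wiring can be demonstrated with self-contained stand-in classes. These toy classes are hypothetical (they are NOT the real `Tokenizer`/`TokenParser`); they only show how the callback plumbing composes:

```javascript
// Toy stand-ins illustrating the callback wiring pattern used by JSONParser.
// Hypothetical classes for illustration, not the real Tokenizer/TokenParser.
class ToyTokenizer {
  constructor() { this.onToken = () => {}; }
  write(text) {
    // Emit one token per whitespace-separated word.
    for (const word of text.split(/\s+/).filter(Boolean)) {
      this.onToken({ token: 'WORD', value: word });
    }
  }
}

class ToyTokenParser {
  constructor() { this.onValue = () => {}; this.words = []; }
  write({ value }) {
    this.words.push(value);
    this.onValue({ value: this.words.join(' ') });
  }
}

const toyTokenizer = new ToyTokenizer();
const toyParser = new ToyTokenParser();
toyTokenizer.onToken = toyParser.write.bind(toyParser); // same wiring as above
toyParser.onValue = ({ value }) => console.log(value);

toyTokenizer.write('hello streaming world');
// logs: "hello", then "hello streaming", then "hello streaming world"
```

The point of the pattern is that each stage only knows about its own callback, so the stages can be used independently or swapped out.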
#### Properties & Methods

* **write(data: string|typedArray|buffer)** alias to the Tokenizer write method.
* **end()** alias to the Tokenizer end method.
* **isEnded** read-only boolean property indicating whether the JSONParser has ended or is still accepting data.
* **onToken({ token: TokenType, value: any, offset: number })** alias to the Tokenizer onToken method (write-only).
* **onValue(value: any)** alias to the TokenParser onValue method (write-only).
* **onError(err: Error)** alias to the Tokenizer/TokenParser onError method (write-only).
* **onEnd()** alias to the Tokenizer onEnd method (which will call the TokenParser onEnd method) (write-only).

```javascript
// You can override the overridable methods by creating your own class extending JSONParser
class MyJsonParser extends JSONParser {
  onToken({ token, value, offset }) {
    // ...
  }
  onValue(value) {
    // ...
  }
}

const myJsonParser = new MyJsonParser();

// or just overriding it
const jsonParser = new JSONParser();
jsonParser.onToken = ({ token, value, offset }) => { ... };
jsonParser.onValue = (value) => { ... };
```

## Usage

You can use both components independently:

```javascript
const tokenizer = new Tokenizer(opts);
const tokenParser = new TokenParser();
tokenizer.onToken = tokenParser.write.bind(tokenParser);
```

You push data using the `write` method, which takes a string or an array-like object.

You can subscribe to the resulting data using the `onValue` callback:

```javascript
import { JSONParser } from '@streamparser/json';

const parser = new JSONParser({ stringBufferSize: undefined, paths: ['$'] });
parser.onValue = console.log;

parser.write('"Hello world!"'); // logs "Hello world!"

// Or passing the stream in several chunks
parser.write('"');
parser.write('Hello');
parser.write(' ');
parser.write('world!');
parser.write('"'); // logs "Hello world!"
```

Write is always a synchronous operation, so any error during the parsing of the stream will be thrown during the write operation. After an error, the parser can't continue parsing.

```javascript
import { JSONParser } from '@streamparser/json';

const parser = new JSONParser({ stringBufferSize: undefined });
parser.onValue = console.log;

try {
  parser.write('"""');
} catch (err) {
  console.log(err); // logs the parsing error
}
```

You can also handle errors using callbacks:

```javascript
import { JSONParser } from '@streamparser/json';

const parser = new JSONParser({ stringBufferSize: undefined });
parser.onValue = console.log;
parser.onError = console.error;

parser.write('"""');
```

## Examples

### Stream-parsing a fetch request returning a JSON stream

Imagine an endpoint that sends a large number of JSON objects one after the other (`{"id":1}{"id":2}{"id":3}...`).

```js
import { JSONParser } from '@streamparser/json';

const jsonparser = new JSONParser({ separator: '' });
jsonparser.onValue = ({ value, stack }) => {
  if (stack.length > 0) return; // ignore inner values
  // TODO process element
};

const response = await fetch('http://example.com/');
const reader = response.body.getReader();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  jsonparser.write(value);
}
```

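To make concrete what the parser handles for you here, the following hypothetical, self-contained sketch splits a chunk of such concatenated objects by tracking brace depth and string state. The real parser does this incrementally, token by token, with full validation:

```javascript
// Hypothetical sketch: split concatenated JSON objects ({"id":1}{"id":2}...)
// into complete top-level objects. The real @streamparser/json does this
// incrementally across chunks and with full spec validation.
function splitConcatenatedJson(text) {
  const objects = [];
  let depth = 0, start = 0, inString = false, escaped = false;
  for (let i = 0; i < text.length; i++) {
    const ch = text[i];
    if (inString) {
      // Braces inside strings must not affect the depth count.
      if (escaped) escaped = false;
      else if (ch === '\\') escaped = true;
      else if (ch === '"') inString = false;
    } else if (ch === '"') {
      inString = true;
    } else if (ch === '{') {
      if (depth === 0) start = i;
      depth++;
    } else if (ch === '}') {
      depth--;
      if (depth === 0) objects.push(JSON.parse(text.slice(start, i + 1)));
    }
  }
  return objects;
}

console.log(splitConcatenatedJson('{"id":1}{"id":2}{"id":3}'));
// [ { id: 1 }, { id: 2 }, { id: 3 } ]
```

Unlike this sketch, the streaming parser never needs the whole text in memory and emits values as soon as they are complete.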
### Stream-parsing a fetch request returning a JSON array

Imagine an endpoint that sends a large JSON array (`[{"id":1},{"id":2},{"id":3},...]`).

```js
import { JSONParser } from '@streamparser/json';

const jsonparser = new JSONParser({ stringBufferSize: undefined, paths: ['$.*'] });
jsonparser.onValue = ({ value, key, parent, stack }) => {
  // TODO process element
};

const response = await fetch('http://example.com/');
const reader = response.body.getReader();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  jsonparser.write(value);
}
```

### Stream-parsing a fetch request returning a very long string getting previews of the string

Imagine an endpoint that sends a very long JSON string (`"Once upon a midnight <...>"`).

```js
import { JSONParser } from '@streamparser/json';

const jsonparser = new JSONParser({ emitPartialTokens: true, emitPartialValues: true });
jsonparser.onValue = ({ value, key, parent, stack, partial }) => {
  if (partial) {
    console.log(`Parsing value: ${value}... (still parsing)`);
  } else {
    console.log(`Value parsed: ${value}`);
  }
};

const response = await fetch('http://example.com/');
const reader = response.body.getReader();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  jsonparser.write(value);
}
```

## Migration guide

### Upgrading from 0.10 to 0.11

The arguments of the callbacks have been objectified.

What used to be

```js
jsonparser.onToken = (token, value) => {
  // TODO process token
};
jsonparser.onValue = (value, key, parent, stack) => {
  // TODO process element
};
```

is now:

```js
jsonparser.onToken = ({ token, value }) => {
  // TODO process token
};
jsonparser.onValue = ({ value, key, parent, stack }) => {
  // TODO process element
};
```

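If you have many 0.10-style callbacks, a small adapter can bridge them to the objectified signature. This is a hypothetical helper, not part of the library:

```javascript
// Hypothetical adapter: wrap a 0.10-style positional onValue callback so it
// can be assigned under the 0.11 objectified signature. Not part of the library.
function adaptOnValue(oldCallback) {
  return ({ value, key, parent, stack }) => oldCallback(value, key, parent, stack);
}

// Usage sketch:
// jsonparser.onValue = adaptOnValue((value, key, parent, stack) => { ... });
const adapted = adaptOnValue((value, key) => `${key}=${value}`);
console.log(adapted({ value: 42, key: 'id', parent: {}, stack: [] })); // "id=42"
```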
## License

See [LICENSE.md](../../LICENSE).

[npm-version-badge]: https://badge.fury.io/js/@streamparser%2Fjson.svg
[npm-badge-url]: https://www.npmjs.com/package/@streamparser/json
[npm-downloads-badge]: https://img.shields.io/npm/dm/@streamparser%2Fjson.svg
[build-status-badge]: https://github.com/juanjoDiaz/streamparser-json/actions/workflows/on-push.yaml/badge.svg
[build-status-url]: https://github.com/juanjoDiaz/streamparser-json/actions/workflows/on-push.yaml
[coverage-status-badge]: https://coveralls.io/repos/github/juanjoDiaz/streamparser-json/badge.svg?branch=main
[coverage-status-url]: https://coveralls.io/github/juanjoDiaz/streamparser-json?branch=main
10
dev/env/node_modules/@streamparser/json/dist/cjs/index.d.ts
generated
vendored
Executable file
@@ -0,0 +1,10 @@
export { default as JSONParser, type JSONParserOptions } from "./jsonparser.js";
export { default as Tokenizer, type TokenizerOptions, TokenizerError, } from "./tokenizer.js";
export { default as TokenParser, type TokenParserOptions, TokenParserError, } from "./tokenparser.js";
export * as utf8 from "./utils/utf-8.js";
export * as JsonTypes from "./utils/types/jsonTypes.js";
export * as ParsedTokenInfo from "./utils/types/parsedTokenInfo.js";
export * as ParsedElementInfo from "./utils/types/parsedElementInfo.js";
export { TokenParserMode, type StackElement, } from "./utils/types/stackElement.js";
export { default as TokenType } from "./utils/types/tokenType.js";
//# sourceMappingURL=index.d.ts.map
1
dev/env/node_modules/@streamparser/json/dist/cjs/index.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"index.d.ts","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,OAAO,IAAI,UAAU,EAAE,KAAK,iBAAiB,EAAE,MAAM,iBAAiB,CAAC;AAChF,OAAO,EACL,OAAO,IAAI,SAAS,EACpB,KAAK,gBAAgB,EACrB,cAAc,GACf,MAAM,gBAAgB,CAAC;AACxB,OAAO,EACL,OAAO,IAAI,WAAW,EACtB,KAAK,kBAAkB,EACvB,gBAAgB,GACjB,MAAM,kBAAkB,CAAC;AAE1B,OAAO,KAAK,IAAI,MAAM,kBAAkB,CAAC;AACzC,OAAO,KAAK,SAAS,MAAM,4BAA4B,CAAC;AACxD,OAAO,KAAK,eAAe,MAAM,kCAAkC,CAAC;AACpE,OAAO,KAAK,iBAAiB,MAAM,oCAAoC,CAAC;AACxE,OAAO,EACL,eAAe,EACf,KAAK,YAAY,GAClB,MAAM,+BAA+B,CAAC;AACvC,OAAO,EAAE,OAAO,IAAI,SAAS,EAAE,MAAM,4BAA4B,CAAC"}
46
dev/env/node_modules/@streamparser/json/dist/cjs/index.js
generated
vendored
Executable file
@@ -0,0 +1,46 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
      desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.TokenType = exports.TokenParserMode = exports.ParsedElementInfo = exports.ParsedTokenInfo = exports.JsonTypes = exports.utf8 = exports.TokenParserError = exports.TokenParser = exports.TokenizerError = exports.Tokenizer = exports.JSONParser = void 0;
var jsonparser_js_1 = require("./jsonparser.js");
Object.defineProperty(exports, "JSONParser", { enumerable: true, get: function () { return __importDefault(jsonparser_js_1).default; } });
var tokenizer_js_1 = require("./tokenizer.js");
Object.defineProperty(exports, "Tokenizer", { enumerable: true, get: function () { return __importDefault(tokenizer_js_1).default; } });
Object.defineProperty(exports, "TokenizerError", { enumerable: true, get: function () { return tokenizer_js_1.TokenizerError; } });
var tokenparser_js_1 = require("./tokenparser.js");
Object.defineProperty(exports, "TokenParser", { enumerable: true, get: function () { return __importDefault(tokenparser_js_1).default; } });
Object.defineProperty(exports, "TokenParserError", { enumerable: true, get: function () { return tokenparser_js_1.TokenParserError; } });
exports.utf8 = __importStar(require("./utils/utf-8.js"));
exports.JsonTypes = __importStar(require("./utils/types/jsonTypes.js"));
exports.ParsedTokenInfo = __importStar(require("./utils/types/parsedTokenInfo.js"));
exports.ParsedElementInfo = __importStar(require("./utils/types/parsedElementInfo.js"));
var stackElement_js_1 = require("./utils/types/stackElement.js");
Object.defineProperty(exports, "TokenParserMode", { enumerable: true, get: function () { return stackElement_js_1.TokenParserMode; } });
var tokenType_js_1 = require("./utils/types/tokenType.js");
Object.defineProperty(exports, "TokenType", { enumerable: true, get: function () { return __importDefault(tokenType_js_1).default; } });
//# sourceMappingURL=index.js.map
1
dev/env/node_modules/@streamparser/json/dist/cjs/index.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,iDAAgF;AAAvE,4HAAA,OAAO,OAAc;AAC9B,+CAIwB;AAHtB,0HAAA,OAAO,OAAa;AAEpB,8GAAA,cAAc,OAAA;AAEhB,mDAI0B;AAHxB,8HAAA,OAAO,OAAe;AAEtB,kHAAA,gBAAgB,OAAA;AAGlB,yDAAyC;AACzC,wEAAwD;AACxD,oFAAoE;AACpE,wFAAwE;AACxE,iEAGuC;AAFrC,kHAAA,eAAe,OAAA;AAGjB,2DAAkE;AAAzD,0HAAA,OAAO,OAAa"}
19
dev/env/node_modules/@streamparser/json/dist/cjs/jsonparser.d.ts
generated
vendored
Executable file
@@ -0,0 +1,19 @@
import { type TokenizerOptions } from "./tokenizer.js";
import { type TokenParserOptions } from "./tokenparser.js";
import type { ParsedElementInfo } from "./utils/types/parsedElementInfo.js";
import type { ParsedTokenInfo } from "./utils/types/parsedTokenInfo.js";
export interface JSONParserOptions extends TokenizerOptions, TokenParserOptions {
}
export default class JSONParser {
    private tokenizer;
    private tokenParser;
    constructor(opts?: JSONParserOptions);
    get isEnded(): boolean;
    write(input: Iterable<number> | string): void;
    end(): void;
    set onToken(cb: (parsedTokenInfo: ParsedTokenInfo) => void);
    set onValue(cb: (parsedElementInfo: ParsedElementInfo) => void);
    set onError(cb: (err: Error) => void);
    set onEnd(cb: () => void);
}
//# sourceMappingURL=jsonparser.d.ts.map
1
dev/env/node_modules/@streamparser/json/dist/cjs/jsonparser.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"jsonparser.d.ts","sourceRoot":"","sources":["../../src/jsonparser.ts"],"names":[],"mappings":"AAAA,OAAkB,EAAE,KAAK,gBAAgB,EAAE,MAAM,gBAAgB,CAAC;AAClE,OAAoB,EAAE,KAAK,kBAAkB,EAAE,MAAM,kBAAkB,CAAC;AACxE,OAAO,KAAK,EAAE,iBAAiB,EAAE,MAAM,oCAAoC,CAAC;AAC5E,OAAO,KAAK,EAAE,eAAe,EAAE,MAAM,kCAAkC,CAAC;AAExE,MAAM,WAAW,iBACf,SAAQ,gBAAgB,EACtB,kBAAkB;CAAG;AAEzB,MAAM,CAAC,OAAO,OAAO,UAAU;IAC7B,OAAO,CAAC,SAAS,CAAY;IAC7B,OAAO,CAAC,WAAW,CAAc;gBAErB,IAAI,GAAE,iBAAsB;IAexC,IAAW,OAAO,IAAI,OAAO,CAE5B;IAEM,KAAK,CAAC,KAAK,EAAE,QAAQ,CAAC,MAAM,CAAC,GAAG,MAAM,GAAG,IAAI;IAI7C,GAAG,IAAI,IAAI;IAIlB,IAAW,OAAO,CAAC,EAAE,EAAE,CAAC,eAAe,EAAE,eAAe,KAAK,IAAI,EAKhE;IAED,IAAW,OAAO,CAAC,EAAE,EAAE,CAAC,iBAAiB,EAAE,iBAAiB,KAAK,IAAI,EAEpE;IAED,IAAW,OAAO,CAAC,EAAE,EAAE,CAAC,GAAG,EAAE,KAAK,KAAK,IAAI,EAE1C;IAED,IAAW,KAAK,CAAC,EAAE,EAAE,MAAM,IAAI,EAK9B;CACF"}
53
dev/env/node_modules/@streamparser/json/dist/cjs/jsonparser.js
generated
vendored
Executable file
@@ -0,0 +1,53 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
const tokenizer_js_1 = __importDefault(require("./tokenizer.js"));
const tokenparser_js_1 = __importDefault(require("./tokenparser.js"));
class JSONParser {
    constructor(opts = {}) {
        this.tokenizer = new tokenizer_js_1.default(opts);
        this.tokenParser = new tokenparser_js_1.default(opts);
        this.tokenizer.onToken = this.tokenParser.write.bind(this.tokenParser);
        this.tokenizer.onEnd = () => {
            if (!this.tokenParser.isEnded)
                this.tokenParser.end();
        };
        this.tokenParser.onError = this.tokenizer.error.bind(this.tokenizer);
        this.tokenParser.onEnd = () => {
            if (!this.tokenizer.isEnded)
                this.tokenizer.end();
        };
    }
    get isEnded() {
        return this.tokenizer.isEnded && this.tokenParser.isEnded;
    }
    write(input) {
        this.tokenizer.write(input);
    }
    end() {
        this.tokenizer.end();
    }
    set onToken(cb) {
        this.tokenizer.onToken = (parsedToken) => {
            cb(parsedToken);
            this.tokenParser.write(parsedToken);
        };
    }
    set onValue(cb) {
        this.tokenParser.onValue = cb;
    }
    set onError(cb) {
        this.tokenizer.onError = cb;
    }
    set onEnd(cb) {
        this.tokenParser.onEnd = () => {
            if (!this.tokenizer.isEnded)
                this.tokenizer.end();
            cb.call(this.tokenParser);
        };
    }
}
exports.default = JSONParser;
//# sourceMappingURL=jsonparser.js.map
1
dev/env/node_modules/@streamparser/json/dist/cjs/jsonparser.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"jsonparser.js","sourceRoot":"","sources":["../../src/jsonparser.ts"],"names":[],"mappings":";;;;;AAAA,kEAAkE;AAClE,sEAAwE;AAQxE,MAAqB,UAAU;IAI7B,YAAY,OAA0B,EAAE;QACtC,IAAI,CAAC,SAAS,GAAG,IAAI,sBAAS,CAAC,IAAI,CAAC,CAAC;QACrC,IAAI,CAAC,WAAW,GAAG,IAAI,wBAAW,CAAC,IAAI,CAAC,CAAC;QAEzC,IAAI,CAAC,SAAS,CAAC,OAAO,GAAG,IAAI,CAAC,WAAW,CAAC,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,WAAW,CAAC,CAAC;QACvE,IAAI,CAAC,SAAS,CAAC,KAAK,GAAG,GAAG,EAAE;YAC1B,IAAI,CAAC,IAAI,CAAC,WAAW,CAAC,OAAO;gBAAE,IAAI,CAAC,WAAW,CAAC,GAAG,EAAE,CAAC;QACxD,CAAC,CAAC;QAEF,IAAI,CAAC,WAAW,CAAC,OAAO,GAAG,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;QACrE,IAAI,CAAC,WAAW,CAAC,KAAK,GAAG,GAAG,EAAE;YAC5B,IAAI,CAAC,IAAI,CAAC,SAAS,CAAC,OAAO;gBAAE,IAAI,CAAC,SAAS,CAAC,GAAG,EAAE,CAAC;QACpD,CAAC,CAAC;IACJ,CAAC;IAED,IAAW,OAAO;QAChB,OAAO,IAAI,CAAC,SAAS,CAAC,OAAO,IAAI,IAAI,CAAC,WAAW,CAAC,OAAO,CAAC;IAC5D,CAAC;IAEM,KAAK,CAAC,KAAgC;QAC3C,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC;IAC9B,CAAC;IAEM,GAAG;QACR,IAAI,CAAC,SAAS,CAAC,GAAG,EAAE,CAAC;IACvB,CAAC;IAED,IAAW,OAAO,CAAC,EAA8C;QAC/D,IAAI,CAAC,SAAS,CAAC,OAAO,GAAG,CAAC,WAAW,EAAE,EAAE;YACvC,EAAE,CAAC,WAAW,CAAC,CAAC;YAChB,IAAI,CAAC,WAAW,CAAC,KAAK,CAAC,WAAW,CAAC,CAAC;QACtC,CAAC,CAAC;IACJ,CAAC;IAED,IAAW,OAAO,CAAC,EAAkD;QACnE,IAAI,CAAC,WAAW,CAAC,OAAO,GAAG,EAAE,CAAC;IAChC,CAAC;IAED,IAAW,OAAO,CAAC,EAAwB;QACzC,IAAI,CAAC,SAAS,CAAC,OAAO,GAAG,EAAE,CAAC;IAC9B,CAAC;IAED,IAAW,KAAK,CAAC,EAAc;QAC7B,IAAI,CAAC,WAAW,CAAC,KAAK,GAAG,GAAG,EAAE;YAC5B,IAAI,CAAC,IAAI,CAAC,SAAS,CAAC,OAAO;gBAAE,IAAI,CAAC,SAAS,CAAC,GAAG,EAAE,CAAC;YAClD,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,WAAW,CAAC,CAAC;QAC5B,CAAC,CAAC;IACJ,CAAC;CACF;AApDD,6BAoDC"}
3
dev/env/node_modules/@streamparser/json/dist/cjs/package.json
generated
vendored
Executable file
@@ -0,0 +1,3 @@
{
  "type": "commonjs"
}
40
dev/env/node_modules/@streamparser/json/dist/cjs/tokenizer.d.ts
generated
vendored
Executable file
@@ -0,0 +1,40 @@
import type { ParsedTokenInfo } from "./utils/types/parsedTokenInfo.js";
export interface TokenizerOptions {
    stringBufferSize?: number;
    numberBufferSize?: number;
    separator?: string;
    emitPartialTokens?: boolean;
}
export declare class TokenizerError extends Error {
    constructor(message: string);
}
export default class Tokenizer {
    private state;
    private bom?;
    private bomIndex;
    private emitPartialTokens;
    private separator?;
    private separatorBytes?;
    private separatorIndex;
    private escapedCharsByteLength;
    private bufferedString;
    private bufferedNumber;
    private unicode?;
    private highSurrogate?;
    private bytes_remaining;
    private bytes_in_sequence;
    private char_split_buffer;
    private encoder;
    private offset;
    constructor(opts?: TokenizerOptions);
    get isEnded(): boolean;
    write(input: Iterable<number> | string): void;
    private emitNumber;
    protected parseNumber(numberStr: string): number;
    error(err: Error): void;
    end(): void;
    onToken(parsedToken: ParsedTokenInfo): void;
    onError(err: Error): void;
    onEnd(): void;
}
//# sourceMappingURL=tokenizer.d.ts.map
1
dev/env/node_modules/@streamparser/json/dist/cjs/tokenizer.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenizer.d.ts","sourceRoot":"","sources":["../../src/tokenizer.ts"],"names":[],"mappings":"AAOA,OAAO,KAAK,EAAE,eAAe,EAAE,MAAM,kCAAkC,CAAC;AAyExE,MAAM,WAAW,gBAAgB;IAC/B,gBAAgB,CAAC,EAAE,MAAM,CAAC;IAC1B,gBAAgB,CAAC,EAAE,MAAM,CAAC;IAC1B,SAAS,CAAC,EAAE,MAAM,CAAC;IACnB,iBAAiB,CAAC,EAAE,OAAO,CAAC;CAC7B;AASD,qBAAa,cAAe,SAAQ,KAAK;gBAC3B,OAAO,EAAE,MAAM;CAK5B;AAED,MAAM,CAAC,OAAO,OAAO,SAAS;IAC5B,OAAO,CAAC,KAAK,CAAgC;IAE7C,OAAO,CAAC,GAAG,CAAC,CAAW;IACvB,OAAO,CAAC,QAAQ,CAAK;IAErB,OAAO,CAAC,iBAAiB,CAAU;IACnC,OAAO,CAAC,SAAS,CAAC,CAAS;IAC3B,OAAO,CAAC,cAAc,CAAC,CAAa;IACpC,OAAO,CAAC,cAAc,CAAK;IAC3B,OAAO,CAAC,sBAAsB,CAAK;IACnC,OAAO,CAAC,cAAc,CAAgB;IACtC,OAAO,CAAC,cAAc,CAAgB;IAEtC,OAAO,CAAC,OAAO,CAAC,CAAS;IACzB,OAAO,CAAC,aAAa,CAAC,CAAS;IAC/B,OAAO,CAAC,eAAe,CAAK;IAC5B,OAAO,CAAC,iBAAiB,CAAK;IAC9B,OAAO,CAAC,iBAAiB,CAAqB;IAC9C,OAAO,CAAC,OAAO,CAAqB;IACpC,OAAO,CAAC,MAAM,CAAM;gBAER,IAAI,CAAC,EAAE,gBAAgB;IAmBnC,IAAW,OAAO,IAAI,OAAO,CAE5B;IAEM,KAAK,CAAC,KAAK,EAAE,QAAQ,CAAC,MAAM,CAAC,GAAG,MAAM,GAAG,IAAI;IA8nBpD,OAAO,CAAC,UAAU;IASlB,SAAS,CAAC,WAAW,CAAC,SAAS,EAAE,MAAM,GAAG,MAAM;IAIzC,KAAK,CAAC,GAAG,EAAE,KAAK,GAAG,IAAI;IAQvB,GAAG,IAAI,IAAI;IA6BX,OAAO,CAAC,WAAW,EAAE,eAAe,GAAG,IAAI;IAO3C,OAAO,CAAC,GAAG,EAAE,KAAK,GAAG,IAAI;IAKzB,KAAK,IAAI,IAAI;CAGrB"}
742
dev/env/node_modules/@streamparser/json/dist/cjs/tokenizer.js
generated
vendored
Executable file
@@ -0,0 +1,742 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.TokenizerError = void 0;
const utf_8_js_1 = require("./utils/utf-8.js");
const bufferedString_js_1 = require("./utils/bufferedString.js");
const tokenType_js_1 = __importDefault(require("./utils/types/tokenType.js"));
// Tokenizer States
var TokenizerStates;
(function (TokenizerStates) {
    TokenizerStates[TokenizerStates["START"] = 0] = "START";
    TokenizerStates[TokenizerStates["ENDED"] = 1] = "ENDED";
    TokenizerStates[TokenizerStates["ERROR"] = 2] = "ERROR";
    TokenizerStates[TokenizerStates["TRUE1"] = 3] = "TRUE1";
    TokenizerStates[TokenizerStates["TRUE2"] = 4] = "TRUE2";
    TokenizerStates[TokenizerStates["TRUE3"] = 5] = "TRUE3";
    TokenizerStates[TokenizerStates["FALSE1"] = 6] = "FALSE1";
    TokenizerStates[TokenizerStates["FALSE2"] = 7] = "FALSE2";
    TokenizerStates[TokenizerStates["FALSE3"] = 8] = "FALSE3";
    TokenizerStates[TokenizerStates["FALSE4"] = 9] = "FALSE4";
    TokenizerStates[TokenizerStates["NULL1"] = 10] = "NULL1";
    TokenizerStates[TokenizerStates["NULL2"] = 11] = "NULL2";
    TokenizerStates[TokenizerStates["NULL3"] = 12] = "NULL3";
    TokenizerStates[TokenizerStates["STRING_DEFAULT"] = 13] = "STRING_DEFAULT";
    TokenizerStates[TokenizerStates["STRING_AFTER_BACKSLASH"] = 14] = "STRING_AFTER_BACKSLASH";
    TokenizerStates[TokenizerStates["STRING_UNICODE_DIGIT_1"] = 15] = "STRING_UNICODE_DIGIT_1";
    TokenizerStates[TokenizerStates["STRING_UNICODE_DIGIT_2"] = 16] = "STRING_UNICODE_DIGIT_2";
    TokenizerStates[TokenizerStates["STRING_UNICODE_DIGIT_3"] = 17] = "STRING_UNICODE_DIGIT_3";
    TokenizerStates[TokenizerStates["STRING_UNICODE_DIGIT_4"] = 18] = "STRING_UNICODE_DIGIT_4";
    TokenizerStates[TokenizerStates["STRING_INCOMPLETE_CHAR"] = 19] = "STRING_INCOMPLETE_CHAR";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_INITIAL_MINUS"] = 20] = "NUMBER_AFTER_INITIAL_MINUS";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_INITIAL_ZERO"] = 21] = "NUMBER_AFTER_INITIAL_ZERO";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_INITIAL_NON_ZERO"] = 22] = "NUMBER_AFTER_INITIAL_NON_ZERO";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_FULL_STOP"] = 23] = "NUMBER_AFTER_FULL_STOP";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_DECIMAL"] = 24] = "NUMBER_AFTER_DECIMAL";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_E"] = 25] = "NUMBER_AFTER_E";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_E_AND_SIGN"] = 26] = "NUMBER_AFTER_E_AND_SIGN";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_E_AND_DIGIT"] = 27] = "NUMBER_AFTER_E_AND_DIGIT";
    TokenizerStates[TokenizerStates["SEPARATOR"] = 28] = "SEPARATOR";
    TokenizerStates[TokenizerStates["BOM_OR_START"] = 29] = "BOM_OR_START";
    TokenizerStates[TokenizerStates["BOM"] = 30] = "BOM";
})(TokenizerStates || (TokenizerStates = {}));
function TokenizerStateToString(tokenizerState) {
    return [
        "START",
        "ENDED",
        "ERROR",
        "TRUE1",
        "TRUE2",
        "TRUE3",
        "FALSE1",
        "FALSE2",
        "FALSE3",
        "FALSE4",
        "NULL1",
        "NULL2",
        "NULL3",
        "STRING_DEFAULT",
        "STRING_AFTER_BACKSLASH",
        "STRING_UNICODE_DIGIT_1",
        "STRING_UNICODE_DIGIT_2",
        "STRING_UNICODE_DIGIT_3",
        "STRING_UNICODE_DIGIT_4",
        "STRING_INCOMPLETE_CHAR",
        "NUMBER_AFTER_INITIAL_MINUS",
        "NUMBER_AFTER_INITIAL_ZERO",
        "NUMBER_AFTER_INITIAL_NON_ZERO",
        "NUMBER_AFTER_FULL_STOP",
        "NUMBER_AFTER_DECIMAL",
        "NUMBER_AFTER_E",
        "NUMBER_AFTER_E_AND_SIGN",
        "NUMBER_AFTER_E_AND_DIGIT",
        "SEPARATOR",
        "BOM_OR_START",
        "BOM",
    ][tokenizerState];
}
const defaultOpts = {
    stringBufferSize: 0,
    numberBufferSize: 0,
    separator: undefined,
    emitPartialTokens: false,
};
class TokenizerError extends Error {
    constructor(message) {
        super(message);
        // Typescript is broken. This is a workaround
        Object.setPrototypeOf(this, TokenizerError.prototype);
    }
}
exports.TokenizerError = TokenizerError;
class Tokenizer {
    constructor(opts) {
        this.state = 29 /* TokenizerStates.BOM_OR_START */;
        this.bomIndex = 0;
        this.separatorIndex = 0;
        this.escapedCharsByteLength = 0;
        this.bytes_remaining = 0; // number of bytes remaining in multi byte utf8 char to read after split boundary
        this.bytes_in_sequence = 0; // bytes in multi byte utf8 char to read
        this.char_split_buffer = new Uint8Array(4); // for rebuilding chars split before boundary is reached
        this.encoder = new TextEncoder();
        this.offset = -1;
        opts = Object.assign(Object.assign({}, defaultOpts), opts);
        this.emitPartialTokens = opts.emitPartialTokens === true;
        this.bufferedString =
            opts.stringBufferSize && opts.stringBufferSize > 4
                ? new bufferedString_js_1.BufferedString(opts.stringBufferSize)
                : new bufferedString_js_1.NonBufferedString();
        this.bufferedNumber =
            opts.numberBufferSize && opts.numberBufferSize > 0
                ? new bufferedString_js_1.BufferedString(opts.numberBufferSize)
                : new bufferedString_js_1.NonBufferedString();
        this.separator = opts.separator;
        this.separatorBytes = opts.separator
            ? this.encoder.encode(opts.separator)
            : undefined;
    }
    get isEnded() {
        return this.state === 1 /* TokenizerStates.ENDED */;
    }
    write(input) {
        try {
            let buffer;
            if (input instanceof Uint8Array) {
                buffer = input;
            }
            else if (typeof input === "string") {
                buffer = this.encoder.encode(input);
            }
            else if (Array.isArray(input)) {
                buffer = Uint8Array.from(input);
            }
            else if (ArrayBuffer.isView(input)) {
                buffer = new Uint8Array(input.buffer, input.byteOffset, input.byteLength);
            }
            else {
                throw new TypeError("Unexpected type. The `write` function only accepts Arrays, TypedArrays and Strings.");
            }
            for (let i = 0; i < buffer.length; i += 1) {
                const n = buffer[i]; // get current byte from buffer
                switch (this.state) {
                    // @ts-expect-error fall through case
                    case 29 /* TokenizerStates.BOM_OR_START */:
                        if (input instanceof Uint8Array && n === 0xef) {
                            this.bom = [0xef, 0xbb, 0xbf];
                            this.bomIndex += 1;
                            this.state = 30 /* TokenizerStates.BOM */;
                            continue;
                        }
                        if (input instanceof Uint16Array) {
                            if (n === 0xfe) {
                                this.bom = [0xfe, 0xff];
                                this.bomIndex += 1;
                                this.state = 30 /* TokenizerStates.BOM */;
                                continue;
                            }
                            if (n === 0xff) {
                                this.bom = [0xff, 0xfe];
                                this.bomIndex += 1;
                                this.state = 30 /* TokenizerStates.BOM */;
                                continue;
                            }
                        }
                        if (input instanceof Uint32Array) {
                            if (n === 0x00) {
                                this.bom = [0x00, 0x00, 0xfe, 0xff];
                                this.bomIndex += 1;
                                this.state = 30 /* TokenizerStates.BOM */;
                                continue;
                            }
                            if (n === 0xff) {
                                this.bom = [0xff, 0xfe, 0x00, 0x00];
                                this.bomIndex += 1;
                                this.state = 30 /* TokenizerStates.BOM */;
                                continue;
                            }
                        }
                    // eslint-disable-next-line no-fallthrough
                    case 0 /* TokenizerStates.START */:
                        this.offset += 1;
                        if (this.separatorBytes && n === this.separatorBytes[0]) {
                            if (this.separatorBytes.length === 1) {
                                this.state = 0 /* TokenizerStates.START */;
                                this.onToken({
                                    token: tokenType_js_1.default.SEPARATOR,
                                    value: this.separator,
                                    offset: this.offset + this.separatorBytes.length - 1,
                                });
                                continue;
                            }
                            this.state = 28 /* TokenizerStates.SEPARATOR */;
                            continue;
                        }
                        if (n === 32 /* charset.SPACE */ ||
                            n === 10 /* charset.NEWLINE */ ||
                            n === 13 /* charset.CARRIAGE_RETURN */ ||
                            n === 9 /* charset.TAB */) {
                            // whitespace
                            continue;
                        }
                        if (n === 123 /* charset.LEFT_CURLY_BRACKET */) {
                            this.onToken({
                                token: tokenType_js_1.default.LEFT_BRACE,
                                value: "{",
                                offset: this.offset,
                            });
                            continue;
                        }
                        if (n === 125 /* charset.RIGHT_CURLY_BRACKET */) {
                            this.onToken({
                                token: tokenType_js_1.default.RIGHT_BRACE,
                                value: "}",
                                offset: this.offset,
                            });
                            continue;
                        }
                        if (n === 91 /* charset.LEFT_SQUARE_BRACKET */) {
                            this.onToken({
                                token: tokenType_js_1.default.LEFT_BRACKET,
                                value: "[",
                                offset: this.offset,
                            });
                            continue;
                        }
                        if (n === 93 /* charset.RIGHT_SQUARE_BRACKET */) {
                            this.onToken({
                                token: tokenType_js_1.default.RIGHT_BRACKET,
                                value: "]",
                                offset: this.offset,
                            });
                            continue;
                        }
                        if (n === 58 /* charset.COLON */) {
                            this.onToken({
                                token: tokenType_js_1.default.COLON,
                                value: ":",
                                offset: this.offset,
                            });
                            continue;
                        }
                        if (n === 44 /* charset.COMMA */) {
                            this.onToken({
                                token: tokenType_js_1.default.COMMA,
                                value: ",",
                                offset: this.offset,
                            });
                            continue;
                        }
                        if (n === 116 /* charset.LATIN_SMALL_LETTER_T */) {
                            this.state = 3 /* TokenizerStates.TRUE1 */;
                            continue;
                        }
                        if (n === 102 /* charset.LATIN_SMALL_LETTER_F */) {
                            this.state = 6 /* TokenizerStates.FALSE1 */;
                            continue;
                        }
                        if (n === 110 /* charset.LATIN_SMALL_LETTER_N */) {
                            this.state = 10 /* TokenizerStates.NULL1 */;
                            continue;
                        }
                        if (n === 34 /* charset.QUOTATION_MARK */) {
                            this.bufferedString.reset();
                            this.escapedCharsByteLength = 0;
                            this.state = 13 /* TokenizerStates.STRING_DEFAULT */;
                            continue;
                        }
                        if (n >= 49 /* charset.DIGIT_ONE */ && n <= 57 /* charset.DIGIT_NINE */) {
                            this.bufferedNumber.reset();
                            this.bufferedNumber.appendChar(n);
                            this.state = 22 /* TokenizerStates.NUMBER_AFTER_INITIAL_NON_ZERO */;
                            continue;
                        }
                        if (n === 48 /* charset.DIGIT_ZERO */) {
                            this.bufferedNumber.reset();
                            this.bufferedNumber.appendChar(n);
                            this.state = 21 /* TokenizerStates.NUMBER_AFTER_INITIAL_ZERO */;
                            continue;
                        }
                        if (n === 45 /* charset.HYPHEN_MINUS */) {
                            this.bufferedNumber.reset();
                            this.bufferedNumber.appendChar(n);
                            this.state = 20 /* TokenizerStates.NUMBER_AFTER_INITIAL_MINUS */;
                            continue;
                        }
                        break;
                    // STRING
                    case 13 /* TokenizerStates.STRING_DEFAULT */:
                        if (n === 34 /* charset.QUOTATION_MARK */) {
                            const string = this.bufferedString.toString();
                            this.state = 0 /* TokenizerStates.START */;
                            this.onToken({
                                token: tokenType_js_1.default.STRING,
                                value: string,
                                offset: this.offset,
                            });
                            this.offset +=
                                this.escapedCharsByteLength +
                                    this.bufferedString.byteLength +
                                    1;
                            continue;
                        }
                        if (n === 92 /* charset.REVERSE_SOLIDUS */) {
                            this.state = 14 /* TokenizerStates.STRING_AFTER_BACKSLASH */;
                            continue;
                        }
                        if (n >= 128) {
                            // Parse multi byte (>=128) chars one at a time
                            if (n >= 194 && n <= 223) {
                                this.bytes_in_sequence = 2;
                            }
                            else if (n <= 239) {
                                this.bytes_in_sequence = 3;
                            }
                            else {
                                this.bytes_in_sequence = 4;
                            }
                            if (this.bytes_in_sequence <= buffer.length - i) {
                                // if bytes needed to complete char fall outside buffer length, we have a boundary split
                                this.bufferedString.appendBuf(buffer, i, i + this.bytes_in_sequence);
                                i += this.bytes_in_sequence - 1;
                                continue;
                            }
                            this.bytes_remaining = i + this.bytes_in_sequence - buffer.length;
                            this.char_split_buffer.set(buffer.subarray(i));
                            i = buffer.length - 1;
                            this.state = 19 /* TokenizerStates.STRING_INCOMPLETE_CHAR */;
                            continue;
                        }
                        if (n >= 32 /* charset.SPACE */) {
                            this.bufferedString.appendChar(n);
                            continue;
                        }
                        break;
                    case 19 /* TokenizerStates.STRING_INCOMPLETE_CHAR */:
                        // check for carry over of a multi byte char split between data chunks
                        // & fill temp buffer it with start of this data chunk up to the boundary limit set in the last iteration
                        this.char_split_buffer.set(buffer.subarray(i, i + this.bytes_remaining), this.bytes_in_sequence - this.bytes_remaining);
                        this.bufferedString.appendBuf(this.char_split_buffer, 0, this.bytes_in_sequence);
                        i = this.bytes_remaining - 1;
                        this.state = 13 /* TokenizerStates.STRING_DEFAULT */;
                        continue;
                    case 14 /* TokenizerStates.STRING_AFTER_BACKSLASH */:
                        // eslint-disable-next-line no-case-declarations
                        const controlChar = utf_8_js_1.escapedSequences[n];
                        if (controlChar) {
                            this.bufferedString.appendChar(controlChar);
                            this.escapedCharsByteLength += 1; // len(\")=2 minus the fact you're appending len(controlChar)=1
                            this.state = 13 /* TokenizerStates.STRING_DEFAULT */;
                            continue;
                        }
                        if (n === 117 /* charset.LATIN_SMALL_LETTER_U */) {
                            this.unicode = "";
                            this.state = 15 /* TokenizerStates.STRING_UNICODE_DIGIT_1 */;
                            continue;
                        }
                        break;
                    case 15 /* TokenizerStates.STRING_UNICODE_DIGIT_1 */:
                    case 16 /* TokenizerStates.STRING_UNICODE_DIGIT_2 */:
                    case 17 /* TokenizerStates.STRING_UNICODE_DIGIT_3 */:
                        if ((n >= 48 /* charset.DIGIT_ZERO */ && n <= 57 /* charset.DIGIT_NINE */) ||
                            (n >= 65 /* charset.LATIN_CAPITAL_LETTER_A */ &&
                                n <= 70 /* charset.LATIN_CAPITAL_LETTER_F */) ||
                            (n >= 97 /* charset.LATIN_SMALL_LETTER_A */ &&
                                n <= 102 /* charset.LATIN_SMALL_LETTER_F */)) {
                            this.unicode += String.fromCharCode(n);
                            this.state += 1;
                            continue;
                        }
                        break;
                    case 18 /* TokenizerStates.STRING_UNICODE_DIGIT_4 */:
                        if ((n >= 48 /* charset.DIGIT_ZERO */ && n <= 57 /* charset.DIGIT_NINE */) ||
                            (n >= 65 /* charset.LATIN_CAPITAL_LETTER_A */ &&
                                n <= 70 /* charset.LATIN_CAPITAL_LETTER_F */) ||
                            (n >= 97 /* charset.LATIN_SMALL_LETTER_A */ &&
                                n <= 102 /* charset.LATIN_SMALL_LETTER_F */)) {
                            const intVal = parseInt(this.unicode + String.fromCharCode(n), 16);
                            let unicodeString;
                            if (this.highSurrogate === undefined) {
                                if (intVal >= 0xd800 && intVal <= 0xdbff) {
                                    //<55296,56319> - highSurrogate
                                    this.highSurrogate = intVal;
                                    this.state = 13 /* TokenizerStates.STRING_DEFAULT */;
                                    continue;
                                }
                                else {
                                    unicodeString = String.fromCharCode(intVal);
                                }
                            }
                            else {
                                if (intVal >= 0xdc00 && intVal <= 0xdfff) {
                                    //<56320,57343> - lowSurrogate
                                    unicodeString = String.fromCharCode(this.highSurrogate, intVal);
                                }
                                else {
                                    unicodeString = String.fromCharCode(this.highSurrogate);
                                }
                                this.highSurrogate = undefined;
                            }
                            const unicodeBuffer = this.encoder.encode(unicodeString);
                            this.bufferedString.appendBuf(unicodeBuffer);
                            // len(\u0000)=6 minus the fact you're appending len(buf)
                            this.escapedCharsByteLength += 6 - unicodeBuffer.byteLength;
                            this.state = 13 /* TokenizerStates.STRING_DEFAULT */;
                            continue;
                        }
                        break;
                    // Number
                    case 20 /* TokenizerStates.NUMBER_AFTER_INITIAL_MINUS */:
                        if (n === 48 /* charset.DIGIT_ZERO */) {
                            this.bufferedNumber.appendChar(n);
                            this.state = 21 /* TokenizerStates.NUMBER_AFTER_INITIAL_ZERO */;
                            continue;
                        }
                        if (n >= 49 /* charset.DIGIT_ONE */ && n <= 57 /* charset.DIGIT_NINE */) {
                            this.bufferedNumber.appendChar(n);
                            this.state = 22 /* TokenizerStates.NUMBER_AFTER_INITIAL_NON_ZERO */;
                            continue;
                        }
                        break;
                    case 21 /* TokenizerStates.NUMBER_AFTER_INITIAL_ZERO */:
                        if (n === 46 /* charset.FULL_STOP */) {
                            this.bufferedNumber.appendChar(n);
                            this.state = 23 /* TokenizerStates.NUMBER_AFTER_FULL_STOP */;
                            continue;
                        }
                        if (n === 101 /* charset.LATIN_SMALL_LETTER_E */ ||
                            n === 69 /* charset.LATIN_CAPITAL_LETTER_E */) {
                            this.bufferedNumber.appendChar(n);
                            this.state = 25 /* TokenizerStates.NUMBER_AFTER_E */;
                            continue;
                        }
                        i -= 1;
                        this.state = 0 /* TokenizerStates.START */;
                        this.emitNumber();
                        continue;
                    case 22 /* TokenizerStates.NUMBER_AFTER_INITIAL_NON_ZERO */:
                        if (n >= 48 /* charset.DIGIT_ZERO */ && n <= 57 /* charset.DIGIT_NINE */) {
                            this.bufferedNumber.appendChar(n);
                            continue;
                        }
                        if (n === 46 /* charset.FULL_STOP */) {
                            this.bufferedNumber.appendChar(n);
                            this.state = 23 /* TokenizerStates.NUMBER_AFTER_FULL_STOP */;
                            continue;
                        }
                        if (n === 101 /* charset.LATIN_SMALL_LETTER_E */ ||
                            n === 69 /* charset.LATIN_CAPITAL_LETTER_E */) {
                            this.bufferedNumber.appendChar(n);
                            this.state = 25 /* TokenizerStates.NUMBER_AFTER_E */;
                            continue;
                        }
                        i -= 1;
                        this.state = 0 /* TokenizerStates.START */;
                        this.emitNumber();
                        continue;
                    case 23 /* TokenizerStates.NUMBER_AFTER_FULL_STOP */:
                        if (n >= 48 /* charset.DIGIT_ZERO */ && n <= 57 /* charset.DIGIT_NINE */) {
                            this.bufferedNumber.appendChar(n);
                            this.state = 24 /* TokenizerStates.NUMBER_AFTER_DECIMAL */;
                            continue;
                        }
                        break;
                    case 24 /* TokenizerStates.NUMBER_AFTER_DECIMAL */:
                        if (n >= 48 /* charset.DIGIT_ZERO */ && n <= 57 /* charset.DIGIT_NINE */) {
                            this.bufferedNumber.appendChar(n);
                            continue;
                        }
                        if (n === 101 /* charset.LATIN_SMALL_LETTER_E */ ||
                            n === 69 /* charset.LATIN_CAPITAL_LETTER_E */) {
                            this.bufferedNumber.appendChar(n);
                            this.state = 25 /* TokenizerStates.NUMBER_AFTER_E */;
                            continue;
                        }
                        i -= 1;
                        this.state = 0 /* TokenizerStates.START */;
                        this.emitNumber();
                        continue;
                    // @ts-expect-error fall through case
                    case 25 /* TokenizerStates.NUMBER_AFTER_E */:
                        if (n === 43 /* charset.PLUS_SIGN */ || n === 45 /* charset.HYPHEN_MINUS */) {
                            this.bufferedNumber.appendChar(n);
                            this.state = 26 /* TokenizerStates.NUMBER_AFTER_E_AND_SIGN */;
                            continue;
                        }
                    // eslint-disable-next-line no-fallthrough
                    case 26 /* TokenizerStates.NUMBER_AFTER_E_AND_SIGN */:
                        if (n >= 48 /* charset.DIGIT_ZERO */ && n <= 57 /* charset.DIGIT_NINE */) {
                            this.bufferedNumber.appendChar(n);
                            this.state = 27 /* TokenizerStates.NUMBER_AFTER_E_AND_DIGIT */;
                            continue;
                        }
                        break;
                    case 27 /* TokenizerStates.NUMBER_AFTER_E_AND_DIGIT */:
                        if (n >= 48 /* charset.DIGIT_ZERO */ && n <= 57 /* charset.DIGIT_NINE */) {
                            this.bufferedNumber.appendChar(n);
                            continue;
                        }
                        i -= 1;
                        this.state = 0 /* TokenizerStates.START */;
                        this.emitNumber();
                        continue;
                    // TRUE
                    case 3 /* TokenizerStates.TRUE1 */:
                        if (n === 114 /* charset.LATIN_SMALL_LETTER_R */) {
                            this.state = 4 /* TokenizerStates.TRUE2 */;
                            continue;
                        }
                        break;
                    case 4 /* TokenizerStates.TRUE2 */:
                        if (n === 117 /* charset.LATIN_SMALL_LETTER_U */) {
                            this.state = 5 /* TokenizerStates.TRUE3 */;
                            continue;
                        }
                        break;
                    case 5 /* TokenizerStates.TRUE3 */:
                        if (n === 101 /* charset.LATIN_SMALL_LETTER_E */) {
                            this.state = 0 /* TokenizerStates.START */;
                            this.onToken({
                                token: tokenType_js_1.default.TRUE,
                                value: true,
                                offset: this.offset,
                            });
                            this.offset += 3;
                            continue;
                        }
                        break;
                    // FALSE
                    case 6 /* TokenizerStates.FALSE1 */:
                        if (n === 97 /* charset.LATIN_SMALL_LETTER_A */) {
                            this.state = 7 /* TokenizerStates.FALSE2 */;
                            continue;
                        }
                        break;
                    case 7 /* TokenizerStates.FALSE2 */:
                        if (n === 108 /* charset.LATIN_SMALL_LETTER_L */) {
                            this.state = 8 /* TokenizerStates.FALSE3 */;
                            continue;
                        }
                        break;
                    case 8 /* TokenizerStates.FALSE3 */:
                        if (n === 115 /* charset.LATIN_SMALL_LETTER_S */) {
                            this.state = 9 /* TokenizerStates.FALSE4 */;
                            continue;
                        }
                        break;
                    case 9 /* TokenizerStates.FALSE4 */:
                        if (n === 101 /* charset.LATIN_SMALL_LETTER_E */) {
                            this.state = 0 /* TokenizerStates.START */;
                            this.onToken({
                                token: tokenType_js_1.default.FALSE,
                                value: false,
                                offset: this.offset,
                            });
                            this.offset += 4;
                            continue;
                        }
                        break;
                    // NULL
                    case 10 /* TokenizerStates.NULL1 */:
                        if (n === 117 /* charset.LATIN_SMALL_LETTER_U */) {
                            this.state = 11 /* TokenizerStates.NULL2 */;
                            continue;
                        }
                        break;
                    case 11 /* TokenizerStates.NULL2 */:
                        if (n === 108 /* charset.LATIN_SMALL_LETTER_L */) {
                            this.state = 12 /* TokenizerStates.NULL3 */;
                            continue;
                        }
                        break;
                    case 12 /* TokenizerStates.NULL3 */:
                        if (n === 108 /* charset.LATIN_SMALL_LETTER_L */) {
                            this.state = 0 /* TokenizerStates.START */;
                            this.onToken({
                                token: tokenType_js_1.default.NULL,
                                value: null,
                                offset: this.offset,
                            });
                            this.offset += 3;
                            continue;
                        }
                        break;
                    case 28 /* TokenizerStates.SEPARATOR */:
                        this.separatorIndex += 1;
                        if (!this.separatorBytes ||
                            n !== this.separatorBytes[this.separatorIndex]) {
                            break;
                        }
                        if (this.separatorIndex === this.separatorBytes.length - 1) {
                            this.state = 0 /* TokenizerStates.START */;
                            this.onToken({
                                token: tokenType_js_1.default.SEPARATOR,
                                value: this.separator,
                                offset: this.offset + this.separatorIndex,
                            });
                            this.separatorIndex = 0;
                        }
                        continue;
                    // BOM support
                    case 30 /* TokenizerStates.BOM */:
                        if (n === this.bom[this.bomIndex]) {
                            if (this.bomIndex === this.bom.length - 1) {
                                this.state = 0 /* TokenizerStates.START */;
                                this.bom = undefined;
                                this.bomIndex = 0;
                                continue;
                            }
                            this.bomIndex += 1;
                            continue;
                        }
                        break;
                    case 1 /* TokenizerStates.ENDED */:
                        if (n === 32 /* charset.SPACE */ ||
                            n === 10 /* charset.NEWLINE */ ||
                            n === 13 /* charset.CARRIAGE_RETURN */ ||
                            n === 9 /* charset.TAB */) {
                            // whitespace
                            continue;
                        }
                }
                throw new TokenizerError(`Unexpected "${String.fromCharCode(n)}" at position "${i}" in state ${TokenizerStateToString(this.state)}`);
            }
            if (this.emitPartialTokens) {
                switch (this.state) {
                    case 3 /* TokenizerStates.TRUE1 */:
                    case 4 /* TokenizerStates.TRUE2 */:
                    case 5 /* TokenizerStates.TRUE3 */:
                        this.onToken({
                            token: tokenType_js_1.default.TRUE,
                            value: true,
                            offset: this.offset,
                            partial: true,
                        });
                        break;
                    case 6 /* TokenizerStates.FALSE1 */:
                    case 7 /* TokenizerStates.FALSE2 */:
                    case 8 /* TokenizerStates.FALSE3 */:
                    case 9 /* TokenizerStates.FALSE4 */:
                        this.onToken({
                            token: tokenType_js_1.default.FALSE,
                            value: false,
                            offset: this.offset,
                            partial: true,
                        });
                        break;
                    case 10 /* TokenizerStates.NULL1 */:
                    case 11 /* TokenizerStates.NULL2 */:
                    case 12 /* TokenizerStates.NULL3 */:
                        this.onToken({
                            token: tokenType_js_1.default.NULL,
                            value: null,
                            offset: this.offset,
                            partial: true,
                        });
                        break;
                    case 13 /* TokenizerStates.STRING_DEFAULT */: {
                        const string = this.bufferedString.toString();
                        this.onToken({
                            token: tokenType_js_1.default.STRING,
                            value: string,
                            offset: this.offset,
                            partial: true,
                        });
                        break;
                    }
                    case 21 /* TokenizerStates.NUMBER_AFTER_INITIAL_ZERO */:
                    case 22 /* TokenizerStates.NUMBER_AFTER_INITIAL_NON_ZERO */:
                    case 24 /* TokenizerStates.NUMBER_AFTER_DECIMAL */:
                    case 27 /* TokenizerStates.NUMBER_AFTER_E_AND_DIGIT */:
                        try {
                            this.onToken({
                                token: tokenType_js_1.default.NUMBER,
                                value: this.parseNumber(this.bufferedNumber.toString()),
                                offset: this.offset,
                                partial: true,
                            });
                        }
                        catch (_a) {
                            // Number couldn't be parsed. Do nothing.
                        }
                }
            }
        }
        catch (err) {
            this.error(err);
        }
    }
    emitNumber() {
        this.onToken({
            token: tokenType_js_1.default.NUMBER,
            value: this.parseNumber(this.bufferedNumber.toString()),
            offset: this.offset,
        });
        this.offset += this.bufferedNumber.byteLength - 1;
    }
    parseNumber(numberStr) {
        return Number(numberStr);
    }
    error(err) {
        if (this.state !== 1 /* TokenizerStates.ENDED */) {
            this.state = 2 /* TokenizerStates.ERROR */;
        }
        this.onError(err);
    }
    end() {
        switch (this.state) {
            case 21 /* TokenizerStates.NUMBER_AFTER_INITIAL_ZERO */:
            case 22 /* TokenizerStates.NUMBER_AFTER_INITIAL_NON_ZERO */:
            case 24 /* TokenizerStates.NUMBER_AFTER_DECIMAL */:
            case 27 /* TokenizerStates.NUMBER_AFTER_E_AND_DIGIT */:
                this.state = 1 /* TokenizerStates.ENDED */;
                this.emitNumber();
                this.onEnd();
                break;
            case 29 /* TokenizerStates.BOM_OR_START */:
            case 0 /* TokenizerStates.START */:
            case 2 /* TokenizerStates.ERROR */:
            case 28 /* TokenizerStates.SEPARATOR */:
                this.state = 1 /* TokenizerStates.ENDED */;
                this.onEnd();
                break;
            default:
                this.error(new TokenizerError(`Tokenizer ended in the middle of a token (state: ${TokenizerStateToString(this.state)}). Either not all the data was received or the data was invalid.`));
        }
    }
    // eslint-disable-next-line @typescript-eslint/no-unused-vars
    onToken(parsedToken) {
        // Override me
        throw new TokenizerError('Can\'t emit tokens before the "onToken" callback has been set up.');
    }
    onError(err) {
        // Override me
        throw err;
    }
    onEnd() {
        // Override me
    }
}
exports.default = Tokenizer;
//# sourceMappingURL=tokenizer.js.map
1
dev/env/node_modules/@streamparser/json/dist/cjs/tokenizer.js.map
generated
vendored
Executable file
File diff suppressed because one or more lines are too long
35
dev/env/node_modules/@streamparser/json/dist/cjs/tokenparser.d.ts
generated
vendored
Executable file
@@ -0,0 +1,35 @@
import type { ParsedTokenInfo } from "./utils/types/parsedTokenInfo.js";
import type { ParsedElementInfo } from "./utils/types/parsedElementInfo.js";
export interface TokenParserOptions {
    paths?: string[];
    keepStack?: boolean;
    separator?: string;
    emitPartialValues?: boolean;
}
export declare class TokenParserError extends Error {
    constructor(message: string);
}
export default class TokenParser {
    private readonly paths?;
    private readonly keepStack;
    private readonly separator?;
    private state;
    private mode;
    private key;
    private value;
    private stack;
    constructor(opts?: TokenParserOptions);
    private shouldEmit;
    private push;
    private pop;
    private emit;
    private emitPartial;
    get isEnded(): boolean;
    write({ token, value, partial, }: Omit<ParsedTokenInfo, "offset">): void;
    error(err: Error): void;
    end(): void;
    onValue(parsedElementInfo: ParsedElementInfo): void;
    onError(err: Error): void;
    onEnd(): void;
}
//# sourceMappingURL=tokenparser.d.ts.map
1
dev/env/node_modules/@streamparser/json/dist/cjs/tokenparser.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenparser.d.ts","sourceRoot":"","sources":["../../src/tokenparser.ts"],"names":[],"mappings":"AAaA,OAAO,KAAK,EAAE,eAAe,EAAE,MAAM,kCAAkC,CAAC;AACxE,OAAO,KAAK,EAAE,iBAAiB,EAAE,MAAM,oCAAoC,CAAC;AAmB5E,MAAM,WAAW,kBAAkB;IACjC,KAAK,CAAC,EAAE,MAAM,EAAE,CAAC;IACjB,SAAS,CAAC,EAAE,OAAO,CAAC;IACpB,SAAS,CAAC,EAAE,MAAM,CAAC;IACnB,iBAAiB,CAAC,EAAE,OAAO,CAAC;CAC7B;AASD,qBAAa,gBAAiB,SAAQ,KAAK;gBAC7B,OAAO,EAAE,MAAM;CAK5B;AAED,MAAM,CAAC,OAAO,OAAO,WAAW;IAC9B,OAAO,CAAC,QAAQ,CAAC,KAAK,CAAC,CAA2B;IAClD,OAAO,CAAC,QAAQ,CAAC,SAAS,CAAU;IACpC,OAAO,CAAC,QAAQ,CAAC,SAAS,CAAC,CAAS;IACpC,OAAO,CAAC,KAAK,CAA4C;IACzD,OAAO,CAAC,IAAI,CAA0C;IACtD,OAAO,CAAC,GAAG,CAAsB;IACjC,OAAO,CAAC,KAAK,CAAqC;IAClD,OAAO,CAAC,KAAK,CAAsB;gBAEvB,IAAI,CAAC,EAAE,kBAAkB;IA2BrC,OAAO,CAAC,UAAU;IAoBlB,OAAO,CAAC,IAAI;IASZ,OAAO,CAAC,GAAG;IAiBX,OAAO,CAAC,IAAI;IA6BZ,OAAO,CAAC,WAAW;IAuBnB,IAAW,OAAO,IAAI,OAAO,CAE5B;IAEM,KAAK,CAAC,EACX,KAAK,EACL,KAAK,EACL,OAAO,GACR,EAAE,IAAI,CAAC,eAAe,EAAE,QAAQ,CAAC,GAAG,IAAI;IA8JlC,KAAK,CAAC,GAAG,EAAE,KAAK,GAAG,IAAI;IAQvB,GAAG,IAAI,IAAI;IAoBX,OAAO,CAAC,iBAAiB,EAAE,iBAAiB,GAAG,IAAI;IAOnD,OAAO,CAAC,GAAG,EAAE,KAAK,GAAG,IAAI;IAKzB,KAAK,IAAI,IAAI;CAGrB"}
318
dev/env/node_modules/@streamparser/json/dist/cjs/tokenparser.js
generated
vendored
Executable file
318
dev/env/node_modules/@streamparser/json/dist/cjs/tokenparser.js
generated
vendored
Executable file
@@ -0,0 +1,318 @@
|
||||
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.TokenParserError = void 0;
const tokenType_js_1 = __importDefault(require("./utils/types/tokenType.js"));
// Parser States
var TokenParserState;
(function (TokenParserState) {
    TokenParserState[TokenParserState["VALUE"] = 0] = "VALUE";
    TokenParserState[TokenParserState["KEY"] = 1] = "KEY";
    TokenParserState[TokenParserState["COLON"] = 2] = "COLON";
    TokenParserState[TokenParserState["COMMA"] = 3] = "COMMA";
    TokenParserState[TokenParserState["ENDED"] = 4] = "ENDED";
    TokenParserState[TokenParserState["ERROR"] = 5] = "ERROR";
    TokenParserState[TokenParserState["SEPARATOR"] = 6] = "SEPARATOR";
})(TokenParserState || (TokenParserState = {}));
function TokenParserStateToString(state) {
    return ["VALUE", "KEY", "COLON", "COMMA", "ENDED", "ERROR", "SEPARATOR"][state];
}
const defaultOpts = {
    paths: undefined,
    keepStack: true,
    separator: undefined,
    emitPartialValues: false,
};
class TokenParserError extends Error {
    constructor(message) {
        super(message);
        // Typescript is broken. This is a workaround
        Object.setPrototypeOf(this, TokenParserError.prototype);
    }
}
exports.TokenParserError = TokenParserError;
class TokenParser {
    constructor(opts) {
        this.state = 0 /* TokenParserState.VALUE */;
        this.mode = undefined;
        this.key = undefined;
        this.value = undefined;
        this.stack = [];
        opts = Object.assign(Object.assign({}, defaultOpts), opts);
        if (opts.paths) {
            this.paths = opts.paths.map((path) => {
                if (path === undefined || path === "$*")
                    return undefined;
                if (!path.startsWith("$"))
                    throw new TokenParserError(`Invalid selector "${path}". Should start with "$".`);
                const pathParts = path.split(".").slice(1);
                if (pathParts.includes(""))
                    throw new TokenParserError(`Invalid selector "${path}". ".." syntax not supported.`);
                return pathParts;
            });
        }
        this.keepStack = opts.keepStack || false;
        this.separator = opts.separator;
        if (!opts.emitPartialValues) {
            this.emitPartial = () => { };
        }
    }
    shouldEmit() {
        if (!this.paths)
            return true;
        return this.paths.some((path) => {
            var _a;
            if (path === undefined)
                return true;
            if (path.length !== this.stack.length)
                return false;
            for (let i = 0; i < path.length - 1; i++) {
                const selector = path[i];
                const key = this.stack[i + 1].key;
                if (selector === "*")
                    continue;
                if (selector !== (key === null || key === void 0 ? void 0 : key.toString()))
                    return false;
            }
            const selector = path[path.length - 1];
            if (selector === "*")
                return true;
            return selector === ((_a = this.key) === null || _a === void 0 ? void 0 : _a.toString());
        });
    }
    push() {
        this.stack.push({
            key: this.key,
            value: this.value,
            mode: this.mode,
            emit: this.shouldEmit(),
        });
    }
    pop() {
        const value = this.value;
        let emit;
        ({
            key: this.key,
            value: this.value,
            mode: this.mode,
            emit,
        } = this.stack.pop());
        this.state =
            this.mode !== undefined ? 3 /* TokenParserState.COMMA */ : 0 /* TokenParserState.VALUE */;
        this.emit(value, emit);
    }
    emit(value, emit) {
        if (!this.keepStack &&
            this.value &&
            this.stack.every((item) => !item.emit)) {
            // eslint-disable-next-line @typescript-eslint/no-explicit-any
            delete this.value[this.key];
        }
        if (emit) {
            this.onValue({
                value: value,
                key: this.key,
                parent: this.value,
                stack: this.stack,
            });
        }
        if (this.stack.length === 0) {
            if (this.separator) {
                this.state = 6 /* TokenParserState.SEPARATOR */;
            }
            else if (this.separator === undefined) {
                this.end();
            }
            // else if separator === '', expect next JSON object.
        }
    }
    emitPartial(value) {
        if (!this.shouldEmit())
            return;
        if (this.state === 1 /* TokenParserState.KEY */) {
            this.onValue({
                value: undefined,
                key: value,
                parent: this.value,
                stack: this.stack,
                partial: true,
            });
            return;
        }
        this.onValue({
            value: value,
            key: this.key,
            parent: this.value,
            stack: this.stack,
            partial: true,
        });
    }
    get isEnded() {
        return this.state === 4 /* TokenParserState.ENDED */;
    }
    write({ token, value, partial, }) {
        try {
            if (partial) {
                this.emitPartial(value);
                return;
            }
            if (this.state === 0 /* TokenParserState.VALUE */) {
                if (token === tokenType_js_1.default.STRING ||
                    token === tokenType_js_1.default.NUMBER ||
                    token === tokenType_js_1.default.TRUE ||
                    token === tokenType_js_1.default.FALSE ||
                    token === tokenType_js_1.default.NULL) {
                    if (this.mode === 0 /* TokenParserMode.OBJECT */) {
                        this.value[this.key] = value;
                        this.state = 3 /* TokenParserState.COMMA */;
                    }
                    else if (this.mode === 1 /* TokenParserMode.ARRAY */) {
                        this.value.push(value);
                        this.state = 3 /* TokenParserState.COMMA */;
                    }
                    this.emit(value, this.shouldEmit());
                    return;
                }
                if (token === tokenType_js_1.default.LEFT_BRACE) {
                    this.push();
                    if (this.mode === 0 /* TokenParserMode.OBJECT */) {
                        this.value = this.value[this.key] = {};
                    }
                    else if (this.mode === 1 /* TokenParserMode.ARRAY */) {
                        const val = {};
                        this.value.push(val);
                        this.value = val;
                    }
                    else {
                        this.value = {};
                    }
                    this.mode = 0 /* TokenParserMode.OBJECT */;
                    this.state = 1 /* TokenParserState.KEY */;
                    this.key = undefined;
                    this.emitPartial();
                    return;
                }
                if (token === tokenType_js_1.default.LEFT_BRACKET) {
                    this.push();
                    if (this.mode === 0 /* TokenParserMode.OBJECT */) {
                        this.value = this.value[this.key] = [];
                    }
                    else if (this.mode === 1 /* TokenParserMode.ARRAY */) {
                        const val = [];
                        this.value.push(val);
                        this.value = val;
                    }
                    else {
                        this.value = [];
                    }
                    this.mode = 1 /* TokenParserMode.ARRAY */;
                    this.state = 0 /* TokenParserState.VALUE */;
                    this.key = 0;
                    this.emitPartial();
                    return;
                }
                if (this.mode === 1 /* TokenParserMode.ARRAY */ &&
                    token === tokenType_js_1.default.RIGHT_BRACKET &&
                    this.value.length === 0) {
                    this.pop();
                    return;
                }
            }
            if (this.state === 1 /* TokenParserState.KEY */) {
                if (token === tokenType_js_1.default.STRING) {
                    this.key = value;
                    this.state = 2 /* TokenParserState.COLON */;
                    this.emitPartial();
                    return;
                }
                if (token === tokenType_js_1.default.RIGHT_BRACE &&
                    Object.keys(this.value).length === 0) {
                    this.pop();
                    return;
                }
            }
            if (this.state === 2 /* TokenParserState.COLON */) {
                if (token === tokenType_js_1.default.COLON) {
                    this.state = 0 /* TokenParserState.VALUE */;
                    return;
                }
            }
            if (this.state === 3 /* TokenParserState.COMMA */) {
                if (token === tokenType_js_1.default.COMMA) {
                    if (this.mode === 1 /* TokenParserMode.ARRAY */) {
                        this.state = 0 /* TokenParserState.VALUE */;
                        this.key += 1;
                        return;
                    }
                    /* istanbul ignore else */
                    if (this.mode === 0 /* TokenParserMode.OBJECT */) {
                        this.state = 1 /* TokenParserState.KEY */;
                        return;
                    }
                }
                if ((token === tokenType_js_1.default.RIGHT_BRACE &&
                    this.mode === 0 /* TokenParserMode.OBJECT */) ||
                    (token === tokenType_js_1.default.RIGHT_BRACKET &&
                        this.mode === 1 /* TokenParserMode.ARRAY */)) {
                    this.pop();
                    return;
                }
            }
            if (this.state === 6 /* TokenParserState.SEPARATOR */) {
                if (token === tokenType_js_1.default.SEPARATOR && value === this.separator) {
                    this.state = 0 /* TokenParserState.VALUE */;
                    return;
                }
            }
            // Edge case in which the separator is just whitespace and it's found in the middle of the JSON
            if (token === tokenType_js_1.default.SEPARATOR &&
                this.state !== 6 /* TokenParserState.SEPARATOR */ &&
                Array.from(value)
                    .map((n) => n.charCodeAt(0))
                    .every((n) => n === 32 /* charset.SPACE */ ||
                    n === 10 /* charset.NEWLINE */ ||
                    n === 13 /* charset.CARRIAGE_RETURN */ ||
                    n === 9 /* charset.TAB */)) {
                // whitespace
                return;
            }
            throw new TokenParserError(`Unexpected ${tokenType_js_1.default[token]} (${JSON.stringify(value)}) in state ${TokenParserStateToString(this.state)}`);
        }
        catch (err) {
            this.error(err);
        }
    }
    error(err) {
        if (this.state !== 4 /* TokenParserState.ENDED */) {
            this.state = 5 /* TokenParserState.ERROR */;
        }
        this.onError(err);
    }
    end() {
        if ((this.state !== 0 /* TokenParserState.VALUE */ &&
            this.state !== 6 /* TokenParserState.SEPARATOR */) ||
            this.stack.length > 0) {
            this.error(new Error(`Parser ended in mid-parsing (state: ${TokenParserStateToString(this.state)}). Either not all the data was received or the data was invalid.`));
        }
        else {
            this.state = 4 /* TokenParserState.ENDED */;
            this.onEnd();
        }
    }
    /* eslint-disable-next-line @typescript-eslint/no-unused-vars */
    onValue(parsedElementInfo) {
        // Override me
        throw new TokenParserError('Can\'t emit data before the "onValue" callback has been set up.');
    }
    onError(err) {
        // Override me
        throw err;
    }
    onEnd() {
        // Override me
    }
}
exports.default = TokenParser;
//# sourceMappingURL=tokenparser.js.map
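The vendored `TokenParser` above validates JSONPath-like selectors (`paths`) in its constructor before using them in `shouldEmit`. The sketch below isolates that normalization logic as a standalone function so it can be followed without the rest of the class; the helper name `normalizePath` is hypothetical, not part of the library's API.

```javascript
// Standalone sketch of the selector normalization done in the
// TokenParser constructor: "$*" matches everything, selectors must
// start with "$", and empty path parts ("..") are rejected.
function normalizePath(path) {
  if (path === undefined || path === "$*") return undefined; // wildcard: match all
  if (!path.startsWith("$")) {
    throw new Error(`Invalid selector "${path}". Should start with "$".`);
  }
  const parts = path.split(".").slice(1); // drop the leading "$"
  if (parts.includes("")) {
    throw new Error(`Invalid selector "${path}". ".." syntax not supported.`);
  }
  return parts;
}

console.log(JSON.stringify(normalizePath("$.a.*.b"))); // ["a","*","b"]
```

In `shouldEmit`, each normalized part is then matched positionally against the keys on the parser stack, with `"*"` accepting any key at that depth.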
1 dev/env/node_modules/@streamparser/json/dist/cjs/tokenparser.js.map generated vendored Executable file
File diff suppressed because one or more lines are too long
30 dev/env/node_modules/@streamparser/json/dist/cjs/utils/bufferedString.d.ts generated vendored Executable file
@@ -0,0 +1,30 @@
export interface StringBuilder {
    byteLength: number;
    appendChar: (char: number) => void;
    appendBuf: (buf: Uint8Array, start?: number, end?: number) => void;
    reset: () => void;
    toString: () => string;
}
export declare class NonBufferedString implements StringBuilder {
    private decoder;
    private strings;
    byteLength: number;
    appendChar(char: number): void;
    appendBuf(buf: Uint8Array, start?: number, end?: number): void;
    reset(): void;
    toString(): string;
}
export declare class BufferedString implements StringBuilder {
    private decoder;
    private buffer;
    private bufferOffset;
    private string;
    byteLength: number;
    constructor(bufferSize: number);
    appendChar(char: number): void;
    appendBuf(buf: Uint8Array, start?: number, end?: number): void;
    private flushStringBuffer;
    reset(): void;
    toString(): string;
}
//# sourceMappingURL=bufferedString.d.ts.map
1 dev/env/node_modules/@streamparser/json/dist/cjs/utils/bufferedString.d.ts.map generated vendored Executable file
@@ -0,0 +1 @@
{"version":3,"file":"bufferedString.d.ts","sourceRoot":"","sources":["../../../src/utils/bufferedString.ts"],"names":[],"mappings":"AAAA,MAAM,WAAW,aAAa;IAC5B,UAAU,EAAE,MAAM,CAAC;IACnB,UAAU,EAAE,CAAC,IAAI,EAAE,MAAM,KAAK,IAAI,CAAC;IACnC,SAAS,EAAE,CAAC,GAAG,EAAE,UAAU,EAAE,KAAK,CAAC,EAAE,MAAM,EAAE,GAAG,CAAC,EAAE,MAAM,KAAK,IAAI,CAAC;IACnE,KAAK,EAAE,MAAM,IAAI,CAAC;IAClB,QAAQ,EAAE,MAAM,MAAM,CAAC;CACxB;AAED,qBAAa,iBAAkB,YAAW,aAAa;IACrD,OAAO,CAAC,OAAO,CAA4B;IAC3C,OAAO,CAAC,OAAO,CAAgB;IACxB,UAAU,SAAK;IAEf,UAAU,CAAC,IAAI,EAAE,MAAM,GAAG,IAAI;IAK9B,SAAS,CAAC,GAAG,EAAE,UAAU,EAAE,KAAK,SAAI,EAAE,GAAG,GAAE,MAAmB,GAAG,IAAI;IAKrE,KAAK,IAAI,IAAI;IAKb,QAAQ,IAAI,MAAM;CAG1B;AAED,qBAAa,cAAe,YAAW,aAAa;IAClD,OAAO,CAAC,OAAO,CAA4B;IAC3C,OAAO,CAAC,MAAM,CAAa;IAC3B,OAAO,CAAC,YAAY,CAAK;IACzB,OAAO,CAAC,MAAM,CAAM;IACb,UAAU,SAAK;gBAEH,UAAU,EAAE,MAAM;IAI9B,UAAU,CAAC,IAAI,EAAE,MAAM,GAAG,IAAI;IAM9B,SAAS,CAAC,GAAG,EAAE,UAAU,EAAE,KAAK,SAAI,EAAE,GAAG,GAAE,MAAmB,GAAG,IAAI;IAQ5E,OAAO,CAAC,iBAAiB;IAOlB,KAAK,IAAI,IAAI;IAKb,QAAQ,IAAI,MAAM;CAI1B"}
64 dev/env/node_modules/@streamparser/json/dist/cjs/utils/bufferedString.js generated vendored Executable file
@@ -0,0 +1,64 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.BufferedString = exports.NonBufferedString = void 0;
class NonBufferedString {
    constructor() {
        this.decoder = new TextDecoder("utf-8");
        this.strings = [];
        this.byteLength = 0;
    }
    appendChar(char) {
        this.strings.push(String.fromCharCode(char));
        this.byteLength += 1;
    }
    appendBuf(buf, start = 0, end = buf.length) {
        this.strings.push(this.decoder.decode(buf.subarray(start, end)));
        this.byteLength += end - start;
    }
    reset() {
        this.strings = [];
        this.byteLength = 0;
    }
    toString() {
        return this.strings.join("");
    }
}
exports.NonBufferedString = NonBufferedString;
class BufferedString {
    constructor(bufferSize) {
        this.decoder = new TextDecoder("utf-8");
        this.bufferOffset = 0;
        this.string = "";
        this.byteLength = 0;
        this.buffer = new Uint8Array(bufferSize);
    }
    appendChar(char) {
        if (this.bufferOffset >= this.buffer.length)
            this.flushStringBuffer();
        this.buffer[this.bufferOffset++] = char;
        this.byteLength += 1;
    }
    appendBuf(buf, start = 0, end = buf.length) {
        const size = end - start;
        if (this.bufferOffset + size > this.buffer.length)
            this.flushStringBuffer();
        this.buffer.set(buf.subarray(start, end), this.bufferOffset);
        this.bufferOffset += size;
        this.byteLength += size;
    }
    flushStringBuffer() {
        this.string += this.decoder.decode(this.buffer.subarray(0, this.bufferOffset));
        this.bufferOffset = 0;
    }
    reset() {
        this.string = "";
        this.bufferOffset = 0;
        this.byteLength = 0;
    }
    toString() {
        this.flushStringBuffer();
        return this.string;
    }
}
exports.BufferedString = BufferedString;
//# sourceMappingURL=bufferedString.js.map
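`BufferedString` above avoids per-character string concatenation by accumulating bytes in a fixed-size `Uint8Array` and decoding them in batches. The following is a minimal standalone sketch of that buffer-and-flush strategy (not the library's API, just the idea, with a deliberately tiny 4-byte buffer so the flush path is exercised):

```javascript
// Sketch of the buffered-decode strategy: bytes accumulate in a fixed
// Uint8Array; when the buffer is full, the filled portion is decoded
// and appended to the output string, then the offset is reset.
const decoder = new TextDecoder("utf-8");
const buffer = new Uint8Array(4); // tiny on purpose, to force a mid-stream flush
let offset = 0;
let out = "";

function flush() {
  out += decoder.decode(buffer.subarray(0, offset));
  offset = 0;
}

function append(byte) {
  if (offset >= buffer.length) flush(); // buffer full: decode what we have
  buffer[offset++] = byte;
}

for (const b of [104, 101, 108, 108, 111]) append(b); // "hello" as ASCII bytes
flush(); // decode the remaining tail
console.log(out); // hello
```

Note that batching the `TextDecoder.decode` calls this way assumes flushes land on character boundaries; the real class shares that simplification for single-byte input.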
1 dev/env/node_modules/@streamparser/json/dist/cjs/utils/bufferedString.js.map generated vendored Executable file
@@ -0,0 +1 @@
{"version":3,"file":"bufferedString.js","sourceRoot":"","sources":["../../../src/utils/bufferedString.ts"],"names":[],"mappings":";;;AAQA,MAAa,iBAAiB;IAA9B;QACU,YAAO,GAAG,IAAI,WAAW,CAAC,OAAO,CAAC,CAAC;QACnC,YAAO,GAAa,EAAE,CAAC;QACxB,eAAU,GAAG,CAAC,CAAC;IAoBxB,CAAC;IAlBQ,UAAU,CAAC,IAAY;QAC5B,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,MAAM,CAAC,YAAY,CAAC,IAAI,CAAC,CAAC,CAAC;QAC7C,IAAI,CAAC,UAAU,IAAI,CAAC,CAAC;IACvB,CAAC;IAEM,SAAS,CAAC,GAAe,EAAE,KAAK,GAAG,CAAC,EAAE,MAAc,GAAG,CAAC,MAAM;QACnE,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,MAAM,CAAC,GAAG,CAAC,QAAQ,CAAC,KAAK,EAAE,GAAG,CAAC,CAAC,CAAC,CAAC;QACjE,IAAI,CAAC,UAAU,IAAI,GAAG,GAAG,KAAK,CAAC;IACjC,CAAC;IAEM,KAAK;QACV,IAAI,CAAC,OAAO,GAAG,EAAE,CAAC;QAClB,IAAI,CAAC,UAAU,GAAG,CAAC,CAAC;IACtB,CAAC;IAEM,QAAQ;QACb,OAAO,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC;IAC/B,CAAC;CACF;AAvBD,8CAuBC;AAED,MAAa,cAAc;IAOzB,YAAmB,UAAkB;QAN7B,YAAO,GAAG,IAAI,WAAW,CAAC,OAAO,CAAC,CAAC;QAEnC,iBAAY,GAAG,CAAC,CAAC;QACjB,WAAM,GAAG,EAAE,CAAC;QACb,eAAU,GAAG,CAAC,CAAC;QAGpB,IAAI,CAAC,MAAM,GAAG,IAAI,UAAU,CAAC,UAAU,CAAC,CAAC;IAC3C,CAAC;IAEM,UAAU,CAAC,IAAY;QAC5B,IAAI,IAAI,CAAC,YAAY,IAAI,IAAI,CAAC,MAAM,CAAC,MAAM;YAAE,IAAI,CAAC,iBAAiB,EAAE,CAAC;QACtE,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,YAAY,EAAE,CAAC,GAAG,IAAI,CAAC;QACxC,IAAI,CAAC,UAAU,IAAI,CAAC,CAAC;IACvB,CAAC;IAEM,SAAS,CAAC,GAAe,EAAE,KAAK,GAAG,CAAC,EAAE,MAAc,GAAG,CAAC,MAAM;QACnE,MAAM,IAAI,GAAG,GAAG,GAAG,KAAK,CAAC;QACzB,IAAI,IAAI,CAAC,YAAY,GAAG,IAAI,GAAG,IAAI,CAAC,MAAM,CAAC,MAAM;YAAE,IAAI,CAAC,iBAAiB,EAAE,CAAC;QAC5E,IAAI,CAAC,MAAM,CAAC,GAAG,CAAC,GAAG,CAAC,QAAQ,CAAC,KAAK,EAAE,GAAG,CAAC,EAAE,IAAI,CAAC,YAAY,CAAC,CAAC;QAC7D,IAAI,CAAC,YAAY,IAAI,IAAI,CAAC;QAC1B,IAAI,CAAC,UAAU,IAAI,IAAI,CAAC;IAC1B,CAAC;IAEO,iBAAiB;QACvB,IAAI,CAAC,MAAM,IAAI,IAAI,CAAC,OAAO,CAAC,MAAM,CAChC,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAC,EAAE,IAAI,CAAC,YAAY,CAAC,CAC3C,CAAC;QACF,IAAI,CAAC,YAAY,GAAG,CAAC,CAAC;IACxB,CAAC;IAEM,KAAK;QACV,IAAI,CAAC,MAAM,GAAG,EAAE,CAAC;QACjB,IAAI,CAAC,YAAY,GAAG,CAAC,CAAC;QACtB,IAAI,CAAC,UAAU,GAAG,CAAC,CAAC;IACtB,CAAC;IACM,QAAQ;QACb,IAAI,CAAC,iBAAiB,EAAE,CAAC;QACzB,OAAO,IAAI,CAAC,MAAM,CAAC;IACrB,CAAC;CACF;AAzCD,wCAyCC"}
8 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/jsonTypes.d.ts generated vendored Executable file
@@ -0,0 +1,8 @@
export type JsonPrimitive = string | number | boolean | null;
export type JsonKey = string | number | undefined;
export type JsonObject = {
    [key: string]: JsonPrimitive | JsonStruct;
};
export type JsonArray = (JsonPrimitive | JsonStruct)[];
export type JsonStruct = JsonObject | JsonArray;
//# sourceMappingURL=jsonTypes.d.ts.map
1 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/jsonTypes.d.ts.map generated vendored Executable file
@@ -0,0 +1 @@
{"version":3,"file":"jsonTypes.d.ts","sourceRoot":"","sources":["../../../../src/utils/types/jsonTypes.ts"],"names":[],"mappings":"AAAA,MAAM,MAAM,aAAa,GAAG,MAAM,GAAG,MAAM,GAAG,OAAO,GAAG,IAAI,CAAC;AAC7D,MAAM,MAAM,OAAO,GAAG,MAAM,GAAG,MAAM,GAAG,SAAS,CAAC;AAClD,MAAM,MAAM,UAAU,GAAG;IAAE,CAAC,GAAG,EAAE,MAAM,GAAG,aAAa,GAAG,UAAU,CAAA;CAAE,CAAC;AACvE,MAAM,MAAM,SAAS,GAAG,CAAC,aAAa,GAAG,UAAU,CAAC,EAAE,CAAC;AACvD,MAAM,MAAM,UAAU,GAAG,UAAU,GAAG,SAAS,CAAC"}
3 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/jsonTypes.js generated vendored Executable file
@@ -0,0 +1,3 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
//# sourceMappingURL=jsonTypes.js.map
1 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/jsonTypes.js.map generated vendored Executable file
@@ -0,0 +1 @@
{"version":3,"file":"jsonTypes.js","sourceRoot":"","sources":["../../../../src/utils/types/jsonTypes.ts"],"names":[],"mappings":""}
28 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/parsedElementInfo.d.ts generated vendored Executable file
@@ -0,0 +1,28 @@
import type { StackElement } from "./stackElement.js";
import type { JsonPrimitive, JsonKey, JsonObject, JsonArray, JsonStruct } from "./jsonTypes.js";
export interface ParsedElementInfo {
    value?: JsonPrimitive | JsonStruct;
    parent?: JsonStruct;
    key?: JsonKey;
    stack: StackElement[];
    partial?: boolean;
}
export interface ParsedArrayElement extends ParsedElementInfo {
    value: JsonPrimitive | JsonStruct;
    parent: JsonArray;
    key: number;
    stack: StackElement[];
}
export interface ParsedObjectProperty extends ParsedElementInfo {
    value: JsonPrimitive | JsonStruct;
    parent: JsonObject;
    key: string;
    stack: StackElement[];
}
export interface ParsedTopLevelElement extends ParsedElementInfo {
    value: JsonPrimitive | JsonStruct;
    parent: undefined;
    key: undefined;
    stack: [];
}
//# sourceMappingURL=parsedElementInfo.d.ts.map
1 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/parsedElementInfo.d.ts.map generated vendored Executable file
@@ -0,0 +1 @@
{"version":3,"file":"parsedElementInfo.d.ts","sourceRoot":"","sources":["../../../../src/utils/types/parsedElementInfo.ts"],"names":[],"mappings":"AAAA,OAAO,KAAK,EAAE,YAAY,EAAE,MAAM,mBAAmB,CAAC;AACtD,OAAO,KAAK,EACV,aAAa,EACb,OAAO,EACP,UAAU,EACV,SAAS,EACT,UAAU,EACX,MAAM,gBAAgB,CAAC;AAExB,MAAM,WAAW,iBAAiB;IAChC,KAAK,CAAC,EAAE,aAAa,GAAG,UAAU,CAAC;IACnC,MAAM,CAAC,EAAE,UAAU,CAAC;IACpB,GAAG,CAAC,EAAE,OAAO,CAAC;IACd,KAAK,EAAE,YAAY,EAAE,CAAC;IACtB,OAAO,CAAC,EAAE,OAAO,CAAC;CACnB;AAED,MAAM,WAAW,kBAAmB,SAAQ,iBAAiB;IAC3D,KAAK,EAAE,aAAa,GAAG,UAAU,CAAC;IAClC,MAAM,EAAE,SAAS,CAAC;IAClB,GAAG,EAAE,MAAM,CAAC;IACZ,KAAK,EAAE,YAAY,EAAE,CAAC;CACvB;AAED,MAAM,WAAW,oBAAqB,SAAQ,iBAAiB;IAC7D,KAAK,EAAE,aAAa,GAAG,UAAU,CAAC;IAClC,MAAM,EAAE,UAAU,CAAC;IACnB,GAAG,EAAE,MAAM,CAAC;IACZ,KAAK,EAAE,YAAY,EAAE,CAAC;CACvB;AAED,MAAM,WAAW,qBAAsB,SAAQ,iBAAiB;IAC9D,KAAK,EAAE,aAAa,GAAG,UAAU,CAAC;IAClC,MAAM,EAAE,SAAS,CAAC;IAClB,GAAG,EAAE,SAAS,CAAC;IACf,KAAK,EAAE,EAAE,CAAC;CACX"}
3 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/parsedElementInfo.js generated vendored Executable file
@@ -0,0 +1,3 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
//# sourceMappingURL=parsedElementInfo.js.map
1 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/parsedElementInfo.js.map generated vendored Executable file
@@ -0,0 +1 @@
{"version":3,"file":"parsedElementInfo.js","sourceRoot":"","sources":["../../../../src/utils/types/parsedElementInfo.ts"],"names":[],"mappings":""}
57 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/parsedTokenInfo.d.ts generated vendored Executable file
@@ -0,0 +1,57 @@
import TokenType from "./tokenType.js";
import type { JsonPrimitive } from "./jsonTypes.js";
export interface ParsedTokenInfo {
    token: TokenType;
    value: JsonPrimitive;
    offset: number;
    partial?: boolean;
}
export interface ParsedLeftBraceTokenInfo extends ParsedTokenInfo {
    token: TokenType.LEFT_BRACE;
    value: "{";
}
export interface ParsedRightBraceTokenInfo extends ParsedTokenInfo {
    token: TokenType.RIGHT_BRACE;
    value: "}";
}
export interface ParsedLeftBracketTokenInfo extends ParsedTokenInfo {
    token: TokenType.LEFT_BRACKET;
    value: "[";
}
export interface ParsedRighBracketTokenInfo extends ParsedTokenInfo {
    token: TokenType.RIGHT_BRACKET;
    value: "]";
}
export interface ParsedColonTokenInfo extends ParsedTokenInfo {
    token: TokenType.COLON;
    value: ":";
}
export interface ParsedCommaTokenInfo extends ParsedTokenInfo {
    token: TokenType.COMMA;
    value: ",";
}
export interface ParsedTrueTokenInfo extends ParsedTokenInfo {
    token: TokenType.TRUE;
    value: true;
}
export interface ParsedFalseTokenInfo extends ParsedTokenInfo {
    token: TokenType.FALSE;
    value: false;
}
export interface ParsedNullTokenInfo extends ParsedTokenInfo {
    token: TokenType.NULL;
    value: null;
}
export interface ParsedStringTokenInfo extends ParsedTokenInfo {
    token: TokenType.STRING;
    value: string;
}
export interface ParsedNumberTokenInfo extends ParsedTokenInfo {
    token: TokenType.NUMBER;
    value: number;
}
export interface ParsedSeparatorTokenInfo extends ParsedTokenInfo {
    token: TokenType.SEPARATOR;
    value: string;
}
//# sourceMappingURL=parsedTokenInfo.d.ts.map
1 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/parsedTokenInfo.d.ts.map generated vendored Executable file
@@ -0,0 +1 @@
{"version":3,"file":"parsedTokenInfo.d.ts","sourceRoot":"","sources":["../../../../src/utils/types/parsedTokenInfo.ts"],"names":[],"mappings":"AAAA,OAAO,SAAS,MAAM,gBAAgB,CAAC;AACvC,OAAO,KAAK,EAAE,aAAa,EAAE,MAAM,gBAAgB,CAAC;AAEpD,MAAM,WAAW,eAAe;IAC9B,KAAK,EAAE,SAAS,CAAC;IACjB,KAAK,EAAE,aAAa,CAAC;IACrB,MAAM,EAAE,MAAM,CAAC;IACf,OAAO,CAAC,EAAE,OAAO,CAAC;CACnB;AAED,MAAM,WAAW,wBAAyB,SAAQ,eAAe;IAC/D,KAAK,EAAE,SAAS,CAAC,UAAU,CAAC;IAC5B,KAAK,EAAE,GAAG,CAAC;CACZ;AACD,MAAM,WAAW,yBAA0B,SAAQ,eAAe;IAChE,KAAK,EAAE,SAAS,CAAC,WAAW,CAAC;IAC7B,KAAK,EAAE,GAAG,CAAC;CACZ;AACD,MAAM,WAAW,0BAA2B,SAAQ,eAAe;IACjE,KAAK,EAAE,SAAS,CAAC,YAAY,CAAC;IAC9B,KAAK,EAAE,GAAG,CAAC;CACZ;AACD,MAAM,WAAW,0BAA2B,SAAQ,eAAe;IACjE,KAAK,EAAE,SAAS,CAAC,aAAa,CAAC;IAC/B,KAAK,EAAE,GAAG,CAAC;CACZ;AACD,MAAM,WAAW,oBAAqB,SAAQ,eAAe;IAC3D,KAAK,EAAE,SAAS,CAAC,KAAK,CAAC;IACvB,KAAK,EAAE,GAAG,CAAC;CACZ;AACD,MAAM,WAAW,oBAAqB,SAAQ,eAAe;IAC3D,KAAK,EAAE,SAAS,CAAC,KAAK,CAAC;IACvB,KAAK,EAAE,GAAG,CAAC;CACZ;AACD,MAAM,WAAW,mBAAoB,SAAQ,eAAe;IAC1D,KAAK,EAAE,SAAS,CAAC,IAAI,CAAC;IACtB,KAAK,EAAE,IAAI,CAAC;CACb;AACD,MAAM,WAAW,oBAAqB,SAAQ,eAAe;IAC3D,KAAK,EAAE,SAAS,CAAC,KAAK,CAAC;IACvB,KAAK,EAAE,KAAK,CAAC;CACd;AACD,MAAM,WAAW,mBAAoB,SAAQ,eAAe;IAC1D,KAAK,EAAE,SAAS,CAAC,IAAI,CAAC;IACtB,KAAK,EAAE,IAAI,CAAC;CACb;AACD,MAAM,WAAW,qBAAsB,SAAQ,eAAe;IAC5D,KAAK,EAAE,SAAS,CAAC,MAAM,CAAC;IACxB,KAAK,EAAE,MAAM,CAAC;CACf;AACD,MAAM,WAAW,qBAAsB,SAAQ,eAAe;IAC5D,KAAK,EAAE,SAAS,CAAC,MAAM,CAAC;IACxB,KAAK,EAAE,MAAM,CAAC;CACf;AACD,MAAM,WAAW,wBAAyB,SAAQ,eAAe;IAC/D,KAAK,EAAE,SAAS,CAAC,SAAS,CAAC;IAC3B,KAAK,EAAE,MAAM,CAAC;CACf"}
3 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/parsedTokenInfo.js generated vendored Executable file
@@ -0,0 +1,3 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
//# sourceMappingURL=parsedTokenInfo.js.map
1 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/parsedTokenInfo.js.map generated vendored Executable file
@@ -0,0 +1 @@
{"version":3,"file":"parsedTokenInfo.js","sourceRoot":"","sources":["../../../../src/utils/types/parsedTokenInfo.ts"],"names":[],"mappings":""}
12 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/stackElement.d.ts generated vendored Executable file
@@ -0,0 +1,12 @@
import type { JsonKey, JsonStruct } from "./jsonTypes.js";
export declare const enum TokenParserMode {
    OBJECT = 0,
    ARRAY = 1
}
export interface StackElement {
    key: JsonKey;
    value: JsonStruct;
    mode?: TokenParserMode;
    emit: boolean;
}
//# sourceMappingURL=stackElement.d.ts.map
1 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/stackElement.d.ts.map generated vendored Executable file
@@ -0,0 +1 @@
{"version":3,"file":"stackElement.d.ts","sourceRoot":"","sources":["../../../../src/utils/types/stackElement.ts"],"names":[],"mappings":"AAAA,OAAO,KAAK,EAAE,OAAO,EAAE,UAAU,EAAE,MAAM,gBAAgB,CAAC;AAE1D,0BAAkB,eAAe;IAC/B,MAAM,IAAA;IACN,KAAK,IAAA;CACN;AAED,MAAM,WAAW,YAAY;IAC3B,GAAG,EAAE,OAAO,CAAC;IACb,KAAK,EAAE,UAAU,CAAC;IAClB,IAAI,CAAC,EAAE,eAAe,CAAC;IACvB,IAAI,EAAE,OAAO,CAAC;CACf"}
9 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/stackElement.js generated vendored Executable file
@@ -0,0 +1,9 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.TokenParserMode = void 0;
var TokenParserMode;
(function (TokenParserMode) {
    TokenParserMode[TokenParserMode["OBJECT"] = 0] = "OBJECT";
    TokenParserMode[TokenParserMode["ARRAY"] = 1] = "ARRAY";
})(TokenParserMode || (exports.TokenParserMode = TokenParserMode = {}));
//# sourceMappingURL=stackElement.js.map
1 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/stackElement.js.map generated vendored Executable file
@@ -0,0 +1 @@
{"version":3,"file":"stackElement.js","sourceRoot":"","sources":["../../../../src/utils/types/stackElement.ts"],"names":[],"mappings":";;;AAEA,IAAkB,eAGjB;AAHD,WAAkB,eAAe;IAC/B,yDAAM,CAAA;IACN,uDAAK,CAAA;AACP,CAAC,EAHiB,eAAe,+BAAf,eAAe,QAGhC"}
16 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/tokenType.d.ts generated vendored Executable file
@@ -0,0 +1,16 @@
declare enum TokenType {
    LEFT_BRACE = 0,
    RIGHT_BRACE = 1,
    LEFT_BRACKET = 2,
    RIGHT_BRACKET = 3,
    COLON = 4,
    COMMA = 5,
    TRUE = 6,
    FALSE = 7,
    NULL = 8,
    STRING = 9,
    NUMBER = 10,
    SEPARATOR = 11
}
export default TokenType;
//# sourceMappingURL=tokenType.d.ts.map
1 dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/tokenType.d.ts.map generated vendored Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenType.d.ts","sourceRoot":"","sources":["../../../../src/utils/types/tokenType.ts"],"names":[],"mappings":"AAAA,aAAK,SAAS;IACZ,UAAU,IAAA;IACV,WAAW,IAAA;IACX,YAAY,IAAA;IACZ,aAAa,IAAA;IACb,KAAK,IAAA;IACL,KAAK,IAAA;IACL,IAAI,IAAA;IACJ,KAAK,IAAA;IACL,IAAI,IAAA;IACJ,MAAM,IAAA;IACN,MAAM,KAAA;IACN,SAAS,KAAA;CACV;AAED,eAAe,SAAS,CAAC"}
|
||||
19
dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/tokenType.js
generated
vendored
Executable file
@@ -0,0 +1,19 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
var TokenType;
(function (TokenType) {
    TokenType[TokenType["LEFT_BRACE"] = 0] = "LEFT_BRACE";
    TokenType[TokenType["RIGHT_BRACE"] = 1] = "RIGHT_BRACE";
    TokenType[TokenType["LEFT_BRACKET"] = 2] = "LEFT_BRACKET";
    TokenType[TokenType["RIGHT_BRACKET"] = 3] = "RIGHT_BRACKET";
    TokenType[TokenType["COLON"] = 4] = "COLON";
    TokenType[TokenType["COMMA"] = 5] = "COMMA";
    TokenType[TokenType["TRUE"] = 6] = "TRUE";
    TokenType[TokenType["FALSE"] = 7] = "FALSE";
    TokenType[TokenType["NULL"] = 8] = "NULL";
    TokenType[TokenType["STRING"] = 9] = "STRING";
    TokenType[TokenType["NUMBER"] = 10] = "NUMBER";
    TokenType[TokenType["SEPARATOR"] = 11] = "SEPARATOR";
})(TokenType || (TokenType = {}));
exports.default = TokenType;
//# sourceMappingURL=tokenType.js.map
1
dev/env/node_modules/@streamparser/json/dist/cjs/utils/types/tokenType.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenType.js","sourceRoot":"","sources":["../../../../src/utils/types/tokenType.ts"],"names":[],"mappings":";;AAAA,IAAK,SAaJ;AAbD,WAAK,SAAS;IACZ,qDAAU,CAAA;IACV,uDAAW,CAAA;IACX,yDAAY,CAAA;IACZ,2DAAa,CAAA;IACb,2CAAK,CAAA;IACL,2CAAK,CAAA;IACL,yCAAI,CAAA;IACJ,2CAAK,CAAA;IACL,yCAAI,CAAA;IACJ,6CAAM,CAAA;IACN,8CAAM,CAAA;IACN,oDAAS,CAAA;AACX,CAAC,EAbI,SAAS,KAAT,SAAS,QAab;AAED,kBAAe,SAAS,CAAC"}
106
dev/env/node_modules/@streamparser/json/dist/cjs/utils/utf-8.d.ts
generated
vendored
Executable file
@@ -0,0 +1,106 @@
export declare const enum charset {
    BACKSPACE = 8,
    FORM_FEED = 12,
    NEWLINE = 10,
    CARRIAGE_RETURN = 13,
    TAB = 9,
    SPACE = 32,
    EXCLAMATION_MARK = 33,
    QUOTATION_MARK = 34,
    NUMBER_SIGN = 35,
    DOLLAR_SIGN = 36,
    PERCENT_SIGN = 37,
    AMPERSAND = 38,
    APOSTROPHE = 39,
    LEFT_PARENTHESIS = 40,
    RIGHT_PARENTHESIS = 41,
    ASTERISK = 42,
    PLUS_SIGN = 43,
    COMMA = 44,
    HYPHEN_MINUS = 45,
    FULL_STOP = 46,
    SOLIDUS = 47,
    DIGIT_ZERO = 48,
    DIGIT_ONE = 49,
    DIGIT_TWO = 50,
    DIGIT_THREE = 51,
    DIGIT_FOUR = 52,
    DIGIT_FIVE = 53,
    DIGIT_SIX = 54,
    DIGIT_SEVEN = 55,
    DIGIT_EIGHT = 56,
    DIGIT_NINE = 57,
    COLON = 58,
    SEMICOLON = 59,
    LESS_THAN_SIGN = 60,
    EQUALS_SIGN = 61,
    GREATER_THAN_SIGN = 62,
    QUESTION_MARK = 63,
    COMMERCIAL_AT = 64,
    LATIN_CAPITAL_LETTER_A = 65,
    LATIN_CAPITAL_LETTER_B = 66,
    LATIN_CAPITAL_LETTER_C = 67,
    LATIN_CAPITAL_LETTER_D = 68,
    LATIN_CAPITAL_LETTER_E = 69,
    LATIN_CAPITAL_LETTER_F = 70,
    LATIN_CAPITAL_LETTER_G = 71,
    LATIN_CAPITAL_LETTER_H = 72,
    LATIN_CAPITAL_LETTER_I = 73,
    LATIN_CAPITAL_LETTER_J = 74,
    LATIN_CAPITAL_LETTER_K = 75,
    LATIN_CAPITAL_LETTER_L = 76,
    LATIN_CAPITAL_LETTER_M = 77,
    LATIN_CAPITAL_LETTER_N = 78,
    LATIN_CAPITAL_LETTER_O = 79,
    LATIN_CAPITAL_LETTER_P = 80,
    LATIN_CAPITAL_LETTER_Q = 81,
    LATIN_CAPITAL_LETTER_R = 82,
    LATIN_CAPITAL_LETTER_S = 83,
    LATIN_CAPITAL_LETTER_T = 84,
    LATIN_CAPITAL_LETTER_U = 85,
    LATIN_CAPITAL_LETTER_V = 86,
    LATIN_CAPITAL_LETTER_W = 87,
    LATIN_CAPITAL_LETTER_X = 88,
    LATIN_CAPITAL_LETTER_Y = 89,
    LATIN_CAPITAL_LETTER_Z = 90,
    LEFT_SQUARE_BRACKET = 91,
    REVERSE_SOLIDUS = 92,
    RIGHT_SQUARE_BRACKET = 93,
    CIRCUMFLEX_ACCENT = 94,
    LOW_LINE = 95,
    GRAVE_ACCENT = 96,
    LATIN_SMALL_LETTER_A = 97,
    LATIN_SMALL_LETTER_B = 98,
    LATIN_SMALL_LETTER_C = 99,
    LATIN_SMALL_LETTER_D = 100,
    LATIN_SMALL_LETTER_E = 101,
    LATIN_SMALL_LETTER_F = 102,
    LATIN_SMALL_LETTER_G = 103,
    LATIN_SMALL_LETTER_H = 104,
    LATIN_SMALL_LETTER_I = 105,
    LATIN_SMALL_LETTER_J = 106,
    LATIN_SMALL_LETTER_K = 107,
    LATIN_SMALL_LETTER_L = 108,
    LATIN_SMALL_LETTER_M = 109,
    LATIN_SMALL_LETTER_N = 110,
    LATIN_SMALL_LETTER_O = 111,
    LATIN_SMALL_LETTER_P = 112,
    LATIN_SMALL_LETTER_Q = 113,
    LATIN_SMALL_LETTER_R = 114,
    LATIN_SMALL_LETTER_S = 115,
    LATIN_SMALL_LETTER_T = 116,
    LATIN_SMALL_LETTER_U = 117,
    LATIN_SMALL_LETTER_V = 118,
    LATIN_SMALL_LETTER_W = 119,
    LATIN_SMALL_LETTER_X = 120,
    LATIN_SMALL_LETTER_Y = 121,
    LATIN_SMALL_LETTER_Z = 122,
    LEFT_CURLY_BRACKET = 123,
    VERTICAL_LINE = 124,
    RIGHT_CURLY_BRACKET = 125,
    TILDE = 126
}
export declare const escapedSequences: {
    [key: number]: number;
};
//# sourceMappingURL=utf-8.d.ts.map
1
dev/env/node_modules/@streamparser/json/dist/cjs/utils/utf-8.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"utf-8.d.ts","sourceRoot":"","sources":["../../../src/utils/utf-8.ts"],"names":[],"mappings":"AAAA,0BAAkB,OAAO;IACvB,SAAS,IAAM;IACf,SAAS,KAAM;IACf,OAAO,KAAM;IACb,eAAe,KAAM;IACrB,GAAG,IAAM;IACT,KAAK,KAAO;IACZ,gBAAgB,KAAO;IACvB,cAAc,KAAO;IACrB,WAAW,KAAO;IAClB,WAAW,KAAO;IAClB,YAAY,KAAO;IACnB,SAAS,KAAO;IAChB,UAAU,KAAO;IACjB,gBAAgB,KAAO;IACvB,iBAAiB,KAAO;IACxB,QAAQ,KAAO;IACf,SAAS,KAAO;IAChB,KAAK,KAAO;IACZ,YAAY,KAAO;IACnB,SAAS,KAAO;IAChB,OAAO,KAAO;IACd,UAAU,KAAO;IACjB,SAAS,KAAO;IAChB,SAAS,KAAO;IAChB,WAAW,KAAO;IAClB,UAAU,KAAO;IACjB,UAAU,KAAO;IACjB,SAAS,KAAO;IAChB,WAAW,KAAO;IAClB,WAAW,KAAO;IAClB,UAAU,KAAO;IACjB,KAAK,KAAO;IACZ,SAAS,KAAO;IAChB,cAAc,KAAO;IACrB,WAAW,KAAO;IAClB,iBAAiB,KAAO;IACxB,aAAa,KAAO;IACpB,aAAa,KAAO;IACpB,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,mBAAmB,KAAO;IAC1B,eAAe,KAAO;IACtB,oBAAoB,KAAO;IAC3B,iBAAiB,KAAO;IACxB,QAAQ,KAAO;IACf,YAAY,KAAO;IACnB,oBAAoB,KAAO;IAC3B,oBAAoB,KAAO;IAC3B,oBAAoB,KAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,kBAAkB,MAAO;IACzB,aAAa,MAAO;IACpB,mBAAmB,MAAO;IAC1B,KAAK,MAAO;CACb;AAED,eAAO,MAAM,gBAAgB,EAAE;IAAE,CAAC,GAAG,EAAE,MAAM,GAAG,MAAM,CAAA;CASrD,CAAC"}
117
dev/env/node_modules/@streamparser/json/dist/cjs/utils/utf-8.js
generated
vendored
Executable file
@@ -0,0 +1,117 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.escapedSequences = exports.charset = void 0;
var charset;
(function (charset) {
    charset[charset["BACKSPACE"] = 8] = "BACKSPACE";
    charset[charset["FORM_FEED"] = 12] = "FORM_FEED";
    charset[charset["NEWLINE"] = 10] = "NEWLINE";
    charset[charset["CARRIAGE_RETURN"] = 13] = "CARRIAGE_RETURN";
    charset[charset["TAB"] = 9] = "TAB";
    charset[charset["SPACE"] = 32] = "SPACE";
    charset[charset["EXCLAMATION_MARK"] = 33] = "EXCLAMATION_MARK";
    charset[charset["QUOTATION_MARK"] = 34] = "QUOTATION_MARK";
    charset[charset["NUMBER_SIGN"] = 35] = "NUMBER_SIGN";
    charset[charset["DOLLAR_SIGN"] = 36] = "DOLLAR_SIGN";
    charset[charset["PERCENT_SIGN"] = 37] = "PERCENT_SIGN";
    charset[charset["AMPERSAND"] = 38] = "AMPERSAND";
    charset[charset["APOSTROPHE"] = 39] = "APOSTROPHE";
    charset[charset["LEFT_PARENTHESIS"] = 40] = "LEFT_PARENTHESIS";
    charset[charset["RIGHT_PARENTHESIS"] = 41] = "RIGHT_PARENTHESIS";
    charset[charset["ASTERISK"] = 42] = "ASTERISK";
    charset[charset["PLUS_SIGN"] = 43] = "PLUS_SIGN";
    charset[charset["COMMA"] = 44] = "COMMA";
    charset[charset["HYPHEN_MINUS"] = 45] = "HYPHEN_MINUS";
    charset[charset["FULL_STOP"] = 46] = "FULL_STOP";
    charset[charset["SOLIDUS"] = 47] = "SOLIDUS";
    charset[charset["DIGIT_ZERO"] = 48] = "DIGIT_ZERO";
    charset[charset["DIGIT_ONE"] = 49] = "DIGIT_ONE";
    charset[charset["DIGIT_TWO"] = 50] = "DIGIT_TWO";
    charset[charset["DIGIT_THREE"] = 51] = "DIGIT_THREE";
    charset[charset["DIGIT_FOUR"] = 52] = "DIGIT_FOUR";
    charset[charset["DIGIT_FIVE"] = 53] = "DIGIT_FIVE";
    charset[charset["DIGIT_SIX"] = 54] = "DIGIT_SIX";
    charset[charset["DIGIT_SEVEN"] = 55] = "DIGIT_SEVEN";
    charset[charset["DIGIT_EIGHT"] = 56] = "DIGIT_EIGHT";
    charset[charset["DIGIT_NINE"] = 57] = "DIGIT_NINE";
    charset[charset["COLON"] = 58] = "COLON";
    charset[charset["SEMICOLON"] = 59] = "SEMICOLON";
    charset[charset["LESS_THAN_SIGN"] = 60] = "LESS_THAN_SIGN";
    charset[charset["EQUALS_SIGN"] = 61] = "EQUALS_SIGN";
    charset[charset["GREATER_THAN_SIGN"] = 62] = "GREATER_THAN_SIGN";
    charset[charset["QUESTION_MARK"] = 63] = "QUESTION_MARK";
    charset[charset["COMMERCIAL_AT"] = 64] = "COMMERCIAL_AT";
    charset[charset["LATIN_CAPITAL_LETTER_A"] = 65] = "LATIN_CAPITAL_LETTER_A";
    charset[charset["LATIN_CAPITAL_LETTER_B"] = 66] = "LATIN_CAPITAL_LETTER_B";
    charset[charset["LATIN_CAPITAL_LETTER_C"] = 67] = "LATIN_CAPITAL_LETTER_C";
    charset[charset["LATIN_CAPITAL_LETTER_D"] = 68] = "LATIN_CAPITAL_LETTER_D";
    charset[charset["LATIN_CAPITAL_LETTER_E"] = 69] = "LATIN_CAPITAL_LETTER_E";
    charset[charset["LATIN_CAPITAL_LETTER_F"] = 70] = "LATIN_CAPITAL_LETTER_F";
    charset[charset["LATIN_CAPITAL_LETTER_G"] = 71] = "LATIN_CAPITAL_LETTER_G";
    charset[charset["LATIN_CAPITAL_LETTER_H"] = 72] = "LATIN_CAPITAL_LETTER_H";
    charset[charset["LATIN_CAPITAL_LETTER_I"] = 73] = "LATIN_CAPITAL_LETTER_I";
    charset[charset["LATIN_CAPITAL_LETTER_J"] = 74] = "LATIN_CAPITAL_LETTER_J";
    charset[charset["LATIN_CAPITAL_LETTER_K"] = 75] = "LATIN_CAPITAL_LETTER_K";
    charset[charset["LATIN_CAPITAL_LETTER_L"] = 76] = "LATIN_CAPITAL_LETTER_L";
    charset[charset["LATIN_CAPITAL_LETTER_M"] = 77] = "LATIN_CAPITAL_LETTER_M";
    charset[charset["LATIN_CAPITAL_LETTER_N"] = 78] = "LATIN_CAPITAL_LETTER_N";
    charset[charset["LATIN_CAPITAL_LETTER_O"] = 79] = "LATIN_CAPITAL_LETTER_O";
    charset[charset["LATIN_CAPITAL_LETTER_P"] = 80] = "LATIN_CAPITAL_LETTER_P";
    charset[charset["LATIN_CAPITAL_LETTER_Q"] = 81] = "LATIN_CAPITAL_LETTER_Q";
    charset[charset["LATIN_CAPITAL_LETTER_R"] = 82] = "LATIN_CAPITAL_LETTER_R";
    charset[charset["LATIN_CAPITAL_LETTER_S"] = 83] = "LATIN_CAPITAL_LETTER_S";
    charset[charset["LATIN_CAPITAL_LETTER_T"] = 84] = "LATIN_CAPITAL_LETTER_T";
    charset[charset["LATIN_CAPITAL_LETTER_U"] = 85] = "LATIN_CAPITAL_LETTER_U";
    charset[charset["LATIN_CAPITAL_LETTER_V"] = 86] = "LATIN_CAPITAL_LETTER_V";
    charset[charset["LATIN_CAPITAL_LETTER_W"] = 87] = "LATIN_CAPITAL_LETTER_W";
    charset[charset["LATIN_CAPITAL_LETTER_X"] = 88] = "LATIN_CAPITAL_LETTER_X";
    charset[charset["LATIN_CAPITAL_LETTER_Y"] = 89] = "LATIN_CAPITAL_LETTER_Y";
    charset[charset["LATIN_CAPITAL_LETTER_Z"] = 90] = "LATIN_CAPITAL_LETTER_Z";
    charset[charset["LEFT_SQUARE_BRACKET"] = 91] = "LEFT_SQUARE_BRACKET";
    charset[charset["REVERSE_SOLIDUS"] = 92] = "REVERSE_SOLIDUS";
    charset[charset["RIGHT_SQUARE_BRACKET"] = 93] = "RIGHT_SQUARE_BRACKET";
    charset[charset["CIRCUMFLEX_ACCENT"] = 94] = "CIRCUMFLEX_ACCENT";
    charset[charset["LOW_LINE"] = 95] = "LOW_LINE";
    charset[charset["GRAVE_ACCENT"] = 96] = "GRAVE_ACCENT";
    charset[charset["LATIN_SMALL_LETTER_A"] = 97] = "LATIN_SMALL_LETTER_A";
    charset[charset["LATIN_SMALL_LETTER_B"] = 98] = "LATIN_SMALL_LETTER_B";
    charset[charset["LATIN_SMALL_LETTER_C"] = 99] = "LATIN_SMALL_LETTER_C";
    charset[charset["LATIN_SMALL_LETTER_D"] = 100] = "LATIN_SMALL_LETTER_D";
    charset[charset["LATIN_SMALL_LETTER_E"] = 101] = "LATIN_SMALL_LETTER_E";
    charset[charset["LATIN_SMALL_LETTER_F"] = 102] = "LATIN_SMALL_LETTER_F";
    charset[charset["LATIN_SMALL_LETTER_G"] = 103] = "LATIN_SMALL_LETTER_G";
    charset[charset["LATIN_SMALL_LETTER_H"] = 104] = "LATIN_SMALL_LETTER_H";
    charset[charset["LATIN_SMALL_LETTER_I"] = 105] = "LATIN_SMALL_LETTER_I";
    charset[charset["LATIN_SMALL_LETTER_J"] = 106] = "LATIN_SMALL_LETTER_J";
    charset[charset["LATIN_SMALL_LETTER_K"] = 107] = "LATIN_SMALL_LETTER_K";
    charset[charset["LATIN_SMALL_LETTER_L"] = 108] = "LATIN_SMALL_LETTER_L";
    charset[charset["LATIN_SMALL_LETTER_M"] = 109] = "LATIN_SMALL_LETTER_M";
    charset[charset["LATIN_SMALL_LETTER_N"] = 110] = "LATIN_SMALL_LETTER_N";
    charset[charset["LATIN_SMALL_LETTER_O"] = 111] = "LATIN_SMALL_LETTER_O";
    charset[charset["LATIN_SMALL_LETTER_P"] = 112] = "LATIN_SMALL_LETTER_P";
    charset[charset["LATIN_SMALL_LETTER_Q"] = 113] = "LATIN_SMALL_LETTER_Q";
    charset[charset["LATIN_SMALL_LETTER_R"] = 114] = "LATIN_SMALL_LETTER_R";
    charset[charset["LATIN_SMALL_LETTER_S"] = 115] = "LATIN_SMALL_LETTER_S";
    charset[charset["LATIN_SMALL_LETTER_T"] = 116] = "LATIN_SMALL_LETTER_T";
    charset[charset["LATIN_SMALL_LETTER_U"] = 117] = "LATIN_SMALL_LETTER_U";
    charset[charset["LATIN_SMALL_LETTER_V"] = 118] = "LATIN_SMALL_LETTER_V";
    charset[charset["LATIN_SMALL_LETTER_W"] = 119] = "LATIN_SMALL_LETTER_W";
    charset[charset["LATIN_SMALL_LETTER_X"] = 120] = "LATIN_SMALL_LETTER_X";
    charset[charset["LATIN_SMALL_LETTER_Y"] = 121] = "LATIN_SMALL_LETTER_Y";
    charset[charset["LATIN_SMALL_LETTER_Z"] = 122] = "LATIN_SMALL_LETTER_Z";
    charset[charset["LEFT_CURLY_BRACKET"] = 123] = "LEFT_CURLY_BRACKET";
    charset[charset["VERTICAL_LINE"] = 124] = "VERTICAL_LINE";
    charset[charset["RIGHT_CURLY_BRACKET"] = 125] = "RIGHT_CURLY_BRACKET";
    charset[charset["TILDE"] = 126] = "TILDE";
})(charset || (exports.charset = charset = {}));
exports.escapedSequences = {
    [34 /* charset.QUOTATION_MARK */]: 34 /* charset.QUOTATION_MARK */,
    [92 /* charset.REVERSE_SOLIDUS */]: 92 /* charset.REVERSE_SOLIDUS */,
    [47 /* charset.SOLIDUS */]: 47 /* charset.SOLIDUS */,
    [98 /* charset.LATIN_SMALL_LETTER_B */]: 8 /* charset.BACKSPACE */,
    [102 /* charset.LATIN_SMALL_LETTER_F */]: 12 /* charset.FORM_FEED */,
    [110 /* charset.LATIN_SMALL_LETTER_N */]: 10 /* charset.NEWLINE */,
    [114 /* charset.LATIN_SMALL_LETTER_R */]: 13 /* charset.CARRIAGE_RETURN */,
    [116 /* charset.LATIN_SMALL_LETTER_T */]: 9 /* charset.TAB */,
};
//# sourceMappingURL=utf-8.js.map
1
dev/env/node_modules/@streamparser/json/dist/cjs/utils/utf-8.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"utf-8.js","sourceRoot":"","sources":["../../../src/utils/utf-8.ts"],"names":[],"mappings":";;;AAAA,IAAkB,OAqGjB;AArGD,WAAkB,OAAO;IACvB,+CAAe,CAAA;IACf,gDAAe,CAAA;IACf,4CAAa,CAAA;IACb,4DAAqB,CAAA;IACrB,mCAAS,CAAA;IACT,wCAAY,CAAA;IACZ,8DAAuB,CAAA;IACvB,0DAAqB,CAAA;IACrB,oDAAkB,CAAA;IAClB,oDAAkB,CAAA;IAClB,sDAAmB,CAAA;IACnB,gDAAgB,CAAA;IAChB,kDAAiB,CAAA;IACjB,8DAAuB,CAAA;IACvB,gEAAwB,CAAA;IACxB,8CAAe,CAAA;IACf,gDAAgB,CAAA;IAChB,wCAAY,CAAA;IACZ,sDAAmB,CAAA;IACnB,gDAAgB,CAAA;IAChB,4CAAc,CAAA;IACd,kDAAiB,CAAA;IACjB,gDAAgB,CAAA;IAChB,gDAAgB,CAAA;IAChB,oDAAkB,CAAA;IAClB,kDAAiB,CAAA;IACjB,kDAAiB,CAAA;IACjB,gDAAgB,CAAA;IAChB,oDAAkB,CAAA;IAClB,oDAAkB,CAAA;IAClB,kDAAiB,CAAA;IACjB,wCAAY,CAAA;IACZ,gDAAgB,CAAA;IAChB,0DAAqB,CAAA;IACrB,oDAAkB,CAAA;IAClB,gEAAwB,CAAA;IACxB,wDAAoB,CAAA;IACpB,wDAAoB,CAAA;IACpB,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,0EAA6B,CAAA;IAC7B,oEAA0B,CAAA;IAC1B,4DAAsB,CAAA;IACtB,sEAA2B,CAAA;IAC3B,gEAAwB,CAAA;IACxB,8CAAe,CAAA;IACf,sDAAmB,CAAA;IACnB,sEAA2B,CAAA;IAC3B,sEAA2B,CAAA;IAC3B,sEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,uEAA2B,CAAA;IAC3B,mEAAyB,CAAA;IACzB,yDAAoB,CAAA;IACpB,qEAA0B,CAAA;IAC1B,yCAAY,CAAA;AACd,CAAC,EArGiB,OAAO,uBAAP,OAAO,QAqGxB;AAEY,QAAA,gBAAgB,GAA8B;IACzD,iCAAwB,iCAAwB;IAChD,kCAAyB,kCAAyB;IAClD,0BAAiB,0BAAiB;IAClC,uCAA8B,2BAAmB;IACjD,wCAA8B,4BAAmB;IACjD,wCAA8B,0BAAiB;IAC/C,wCAA8B,kCAAyB;IACvD,wCAA8B,qBAAa;CAC5C,CAAC"}
377
dev/env/node_modules/@streamparser/json/dist/deno/README.md
generated
vendored
Executable file
@@ -0,0 +1,377 @@
# @streamparser/json

[![npm version][npm-version-badge]][npm-badge-url]
[![npm monthly downloads][npm-downloads-badge]][npm-badge-url]
[![Build Status][build-status-badge]][build-status-url]
[![Coverage Status][coverage-status-badge]][coverage-status-url]

Fast, dependency-free library to parse a JSON stream using utf-8 encoding in Node.js, Deno or any modern browser. Fully compliant with the JSON spec and `JSON.parse(...)`.
*tldr;*

```javascript
import { JSONParser } from '@streamparser/json';

const parser = new JSONParser();
parser.onValue = ({ value }) => { /* process data */ };

// Or passing the stream in several chunks
try {
  parser.write('{ "test": ["a"] }');
  // onValue will be called 3 times:
  // "a"
  // ["a"]
  // { test: ["a"] }
} catch (err) {
  console.log(err); // handle errors
}
```
## @streamparser/json ecosystem

There are multiple flavours of @streamparser:

* The **[@streamparser/json](https://www.npmjs.com/package/@streamparser/json)** package allows you to parse any JSON string or stream using pure JavaScript.
* The **[@streamparser/json-whatwg](https://www.npmjs.com/package/@streamparser/json-whatwg)** wraps `@streamparser/json` in a WHATWG TransformStream.
* The **[@streamparser/json-node](https://www.npmjs.com/package/@streamparser/json-node)** wraps `@streamparser/json` in a Node Transform stream.
## Dependencies / Polyfilling

@streamparser/json requires a few ES6 classes:

* [Uint8Array](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array)
* [TextEncoder](https://developer.mozilla.org/en-US/docs/Web/API/TextEncoder)
* [TextDecoder](https://developer.mozilla.org/en-US/docs/Web/API/TextDecoder)

If you are targeting browsers or systems in which these might be missing, you need to polyfill them.
## Components

### Tokenizer

A JSON-compliant tokenizer that parses a utf-8 stream into JSON tokens.

```javascript
import { Tokenizer } from '@streamparser/json';

const tokenizer = new Tokenizer(opts);
```

The available options are:

```javascript
{
  stringBufferSize: <number>, // set to 0 to disable buffering. Min valid value is 4.
  numberBufferSize: <number>, // set to 0 to disable buffering.
  separator: <string>, // separator between objects. For example `\n` for ND-JSON.
  emitPartialTokens: <boolean> // whether to emit tokens mid-parsing.
}
```

If the buffer sizes are set to anything other than zero, instead of using a string to append the data as it comes in, the data is buffered using a TypedArray. A reasonable size could be `64 * 1024` (64 KB).
#### Buffering

When parsing strings or numbers, the parser needs to gather the data in memory until the whole value is ready.

Strings are immutable in JavaScript, so every string operation creates a new string. The V8 engine, behind Node, Deno and most modern browsers, performs many different types of optimization. One of these optimizations is to over-allocate memory when it detects many string concatenations. This significantly increases memory consumption and can easily exhaust your memory when parsing JSON containing very large strings or numbers. For those cases, the parser can buffer the characters using a TypedArray. This requires encoding/decoding from/to the buffer into an actual string once the value is ready, using the `TextEncoder` and `TextDecoder` APIs. Unfortunately, these APIs create significant overhead when the strings are small, so they should be used only when strictly necessary.
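As a rough illustration of that trade-off (a sketch only, not the library's actual buffer implementation), a value can be accumulated as utf-8 bytes in a `Uint8Array` and decoded into a string once, when it is complete:

```javascript
// Illustrative sketch: accumulate utf-8 bytes in a TypedArray and decode once,
// instead of concatenating strings chunk by chunk. `bufferChunks` and its
// default size are hypothetical names for this example.
const encoder = new TextEncoder();
const decoder = new TextDecoder();

function bufferChunks(chunks, bufferSize = 64 * 1024) {
  const buffer = new Uint8Array(bufferSize);
  let offset = 0;
  for (const chunk of chunks) {
    const bytes = encoder.encode(chunk); // utf-8 bytes of this chunk
    buffer.set(bytes, offset);           // copy them into the shared buffer
    offset += bytes.length;
  }
  // Decode only the used portion of the buffer into an actual string.
  return decoder.decode(buffer.subarray(0, offset));
}

console.log(bufferChunks(['Hello', ' ', 'world!'])); // "Hello world!"
```

The single decode at the end avoids the intermediate strings that chunk-by-chunk concatenation would create.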
#### Properties & Methods

* **write(data: string|TypedArray|Buffer)** pushes data into the tokenizer.
* **end()** closes the tokenizer so it cannot be used anymore. Throws an error if the tokenizer was in the middle of parsing.
* **isEnded** read-only boolean property indicating whether the tokenizer has ended or is still accepting data.
* **parseNumber(numberStr)** method used internally to parse numbers. By default, it is equivalent to `Number(numberStr)`, but you can override it if you want some other behaviour.
* **onToken({ token: TokenType, value: any, offset: number })** no-op method that you should override to follow the tokenization process.
* **onError(err: Error)** no-op method that you can override to act on errors. If not set, the write method simply throws synchronously.
* **onEnd()** no-op method that you can override to act when the tokenizer has ended.

```javascript
// You can override the overridable methods by creating your own class extending Tokenizer
class MyTokenizer extends Tokenizer {
  parseNumber(numberStr) {
    const number = super.parseNumber(numberStr);
    // If the number is too large, just keep the string.
    return Number.isFinite(number) ? number : numberStr;
  }
  onToken({ token, value, offset }) {
    if (token === TokenTypes.NUMBER && typeof value === 'string') {
      super.onToken({ token: TokenTypes.STRING, value, offset });
    } else {
      super.onToken({ token, value, offset });
    }
  }
}

const myTokenizer = new MyTokenizer();

// or just overriding it
const tokenizer = new Tokenizer();
tokenizer.parseNumber = (numberStr) => { ... };
tokenizer.onToken = ({ token, value, offset }) => { ... };
```
### TokenParser

A token parser that processes JSON tokens as emitted by the `Tokenizer` and emits JSON values/objects.

```javascript
import { TokenParser } from '@streamparser/json';

const tokenParser = new TokenParser(opts);
```

The available options are:

```javascript
{
  paths: <string[]>,
  keepStack: <boolean>, // whether to keep all the properties in the stack
  separator: <string>, // separator between objects. For example `\n` for ND-JSON. If left empty or set to undefined, the token parser will end after parsing the first object. To parse multiple objects without any delimiter, just set it to the empty string `''`.
  emitPartialValues: <boolean>, // whether to emit values mid-parsing.
}
```
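For context, the ND-JSON case that `separator: '\n'` targets is the streaming counterpart of the naive non-streaming approach below (a sketch for comparison, not part of this library; `parseNdjson` is a hypothetical name):

```javascript
// Naive, non-streaming ND-JSON parsing: requires the whole payload in memory.
// A token parser configured with separator '\n' emits the same values
// incrementally instead, without holding the full payload.
function parseNdjson(payload) {
  return payload
    .split('\n')
    .filter((line) => line.trim() !== '') // skip empty/trailing lines
    .map((line) => JSON.parse(line));
}

console.log(parseNdjson('{"id":1}\n{"id":2}\n')); // [ { id: 1 }, { id: 2 } ]
```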
* paths: Array of paths to emit. Defaults to `undefined`, which emits everything. The paths are intended to support JSONPath, although at the time being only the root object selector (`$`) and subproperty selectors, including wildcards, are supported (`$.a`, `$.*`, `$.a.b`, `$.*.b`, etc.).
* keepStack: Whether to keep full objects on the stack even if they won't be emitted. Defaults to `true`. When set to `false`, properties are not preserved in parent objects whose ancestors will be emitted. This means that the parent object passed to the `onValue` function will be empty, which doesn't reflect the truth, but it's more memory-efficient.
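Which values a path selects can be pictured with a small, hypothetical matcher (an illustration only, not the library's internal code; `matchesPath` and `keyStack` are names invented for this sketch):

```javascript
// Hypothetical helper: checks whether a stack of keys such as ['a', 'b']
// is selected by a path like '$.a.b' or '$.*.b'.
function matchesPath(path, keyStack) {
  const segments = path.split('.').slice(1); // drop the leading '$'
  if (segments.length !== keyStack.length) return false;
  // '*' matches any key at that depth; otherwise the key must match exactly.
  return segments.every((seg, i) => seg === '*' || seg === String(keyStack[i]));
}

matchesPath('$', []);             // true: root object selector
matchesPath('$.a', ['a']);        // true
matchesPath('$.*', ['b']);        // true: wildcard
matchesPath('$.*.b', ['a', 'b']); // true
matchesPath('$.a', ['a', 'b']);   // false: different depth
```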
#### Properties & Methods

* **write(token: TokenType, value: any)** pushes a token into the token parser.
* **end()** closes the token parser so it cannot be used anymore. Throws an error if the token parser was in the middle of parsing.
* **isEnded** read-only boolean property indicating whether the token parser has ended or is still accepting data.
* **onValue(value: any)** no-op method that you should override to get the parsed values.
* **onError(err: Error)** no-op method that you should override to act on errors. If not set, the write method simply throws synchronously.
* **onEnd()** no-op method that you should override to act when the token parser has ended.

```javascript
// You can override the overridable methods by creating your own class extending TokenParser
class MyTokenParser extends TokenParser {
  onValue(value: any) {
    // ...
  }
}

const myTokenParser = new MyTokenParser();

// or just overriding it
const tokenParser = new TokenParser();
tokenParser.onValue = (value) => { ... };
```
### JSONParser

A drop-in replacement for `JSON.parse` (with a few ~~breaking changes~~ improvements; see below).

```javascript
import { JSONParser } from '@streamparser/json';

const parser = new JSONParser();
```

It takes the same options as the tokenizer and the token parser.

This class is just for convenience. In reality, it simply connects the tokenizer and the parser:

```javascript
const tokenizer = new Tokenizer(opts);
const tokenParser = new TokenParser();
tokenizer.onToken = tokenParser.write.bind(tokenParser);
tokenParser.onValue = (value) => { /* Process values */ }
```

#### Properties & Methods

* **write(data: string|TypedArray|Buffer)** alias to the Tokenizer write method.
* **end()** alias to the Tokenizer end method.
* **isEnded** read-only boolean property indicating whether the JSONParser has ended or is still accepting data.
* **onToken({ token: TokenType, value: any, offset: number })** alias to the Tokenizer onToken method (write only).
* **onValue(value: any)** alias to the TokenParser onValue method (write only).
* **onError(err: Error)** alias to the Tokenizer/TokenParser onError method (write only).
* **onEnd()** alias to the Tokenizer onEnd method (which will call the TokenParser onEnd method) (write only).
```javascript
// You can override the overridable methods by creating your own class extending JSONParser
class MyJsonParser extends JSONParser {
  onToken({ token, value, offset }) {
    // ...
  }
  onValue(value: any) {
    // ...
  }
}

const myJsonParser = new MyJsonParser();

// or just overriding it
const jsonParser = new JSONParser();
jsonParser.onToken = ({ token, value, offset }) => { ... };
jsonParser.onValue = (value) => { ... };
```
## Usage

You can use both components independently:

```javascript
const tokenizer = new Tokenizer(opts);
const tokenParser = new TokenParser();
tokenizer.onToken = tokenParser.write.bind(tokenParser);
```

You push data using the `write` method, which takes a string or an array-like object.

You can subscribe to the resulting data using the `onValue` callback:

```javascript
import { JSONParser } from '@streamparser/json';

const parser = new JSONParser({ stringBufferSize: undefined, paths: ['$'] });
parser.onValue = console.log;

parser.write('"Hello world!"'); // logs "Hello world!"

// Or passing the stream in several chunks
parser.write('"');
parser.write('Hello');
parser.write(' ');
parser.write('world!');
parser.write('"'); // logs "Hello world!"
```

Write is always a synchronous operation, so any error during the parsing of the stream will be thrown during the write operation. After an error, the parser can't continue parsing.
```javascript
import { JSONParser } from '@streamparser/json';

const parser = new JSONParser({ stringBufferSize: undefined });
parser.onValue = console.log;

try {
  parser.write('"""');
} catch (err) {
  console.log(err); // logs the parsing error
}
```

You can also handle errors using callbacks:

```javascript
import { JSONParser } from '@streamparser/json';

const parser = new JSONParser({ stringBufferSize: undefined });
parser.onValue = console.log;
parser.onError = console.error;

parser.write('"""');
```
## Examples

### Stream-parsing a fetch request returning a JSON stream

Imagine an endpoint that sends a large number of JSON objects one after the other (`{"id":1}{"id":2}{"id":3}...`).

```js
import { JSONParser } from '@streamparser/json';

const parser = new JSONParser();
parser.onValue = ({ value, key, parent, stack }) => {
  if (stack.length > 0) return; // ignore inner values
  // TODO process element
};

const response = await fetch('http://example.com/');
const reader = response.body.getReader();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  parser.write(value);
}
```
### Stream-parsing a fetch request returning a JSON array
|
||||
|
||||
Imagine an endpoint that send a large amount of JSON objects one after the other (`[{"id":1},{"id":2},{"id":3},...]`).
|
||||
|
||||
```js
|
||||
import { JSONParser } from '@streamparser/json';
|
||||
|
||||
const jsonparser = new JSONParser({ stringBufferSize: undefined, paths: ['$.*'] });
|
||||
jsonparser.onValue = ({ value, key, parent, stack }) => {
|
||||
// TODO process element
|
||||
};
|
||||
|
||||
const response = await fetch('http://example.com/');
|
||||
const reader = response.body.getReader();
|
||||
while(true) {
|
||||
const { done, value } = await reader.read();
|
||||
if (done) break;
|
||||
jsonparser.write(value);
|
||||
}
|
||||
```

### Stream-parsing a fetch request returning a very long string, getting previews of the string

Imagine an endpoint that sends a very long JSON string in chunks (`"Once upon a midnight <...>"`).

```js
import { JSONParser } from '@streamparser/json';

const jsonparser = new JSONParser({ emitPartialTokens: true, emitPartialValues: true });
jsonparser.onValue = ({ value, key, parent, stack, partial }) => {
  if (partial) {
    console.log(`Parsing value: ${value}... (still parsing)`);
  } else {
    console.log(`Value parsed: ${value}`);
  }
};

const response = await fetch('http://example.com/');
const reader = response.body.getReader();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  jsonparser.write(value);
}
```
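
The partial-emission idea can be sketched without the library: report a preview after each chunk, then a final value once the input is complete (a toy illustration, not the library's internals):

```javascript
// Toy partial emitter: reports the string accumulated so far after each
// chunk, marking every report except the last one as partial.
function collectWithPreviews(chunks) {
  const events = [];
  let acc = '';
  for (let i = 0; i < chunks.length; i += 1) {
    acc += chunks[i];
    events.push({ value: acc, partial: i < chunks.length - 1 });
  }
  return events;
}

const events = collectWithPreviews(['Once upon', ' a midnight']);
console.log(events);
// [ { value: 'Once upon', partial: true },
//   { value: 'Once upon a midnight', partial: false } ]
```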

## Migration guide

### Upgrading from 0.10 to 0.11

The arguments of the callbacks have been objectified.

What used to be

```js
jsonparser.onToken = (token, value) => {
  // TODO process token
};
jsonparser.onValue = (value, key, parent, stack) => {
  // TODO process element
};
```

is now:

```js
jsonparser.onToken = ({ token, value }) => {
  // TODO process token
};
jsonparser.onValue = ({ value, key, parent, stack }) => {
  // TODO process element
};
```
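
If you maintain code written against the 0.10 positional signature, a small adapter can bridge it to the 0.11 object style (a hypothetical helper for illustration, not part of the library):

```javascript
// Hypothetical adapter: wraps a 0.10-style positional onValue callback so it
// can be assigned where 0.11 expects an object-style callback.
function adaptOnValue(positionalCb) {
  return ({ value, key, parent, stack }) => positionalCb(value, key, parent, stack);
}

// Usage: the legacy callback stays unchanged; the parser gets the new signature.
const legacy = (value, key) => `${key}=${JSON.stringify(value)}`;
const adapted = adaptOnValue(legacy);
console.log(adapted({ value: 42, key: 'answer', parent: {}, stack: [] })); // "answer=42"
```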

## License

See [LICENSE.md](../../LICENSE).

[npm-version-badge]: https://badge.fury.io/js/@streamparser%2Fjson.svg
[npm-badge-url]: https://www.npmjs.com/package/@streamparser/json
[npm-downloads-badge]: https://img.shields.io/npm/dm/@streamparser%2Fjson.svg
[build-status-badge]: https://github.com/juanjoDiaz/streamparser-json/actions/workflows/on-push.yaml/badge.svg
[build-status-url]: https://github.com/juanjoDiaz/streamparser-json/actions/workflows/on-push.yaml
[coverage-status-badge]: https://coveralls.io/repos/github/juanjoDiaz/streamparser-json/badge.svg?branch=main
[coverage-status-url]: https://coveralls.io/github/juanjoDiaz/streamparser-json?branch=main
21
dev/env/node_modules/@streamparser/json/dist/deno/index.ts
generated
vendored
Executable file
@@ -0,0 +1,21 @@
export { default as JSONParser, type JSONParserOptions } from "./jsonparser.ts";
export {
  default as Tokenizer,
  type TokenizerOptions,
  TokenizerError,
} from "./tokenizer.ts";
export {
  default as TokenParser,
  type TokenParserOptions,
  TokenParserError,
} from "./tokenparser.ts";

export * as utf8 from "./utils/utf-8.ts";
export * as JsonTypes from "./utils/types/jsonTypes.ts";
export * as ParsedTokenInfo from "./utils/types/parsedTokenInfo.ts";
export * as ParsedElementInfo from "./utils/types/parsedElementInfo.ts";
export {
  TokenParserMode,
  type StackElement,
} from "./utils/types/stackElement.ts";
export { default as TokenType } from "./utils/types/tokenType.ts";
62
dev/env/node_modules/@streamparser/json/dist/deno/jsonparser.ts
generated
vendored
Executable file
@@ -0,0 +1,62 @@
import Tokenizer, { type TokenizerOptions } from "./tokenizer.ts";
import TokenParser, { type TokenParserOptions } from "./tokenparser.ts";
import type { ParsedElementInfo } from "./utils/types/parsedElementInfo.ts";
import type { ParsedTokenInfo } from "./utils/types/parsedTokenInfo.ts";

export interface JSONParserOptions
  extends TokenizerOptions,
    TokenParserOptions {}

export default class JSONParser {
  private tokenizer: Tokenizer;
  private tokenParser: TokenParser;

  constructor(opts: JSONParserOptions = {}) {
    this.tokenizer = new Tokenizer(opts);
    this.tokenParser = new TokenParser(opts);

    this.tokenizer.onToken = this.tokenParser.write.bind(this.tokenParser);
    this.tokenizer.onEnd = () => {
      if (!this.tokenParser.isEnded) this.tokenParser.end();
    };

    this.tokenParser.onError = this.tokenizer.error.bind(this.tokenizer);
    this.tokenParser.onEnd = () => {
      if (!this.tokenizer.isEnded) this.tokenizer.end();
    };
  }

  public get isEnded(): boolean {
    return this.tokenizer.isEnded && this.tokenParser.isEnded;
  }

  public write(input: Iterable<number> | string): void {
    this.tokenizer.write(input);
  }

  public end(): void {
    this.tokenizer.end();
  }

  public set onToken(cb: (parsedTokenInfo: ParsedTokenInfo) => void) {
    this.tokenizer.onToken = (parsedToken) => {
      cb(parsedToken);
      this.tokenParser.write(parsedToken);
    };
  }

  public set onValue(cb: (parsedElementInfo: ParsedElementInfo) => void) {
    this.tokenParser.onValue = cb;
  }

  public set onError(cb: (err: Error) => void) {
    this.tokenizer.onError = cb;
  }

  public set onEnd(cb: () => void) {
    this.tokenParser.onEnd = () => {
      if (!this.tokenizer.isEnded) this.tokenizer.end();
      cb.call(this.tokenParser);
    };
  }
}
851
dev/env/node_modules/@streamparser/json/dist/deno/tokenizer.ts
generated
vendored
Executable file
@@ -0,0 +1,851 @@
import { charset, escapedSequences } from "./utils/utf-8.ts";
import {
  type StringBuilder,
  NonBufferedString,
  BufferedString,
} from "./utils/bufferedString.ts";
import TokenType from "./utils/types/tokenType.ts";
import type { ParsedTokenInfo } from "./utils/types/parsedTokenInfo.ts";

// Tokenizer States
const enum TokenizerStates {
  START,
  ENDED,
  ERROR,
  TRUE1,
  TRUE2,
  TRUE3,
  FALSE1,
  FALSE2,
  FALSE3,
  FALSE4,
  NULL1,
  NULL2,
  NULL3,
  STRING_DEFAULT,
  STRING_AFTER_BACKSLASH,
  STRING_UNICODE_DIGIT_1,
  STRING_UNICODE_DIGIT_2,
  STRING_UNICODE_DIGIT_3,
  STRING_UNICODE_DIGIT_4,
  STRING_INCOMPLETE_CHAR,
  NUMBER_AFTER_INITIAL_MINUS,
  NUMBER_AFTER_INITIAL_ZERO,
  NUMBER_AFTER_INITIAL_NON_ZERO,
  NUMBER_AFTER_FULL_STOP,
  NUMBER_AFTER_DECIMAL,
  NUMBER_AFTER_E,
  NUMBER_AFTER_E_AND_SIGN,
  NUMBER_AFTER_E_AND_DIGIT,
  SEPARATOR,
  BOM_OR_START,
  BOM,
}

function TokenizerStateToString(tokenizerState: TokenizerStates): string {
  return [
    "START",
    "ENDED",
    "ERROR",
    "TRUE1",
    "TRUE2",
    "TRUE3",
    "FALSE1",
    "FALSE2",
    "FALSE3",
    "FALSE4",
    "NULL1",
    "NULL2",
    "NULL3",
    "STRING_DEFAULT",
    "STRING_AFTER_BACKSLASH",
    "STRING_UNICODE_DIGIT_1",
    "STRING_UNICODE_DIGIT_2",
    "STRING_UNICODE_DIGIT_3",
    "STRING_UNICODE_DIGIT_4",
    "STRING_INCOMPLETE_CHAR",
    "NUMBER_AFTER_INITIAL_MINUS",
    "NUMBER_AFTER_INITIAL_ZERO",
    "NUMBER_AFTER_INITIAL_NON_ZERO",
    "NUMBER_AFTER_FULL_STOP",
    "NUMBER_AFTER_DECIMAL",
    "NUMBER_AFTER_E",
    "NUMBER_AFTER_E_AND_SIGN",
    "NUMBER_AFTER_E_AND_DIGIT",
    "SEPARATOR",
    "BOM_OR_START",
    "BOM",
  ][tokenizerState];
}

export interface TokenizerOptions {
  stringBufferSize?: number;
  numberBufferSize?: number;
  separator?: string;
  emitPartialTokens?: boolean;
}

const defaultOpts: TokenizerOptions = {
  stringBufferSize: 0,
  numberBufferSize: 0,
  separator: undefined,
  emitPartialTokens: false,
};

export class TokenizerError extends Error {
  constructor(message: string) {
    super(message);
    // Typescript is broken. This is a workaround
    Object.setPrototypeOf(this, TokenizerError.prototype);
  }
}

export default class Tokenizer {
  private state = TokenizerStates.BOM_OR_START;

  private bom?: number[];
  private bomIndex = 0;

  private emitPartialTokens: boolean;
  private separator?: string;
  private separatorBytes?: Uint8Array;
  private separatorIndex = 0;
  private escapedCharsByteLength = 0;
  private bufferedString: StringBuilder;
  private bufferedNumber: StringBuilder;

  private unicode?: string; // unicode escapes
  private highSurrogate?: number;
  private bytes_remaining = 0; // number of bytes remaining in multi byte utf8 char to read after split boundary
  private bytes_in_sequence = 0; // bytes in multi byte utf8 char to read
  private char_split_buffer = new Uint8Array(4); // for rebuilding chars split before boundary is reached
  private encoder = new TextEncoder();
  private offset = -1;

  constructor(opts?: TokenizerOptions) {
    opts = { ...defaultOpts, ...opts };

    this.emitPartialTokens = opts.emitPartialTokens === true;
    this.bufferedString =
      opts.stringBufferSize && opts.stringBufferSize > 4
        ? new BufferedString(opts.stringBufferSize)
        : new NonBufferedString();
    this.bufferedNumber =
      opts.numberBufferSize && opts.numberBufferSize > 0
        ? new BufferedString(opts.numberBufferSize)
        : new NonBufferedString();

    this.separator = opts.separator;
    this.separatorBytes = opts.separator
      ? this.encoder.encode(opts.separator)
      : undefined;
  }

  public get isEnded(): boolean {
    return this.state === TokenizerStates.ENDED;
  }
|
||||
|
||||
public write(input: Iterable<number> | string): void {
|
||||
try {
|
||||
let buffer: Uint8Array;
|
||||
if (input instanceof Uint8Array) {
|
||||
buffer = input;
|
||||
} else if (typeof input === "string") {
|
||||
buffer = this.encoder.encode(input);
|
||||
} else if (Array.isArray(input)) {
|
||||
buffer = Uint8Array.from(input);
|
||||
} else if (ArrayBuffer.isView(input)) {
|
||||
buffer = new Uint8Array(
|
||||
input.buffer,
|
||||
input.byteOffset,
|
||||
input.byteLength,
|
||||
);
|
||||
} else {
|
||||
throw new TypeError(
|
||||
"Unexpected type. The `write` function only accepts Arrays, TypedArrays and Strings.",
|
||||
);
|
||||
}
|
||||
|
||||
for (let i = 0; i < buffer.length; i += 1) {
|
||||
const n = buffer[i]; // get current byte from buffer
|
||||
switch (this.state) {
|
||||
// @ts-expect-error fall through case
|
||||
case TokenizerStates.BOM_OR_START:
|
||||
if (input instanceof Uint8Array && n === 0xef) {
|
||||
this.bom = [0xef, 0xbb, 0xbf];
|
||||
this.bomIndex += 1;
|
||||
this.state = TokenizerStates.BOM;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (input instanceof Uint16Array) {
|
||||
if (n === 0xfe) {
|
||||
this.bom = [0xfe, 0xff];
|
||||
this.bomIndex += 1;
|
||||
this.state = TokenizerStates.BOM;
|
||||
continue;
|
||||
}
|
||||
if (n === 0xff) {
|
||||
this.bom = [0xff, 0xfe];
|
||||
this.bomIndex += 1;
|
||||
this.state = TokenizerStates.BOM;
|
||||
continue;
|
||||
}
|
||||
}
|
||||
|
||||
if (input instanceof Uint32Array) {
|
||||
if (n === 0x00) {
|
||||
this.bom = [0x00, 0x00, 0xfe, 0xff];
|
||||
this.bomIndex += 1;
|
||||
this.state = TokenizerStates.BOM;
|
||||
continue;
|
||||
}
|
||||
if (n === 0xff) {
|
||||
this.bom = [0xff, 0xfe, 0x00, 0x00];
|
||||
this.bomIndex += 1;
|
||||
this.state = TokenizerStates.BOM;
|
||||
continue;
|
||||
}
|
||||
}
|
||||
// eslint-disable-next-line no-fallthrough
|
||||
case TokenizerStates.START:
|
||||
this.offset += 1;
|
||||
|
||||
if (this.separatorBytes && n === this.separatorBytes[0]) {
|
||||
if (this.separatorBytes.length === 1) {
|
||||
this.state = TokenizerStates.START;
|
||||
this.onToken({
|
||||
token: TokenType.SEPARATOR,
|
||||
value: this.separator as string,
|
||||
offset: this.offset + this.separatorBytes.length - 1,
|
||||
});
|
||||
continue;
|
||||
}
|
||||
this.state = TokenizerStates.SEPARATOR;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (
|
||||
n === charset.SPACE ||
|
||||
n === charset.NEWLINE ||
|
||||
n === charset.CARRIAGE_RETURN ||
|
||||
n === charset.TAB
|
||||
) {
|
||||
// whitespace
|
||||
continue;
|
||||
}
|
||||
|
||||
if (n === charset.LEFT_CURLY_BRACKET) {
|
||||
this.onToken({
|
||||
token: TokenType.LEFT_BRACE,
|
||||
value: "{",
|
||||
offset: this.offset,
|
||||
});
|
||||
continue;
|
||||
}
|
||||
if (n === charset.RIGHT_CURLY_BRACKET) {
|
||||
this.onToken({
|
||||
token: TokenType.RIGHT_BRACE,
|
||||
value: "}",
|
||||
offset: this.offset,
|
||||
});
|
||||
continue;
|
||||
}
|
||||
if (n === charset.LEFT_SQUARE_BRACKET) {
|
||||
this.onToken({
|
||||
token: TokenType.LEFT_BRACKET,
|
||||
value: "[",
|
||||
offset: this.offset,
|
||||
});
|
||||
continue;
|
||||
}
|
||||
if (n === charset.RIGHT_SQUARE_BRACKET) {
|
||||
this.onToken({
|
||||
token: TokenType.RIGHT_BRACKET,
|
||||
value: "]",
|
||||
offset: this.offset,
|
||||
});
|
||||
continue;
|
||||
}
|
||||
if (n === charset.COLON) {
|
||||
this.onToken({
|
||||
token: TokenType.COLON,
|
||||
value: ":",
|
||||
offset: this.offset,
|
||||
});
|
||||
continue;
|
||||
}
|
||||
if (n === charset.COMMA) {
|
||||
this.onToken({
|
||||
token: TokenType.COMMA,
|
||||
value: ",",
|
||||
offset: this.offset,
|
||||
});
|
||||
continue;
|
||||
}
|
||||
|
||||
if (n === charset.LATIN_SMALL_LETTER_T) {
|
||||
this.state = TokenizerStates.TRUE1;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (n === charset.LATIN_SMALL_LETTER_F) {
|
||||
this.state = TokenizerStates.FALSE1;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (n === charset.LATIN_SMALL_LETTER_N) {
|
||||
this.state = TokenizerStates.NULL1;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (n === charset.QUOTATION_MARK) {
|
||||
this.bufferedString.reset();
|
||||
this.escapedCharsByteLength = 0;
|
||||
this.state = TokenizerStates.STRING_DEFAULT;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (n >= charset.DIGIT_ONE && n <= charset.DIGIT_NINE) {
|
||||
this.bufferedNumber.reset();
|
||||
this.bufferedNumber.appendChar(n);
|
||||
this.state = TokenizerStates.NUMBER_AFTER_INITIAL_NON_ZERO;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (n === charset.DIGIT_ZERO) {
|
||||
this.bufferedNumber.reset();
|
||||
this.bufferedNumber.appendChar(n);
|
||||
this.state = TokenizerStates.NUMBER_AFTER_INITIAL_ZERO;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (n === charset.HYPHEN_MINUS) {
|
||||
this.bufferedNumber.reset();
|
||||
this.bufferedNumber.appendChar(n);
|
||||
this.state = TokenizerStates.NUMBER_AFTER_INITIAL_MINUS;
|
||||
continue;
|
||||
}
|
||||
|
||||
break;
|
||||
// STRING
|
||||
case TokenizerStates.STRING_DEFAULT:
|
||||
if (n === charset.QUOTATION_MARK) {
|
||||
const string = this.bufferedString.toString();
|
||||
this.state = TokenizerStates.START;
|
||||
this.onToken({
|
||||
token: TokenType.STRING,
|
||||
value: string,
|
||||
offset: this.offset,
|
||||
});
|
||||
this.offset +=
|
||||
this.escapedCharsByteLength +
|
||||
this.bufferedString.byteLength +
|
||||
1;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (n === charset.REVERSE_SOLIDUS) {
|
||||
this.state = TokenizerStates.STRING_AFTER_BACKSLASH;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (n >= 128) {
|
||||
// Parse multi byte (>=128) chars one at a time
|
||||
if (n >= 194 && n <= 223) {
|
||||
this.bytes_in_sequence = 2;
|
||||
} else if (n <= 239) {
|
||||
this.bytes_in_sequence = 3;
|
||||
} else {
|
||||
this.bytes_in_sequence = 4;
|
||||
}
|
||||
|
||||
if (this.bytes_in_sequence <= buffer.length - i) {
|
||||
// if bytes needed to complete char fall outside buffer length, we have a boundary split
|
||||
this.bufferedString.appendBuf(
|
||||
buffer,
|
||||
i,
|
||||
i + this.bytes_in_sequence,
|
||||
);
|
||||
i += this.bytes_in_sequence - 1;
|
||||
continue;
|
||||
}
|
||||
|
||||
this.bytes_remaining = i + this.bytes_in_sequence - buffer.length;
|
||||
this.char_split_buffer.set(buffer.subarray(i));
|
||||
i = buffer.length - 1;
|
||||
this.state = TokenizerStates.STRING_INCOMPLETE_CHAR;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (n >= charset.SPACE) {
|
||||
this.bufferedString.appendChar(n);
|
||||
continue;
|
||||
}
|
||||
|
||||
break;
|
||||
case TokenizerStates.STRING_INCOMPLETE_CHAR:
|
||||
// check for carry over of a multi byte char split between data chunks
|
||||
// & fill temp buffer it with start of this data chunk up to the boundary limit set in the last iteration
|
||||
this.char_split_buffer.set(
|
||||
buffer.subarray(i, i + this.bytes_remaining),
|
||||
this.bytes_in_sequence - this.bytes_remaining,
|
||||
);
|
||||
this.bufferedString.appendBuf(
|
||||
this.char_split_buffer,
|
||||
0,
|
||||
this.bytes_in_sequence,
|
||||
);
|
||||
i = this.bytes_remaining - 1;
|
||||
this.state = TokenizerStates.STRING_DEFAULT;
|
||||
continue;
|
||||
case TokenizerStates.STRING_AFTER_BACKSLASH:
|
||||
// eslint-disable-next-line no-case-declarations
|
||||
const controlChar = escapedSequences[n];
|
||||
if (controlChar) {
|
||||
this.bufferedString.appendChar(controlChar);
|
||||
this.escapedCharsByteLength += 1; // len(\")=2 minus the fact you're appending len(controlChar)=1
|
||||
this.state = TokenizerStates.STRING_DEFAULT;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (n === charset.LATIN_SMALL_LETTER_U) {
|
||||
this.unicode = "";
|
||||
this.state = TokenizerStates.STRING_UNICODE_DIGIT_1;
|
||||
continue;
|
||||
}
|
||||
|
||||
break;
|
||||
case TokenizerStates.STRING_UNICODE_DIGIT_1:
|
||||
case TokenizerStates.STRING_UNICODE_DIGIT_2:
|
||||
case TokenizerStates.STRING_UNICODE_DIGIT_3:
|
||||
if (
|
||||
(n >= charset.DIGIT_ZERO && n <= charset.DIGIT_NINE) ||
|
||||
(n >= charset.LATIN_CAPITAL_LETTER_A &&
|
||||
n <= charset.LATIN_CAPITAL_LETTER_F) ||
|
||||
(n >= charset.LATIN_SMALL_LETTER_A &&
|
||||
n <= charset.LATIN_SMALL_LETTER_F)
|
||||
) {
|
||||
this.unicode += String.fromCharCode(n);
|
||||
this.state += 1;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
case TokenizerStates.STRING_UNICODE_DIGIT_4:
|
||||
if (
|
||||
(n >= charset.DIGIT_ZERO && n <= charset.DIGIT_NINE) ||
|
||||
(n >= charset.LATIN_CAPITAL_LETTER_A &&
|
||||
n <= charset.LATIN_CAPITAL_LETTER_F) ||
|
||||
(n >= charset.LATIN_SMALL_LETTER_A &&
|
||||
n <= charset.LATIN_SMALL_LETTER_F)
|
||||
) {
|
||||
const intVal = parseInt(
|
||||
this.unicode + String.fromCharCode(n),
|
||||
16,
|
||||
);
|
||||
let unicodeString: string;
|
||||
if (this.highSurrogate === undefined) {
|
||||
if (intVal >= 0xd800 && intVal <= 0xdbff) {
|
||||
//<55296,56319> - highSurrogate
|
||||
this.highSurrogate = intVal;
|
||||
this.state = TokenizerStates.STRING_DEFAULT;
|
||||
continue;
|
||||
} else {
|
||||
unicodeString = String.fromCharCode(intVal);
|
||||
}
|
||||
} else {
|
||||
if (intVal >= 0xdc00 && intVal <= 0xdfff) {
|
||||
//<56320,57343> - lowSurrogate
|
||||
unicodeString = String.fromCharCode(
|
||||
this.highSurrogate,
|
||||
intVal,
|
||||
);
|
||||
} else {
|
||||
unicodeString = String.fromCharCode(this.highSurrogate);
|
||||
}
|
||||
this.highSurrogate = undefined;
|
||||
}
|
||||
const unicodeBuffer = this.encoder.encode(unicodeString);
|
||||
this.bufferedString.appendBuf(unicodeBuffer);
|
||||
// len(\u0000)=6 minus the fact you're appending len(buf)
|
||||
this.escapedCharsByteLength += 6 - unicodeBuffer.byteLength;
|
||||
this.state = TokenizerStates.STRING_DEFAULT;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
// Number
|
||||
case TokenizerStates.NUMBER_AFTER_INITIAL_MINUS:
|
||||
if (n === charset.DIGIT_ZERO) {
|
||||
this.bufferedNumber.appendChar(n);
|
||||
this.state = TokenizerStates.NUMBER_AFTER_INITIAL_ZERO;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (n >= charset.DIGIT_ONE && n <= charset.DIGIT_NINE) {
|
||||
this.bufferedNumber.appendChar(n);
|
||||
this.state = TokenizerStates.NUMBER_AFTER_INITIAL_NON_ZERO;
|
||||
continue;
|
||||
}
|
||||
|
||||
break;
|
||||
case TokenizerStates.NUMBER_AFTER_INITIAL_ZERO:
|
||||
if (n === charset.FULL_STOP) {
|
||||
this.bufferedNumber.appendChar(n);
|
||||
this.state = TokenizerStates.NUMBER_AFTER_FULL_STOP;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (
|
||||
n === charset.LATIN_SMALL_LETTER_E ||
|
||||
n === charset.LATIN_CAPITAL_LETTER_E
|
||||
) {
|
||||
this.bufferedNumber.appendChar(n);
|
||||
this.state = TokenizerStates.NUMBER_AFTER_E;
|
||||
continue;
|
||||
}
|
||||
|
||||
i -= 1;
|
||||
this.state = TokenizerStates.START;
|
||||
this.emitNumber();
|
||||
continue;
|
||||
case TokenizerStates.NUMBER_AFTER_INITIAL_NON_ZERO:
|
||||
if (n >= charset.DIGIT_ZERO && n <= charset.DIGIT_NINE) {
|
||||
this.bufferedNumber.appendChar(n);
|
||||
continue;
|
||||
}
|
||||
|
||||
if (n === charset.FULL_STOP) {
|
||||
this.bufferedNumber.appendChar(n);
|
||||
this.state = TokenizerStates.NUMBER_AFTER_FULL_STOP;
|
||||
continue;
|
||||
}
|
||||
|
||||
if (
|
||||
n === charset.LATIN_SMALL_LETTER_E ||
|
||||
n === charset.LATIN_CAPITAL_LETTER_E
|
||||
) {
|
||||
this.bufferedNumber.appendChar(n);
|
||||
this.state = TokenizerStates.NUMBER_AFTER_E;
|
||||
continue;
|
||||
}
|
||||
|
||||
i -= 1;
|
||||
this.state = TokenizerStates.START;
|
||||
this.emitNumber();
|
||||
continue;
|
||||
case TokenizerStates.NUMBER_AFTER_FULL_STOP:
|
||||
if (n >= charset.DIGIT_ZERO && n <= charset.DIGIT_NINE) {
|
||||
this.bufferedNumber.appendChar(n);
|
||||
this.state = TokenizerStates.NUMBER_AFTER_DECIMAL;
|
||||
continue;
|
||||
}
|
||||
|
||||
break;
|
||||
case TokenizerStates.NUMBER_AFTER_DECIMAL:
|
||||
if (n >= charset.DIGIT_ZERO && n <= charset.DIGIT_NINE) {
|
||||
this.bufferedNumber.appendChar(n);
|
||||
continue;
|
||||
}
|
||||
|
||||
if (
|
||||
n === charset.LATIN_SMALL_LETTER_E ||
|
||||
n === charset.LATIN_CAPITAL_LETTER_E
|
||||
) {
|
||||
this.bufferedNumber.appendChar(n);
|
||||
this.state = TokenizerStates.NUMBER_AFTER_E;
|
||||
continue;
|
||||
}
|
||||
|
||||
i -= 1;
|
||||
this.state = TokenizerStates.START;
|
||||
this.emitNumber();
|
||||
continue;
|
||||
// @ts-expect-error fall through case
|
||||
case TokenizerStates.NUMBER_AFTER_E:
|
||||
if (n === charset.PLUS_SIGN || n === charset.HYPHEN_MINUS) {
|
||||
this.bufferedNumber.appendChar(n);
|
||||
this.state = TokenizerStates.NUMBER_AFTER_E_AND_SIGN;
|
||||
continue;
|
||||
}
|
||||
// eslint-disable-next-line no-fallthrough
|
||||
case TokenizerStates.NUMBER_AFTER_E_AND_SIGN:
|
||||
if (n >= charset.DIGIT_ZERO && n <= charset.DIGIT_NINE) {
|
||||
this.bufferedNumber.appendChar(n);
|
||||
this.state = TokenizerStates.NUMBER_AFTER_E_AND_DIGIT;
|
||||
continue;
|
||||
}
|
||||
|
||||
break;
|
||||
case TokenizerStates.NUMBER_AFTER_E_AND_DIGIT:
|
||||
if (n >= charset.DIGIT_ZERO && n <= charset.DIGIT_NINE) {
|
||||
this.bufferedNumber.appendChar(n);
|
||||
continue;
|
||||
}
|
||||
|
||||
i -= 1;
|
||||
this.state = TokenizerStates.START;
|
||||
this.emitNumber();
|
||||
continue;
|
||||
// TRUE
|
||||
case TokenizerStates.TRUE1:
|
||||
if (n === charset.LATIN_SMALL_LETTER_R) {
|
||||
this.state = TokenizerStates.TRUE2;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
case TokenizerStates.TRUE2:
|
||||
if (n === charset.LATIN_SMALL_LETTER_U) {
|
||||
this.state = TokenizerStates.TRUE3;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
case TokenizerStates.TRUE3:
|
||||
if (n === charset.LATIN_SMALL_LETTER_E) {
|
||||
this.state = TokenizerStates.START;
|
||||
this.onToken({
|
||||
token: TokenType.TRUE,
|
||||
value: true,
|
||||
offset: this.offset,
|
||||
});
|
||||
this.offset += 3;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
// FALSE
|
||||
case TokenizerStates.FALSE1:
|
||||
if (n === charset.LATIN_SMALL_LETTER_A) {
|
||||
this.state = TokenizerStates.FALSE2;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
case TokenizerStates.FALSE2:
|
||||
if (n === charset.LATIN_SMALL_LETTER_L) {
|
||||
this.state = TokenizerStates.FALSE3;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
case TokenizerStates.FALSE3:
|
||||
if (n === charset.LATIN_SMALL_LETTER_S) {
|
||||
this.state = TokenizerStates.FALSE4;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
case TokenizerStates.FALSE4:
|
||||
if (n === charset.LATIN_SMALL_LETTER_E) {
|
||||
this.state = TokenizerStates.START;
|
||||
this.onToken({
|
||||
token: TokenType.FALSE,
|
||||
value: false,
|
||||
offset: this.offset,
|
||||
});
|
||||
this.offset += 4;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
// NULL
|
||||
case TokenizerStates.NULL1:
|
||||
if (n === charset.LATIN_SMALL_LETTER_U) {
|
||||
this.state = TokenizerStates.NULL2;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
case TokenizerStates.NULL2:
|
||||
if (n === charset.LATIN_SMALL_LETTER_L) {
|
||||
this.state = TokenizerStates.NULL3;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
case TokenizerStates.NULL3:
|
||||
if (n === charset.LATIN_SMALL_LETTER_L) {
|
||||
this.state = TokenizerStates.START;
|
||||
this.onToken({
|
||||
token: TokenType.NULL,
|
||||
value: null,
|
||||
offset: this.offset,
|
||||
});
|
||||
this.offset += 3;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
case TokenizerStates.SEPARATOR:
|
||||
this.separatorIndex += 1;
|
||||
if (
|
||||
!this.separatorBytes ||
|
||||
n !== this.separatorBytes[this.separatorIndex]
|
||||
) {
|
||||
break;
|
||||
}
|
||||
if (this.separatorIndex === this.separatorBytes.length - 1) {
|
||||
this.state = TokenizerStates.START;
|
||||
this.onToken({
|
||||
token: TokenType.SEPARATOR,
|
||||
value: this.separator as string,
|
||||
offset: this.offset + this.separatorIndex,
|
||||
});
|
||||
this.separatorIndex = 0;
|
||||
}
|
||||
continue;
|
||||
// BOM support
|
||||
case TokenizerStates.BOM:
|
||||
if (n === this.bom![this.bomIndex]) {
|
||||
if (this.bomIndex === this.bom!.length - 1) {
|
||||
this.state = TokenizerStates.START;
|
||||
this.bom = undefined;
|
||||
this.bomIndex = 0;
|
||||
continue;
|
||||
}
|
||||
this.bomIndex += 1;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
case TokenizerStates.ENDED:
|
||||
if (
|
||||
n === charset.SPACE ||
|
||||
n === charset.NEWLINE ||
|
||||
n === charset.CARRIAGE_RETURN ||
|
||||
n === charset.TAB
|
||||
) {
|
||||
// whitespace
|
||||
continue;
|
||||
}
|
||||
}
|
||||
|
||||
throw new TokenizerError(
|
||||
`Unexpected "${String.fromCharCode(
|
||||
n,
|
||||
)}" at position "${i}" in state ${TokenizerStateToString(
|
||||
this.state,
|
||||
)}`,
|
||||
);
|
||||
}
|
||||
|
||||
if (this.emitPartialTokens) {
|
||||
switch (this.state) {
|
||||
case TokenizerStates.TRUE1:
|
||||
case TokenizerStates.TRUE2:
|
||||
case TokenizerStates.TRUE3:
|
||||
this.onToken({
|
||||
token: TokenType.TRUE,
|
||||
value: true,
|
||||
offset: this.offset,
|
||||
partial: true,
|
||||
});
|
||||
break;
|
||||
case TokenizerStates.FALSE1:
|
||||
case TokenizerStates.FALSE2:
|
||||
case TokenizerStates.FALSE3:
|
||||
case TokenizerStates.FALSE4:
|
||||
this.onToken({
|
||||
token: TokenType.FALSE,
|
||||
value: false,
|
||||
offset: this.offset,
|
||||
partial: true,
|
||||
});
|
||||
break;
|
||||
case TokenizerStates.NULL1:
|
||||
case TokenizerStates.NULL2:
|
||||
case TokenizerStates.NULL3:
|
||||
this.onToken({
|
||||
token: TokenType.NULL,
|
||||
value: null,
|
||||
offset: this.offset,
|
||||
partial: true,
|
||||
});
|
||||
break;
|
||||
case TokenizerStates.STRING_DEFAULT: {
|
||||
const string = this.bufferedString.toString();
|
||||
this.onToken({
|
||||
token: TokenType.STRING,
|
||||
value: string,
|
||||
offset: this.offset,
|
||||
partial: true,
|
||||
});
|
||||
break;
|
||||
}
|
||||
case TokenizerStates.NUMBER_AFTER_INITIAL_ZERO:
|
||||
case TokenizerStates.NUMBER_AFTER_INITIAL_NON_ZERO:
|
||||
case TokenizerStates.NUMBER_AFTER_DECIMAL:
|
||||
case TokenizerStates.NUMBER_AFTER_E_AND_DIGIT:
|
||||
try {
|
||||
this.onToken({
|
||||
token: TokenType.NUMBER,
|
||||
value: this.parseNumber(this.bufferedNumber.toString()),
|
||||
offset: this.offset,
|
||||
partial: true,
|
||||
});
|
||||
} catch {
|
||||
// Number couldn't be parsed. Do nothing.
|
||||
}
|
||||
}
|
||||
}
|
||||
} catch (err: unknown) {
|
||||
this.error(err as Error);
|
||||
}
|
||||
}
|
||||
|
||||
  private emitNumber(): void {
    this.onToken({
      token: TokenType.NUMBER,
      value: this.parseNumber(this.bufferedNumber.toString()),
      offset: this.offset,
    });
    this.offset += this.bufferedNumber.byteLength - 1;
  }

  protected parseNumber(numberStr: string): number {
    return Number(numberStr);
  }

  public error(err: Error): void {
    if (this.state !== TokenizerStates.ENDED) {
      this.state = TokenizerStates.ERROR;
    }

    this.onError(err);
  }

  public end(): void {
    switch (this.state) {
      case TokenizerStates.NUMBER_AFTER_INITIAL_ZERO:
      case TokenizerStates.NUMBER_AFTER_INITIAL_NON_ZERO:
      case TokenizerStates.NUMBER_AFTER_DECIMAL:
      case TokenizerStates.NUMBER_AFTER_E_AND_DIGIT:
        this.state = TokenizerStates.ENDED;
        this.emitNumber();
        this.onEnd();
        break;
      case TokenizerStates.BOM_OR_START:
      case TokenizerStates.START:
      case TokenizerStates.ERROR:
      case TokenizerStates.SEPARATOR:
        this.state = TokenizerStates.ENDED;
        this.onEnd();
        break;
      default:
        this.error(
          new TokenizerError(
            `Tokenizer ended in the middle of a token (state: ${TokenizerStateToString(
              this.state,
            )}). Either not all the data was received or the data was invalid.`,
          ),
        );
    }
  }

  // eslint-disable-next-line @typescript-eslint/no-unused-vars
  public onToken(parsedToken: ParsedTokenInfo): void {
    // Override me
    throw new TokenizerError(
      'Can\'t emit tokens before the "onToken" callback has been set up.',
    );
  }

  public onError(err: Error): void {
    // Override me
    throw err;
  }

  public onEnd(): void {
    // Override me
  }
}
|
||||
400
dev/env/node_modules/@streamparser/json/dist/deno/tokenparser.ts
generated
vendored
Executable file
@@ -0,0 +1,400 @@
|
||||
import { charset } from "./utils/utf-8.ts";
import TokenType from "./utils/types/tokenType.ts";
import type {
  JsonPrimitive,
  JsonKey,
  JsonObject,
  JsonArray,
  JsonStruct,
} from "./utils/types/jsonTypes.ts";
import {
  type StackElement,
  TokenParserMode,
} from "./utils/types/stackElement.ts";
import type { ParsedTokenInfo } from "./utils/types/parsedTokenInfo.ts";
import type { ParsedElementInfo } from "./utils/types/parsedElementInfo.ts";

// Parser States
const enum TokenParserState {
  VALUE,
  KEY,
  COLON,
  COMMA,
  ENDED,
  ERROR,
  SEPARATOR,
}

function TokenParserStateToString(state: TokenParserState): string {
  return ["VALUE", "KEY", "COLON", "COMMA", "ENDED", "ERROR", "SEPARATOR"][
    state
  ];
}

export interface TokenParserOptions {
  paths?: string[];
  keepStack?: boolean;
  separator?: string;
  emitPartialValues?: boolean;
}

const defaultOpts: TokenParserOptions = {
  paths: undefined,
  keepStack: true,
  separator: undefined,
  emitPartialValues: false,
};

export class TokenParserError extends Error {
  constructor(message: string) {
    super(message);
    // Typescript is broken. This is a workaround
    Object.setPrototypeOf(this, TokenParserError.prototype);
  }
}

export default class TokenParser {
  private readonly paths?: (string[] | undefined)[];
  private readonly keepStack: boolean;
  private readonly separator?: string;
  private state: TokenParserState = TokenParserState.VALUE;
  private mode: TokenParserMode | undefined = undefined;
  private key: JsonKey = undefined;
  private value: JsonStruct | undefined = undefined;
  private stack: StackElement[] = [];

  constructor(opts?: TokenParserOptions) {
    opts = { ...defaultOpts, ...opts };

    if (opts.paths) {
      this.paths = opts.paths.map((path) => {
        if (path === undefined || path === "$*") return undefined;

        if (!path.startsWith("$"))
          throw new TokenParserError(
            `Invalid selector "${path}". Should start with "$".`,
          );
        const pathParts = path.split(".").slice(1);
        if (pathParts.includes(""))
          throw new TokenParserError(
            `Invalid selector "${path}". ".." syntax not supported.`,
          );
        return pathParts;
      });
    }

    this.keepStack = opts.keepStack || false;
    this.separator = opts.separator;
    if (!opts.emitPartialValues) {
      this.emitPartial = () => {};
    }
  }

  private shouldEmit(): boolean {
    if (!this.paths) return true;

    return this.paths.some((path) => {
      if (path === undefined) return true;
      if (path.length !== this.stack.length) return false;

      for (let i = 0; i < path.length - 1; i++) {
        const selector = path[i];
        const key = this.stack[i + 1].key;
        if (selector === "*") continue;
        if (selector !== key?.toString()) return false;
      }

      const selector = path[path.length - 1];
      if (selector === "*") return true;
      return selector === this.key?.toString();
    });
  }

  private push(): void {
    this.stack.push({
      key: this.key,
      value: this.value as JsonStruct,
      mode: this.mode,
      emit: this.shouldEmit(),
    });
  }

  private pop(): void {
    const value = this.value;

    let emit;
    ({
      key: this.key,
      value: this.value,
      mode: this.mode,
      emit,
    } = this.stack.pop() as StackElement);

    this.state =
      this.mode !== undefined ? TokenParserState.COMMA : TokenParserState.VALUE;

    this.emit(value as JsonPrimitive | JsonStruct, emit);
  }

  private emit(value: JsonPrimitive | JsonStruct, emit: boolean): void {
    if (
      !this.keepStack &&
      this.value &&
      this.stack.every((item) => !item.emit)
    ) {
      // eslint-disable-next-line @typescript-eslint/no-explicit-any
      delete (this.value as JsonStruct as any)[this.key as string | number];
    }

    if (emit) {
      this.onValue({
        value: value,
        key: this.key,
        parent: this.value,
        stack: this.stack,
      });
    }

    if (this.stack.length === 0) {
      if (this.separator) {
        this.state = TokenParserState.SEPARATOR;
      } else if (this.separator === undefined) {
        this.end();
      }
      // else if separator === '', expect next JSON object.
    }
  }

  private emitPartial(value?: JsonPrimitive): void {
    if (!this.shouldEmit()) return;

    if (this.state === TokenParserState.KEY) {
      this.onValue({
        value: undefined,
        key: value as JsonKey,
        parent: this.value,
        stack: this.stack,
        partial: true,
      });
      return;
    }

    this.onValue({
      value: value,
      key: this.key,
      parent: this.value,
      stack: this.stack,
      partial: true,
    });
  }

  public get isEnded(): boolean {
    return this.state === TokenParserState.ENDED;
  }

  public write({
    token,
    value,
    partial,
  }: Omit<ParsedTokenInfo, "offset">): void {
    try {
      if (partial) {
        this.emitPartial(value);
        return;
      }

      if (this.state === TokenParserState.VALUE) {
        if (
          token === TokenType.STRING ||
          token === TokenType.NUMBER ||
          token === TokenType.TRUE ||
          token === TokenType.FALSE ||
          token === TokenType.NULL
        ) {
          if (this.mode === TokenParserMode.OBJECT) {
            (this.value as JsonObject)[this.key as string] = value;
            this.state = TokenParserState.COMMA;
          } else if (this.mode === TokenParserMode.ARRAY) {
            (this.value as JsonArray).push(value);
            this.state = TokenParserState.COMMA;
          }

          this.emit(value, this.shouldEmit());
          return;
        }

        if (token === TokenType.LEFT_BRACE) {
          this.push();
          if (this.mode === TokenParserMode.OBJECT) {
            this.value = (this.value as JsonObject)[this.key as string] = {};
          } else if (this.mode === TokenParserMode.ARRAY) {
            const val = {};
            (this.value as JsonArray).push(val);
            this.value = val;
          } else {
            this.value = {};
          }
          this.mode = TokenParserMode.OBJECT;
          this.state = TokenParserState.KEY;
          this.key = undefined;
          this.emitPartial();
          return;
        }

        if (token === TokenType.LEFT_BRACKET) {
          this.push();
          if (this.mode === TokenParserMode.OBJECT) {
            this.value = (this.value as JsonObject)[this.key as string] = [];
          } else if (this.mode === TokenParserMode.ARRAY) {
            const val: JsonArray = [];
            (this.value as JsonArray).push(val);
            this.value = val;
          } else {
            this.value = [];
          }
          this.mode = TokenParserMode.ARRAY;
          this.state = TokenParserState.VALUE;
          this.key = 0;
          this.emitPartial();
          return;
        }

        if (
          this.mode === TokenParserMode.ARRAY &&
          token === TokenType.RIGHT_BRACKET &&
          (this.value as JsonArray).length === 0
        ) {
          this.pop();
          return;
        }
      }

      if (this.state === TokenParserState.KEY) {
        if (token === TokenType.STRING) {
          this.key = value as string;
          this.state = TokenParserState.COLON;
          this.emitPartial();
          return;
        }

        if (
          token === TokenType.RIGHT_BRACE &&
          Object.keys(this.value as JsonObject).length === 0
        ) {
          this.pop();
          return;
        }
      }

      if (this.state === TokenParserState.COLON) {
        if (token === TokenType.COLON) {
          this.state = TokenParserState.VALUE;
          return;
        }
      }

      if (this.state === TokenParserState.COMMA) {
        if (token === TokenType.COMMA) {
          if (this.mode === TokenParserMode.ARRAY) {
            this.state = TokenParserState.VALUE;
            (this.key as number) += 1;
            return;
          }

          /* istanbul ignore else */
          if (this.mode === TokenParserMode.OBJECT) {
            this.state = TokenParserState.KEY;
            return;
          }
        }

        if (
          (token === TokenType.RIGHT_BRACE &&
            this.mode === TokenParserMode.OBJECT) ||
          (token === TokenType.RIGHT_BRACKET &&
            this.mode === TokenParserMode.ARRAY)
        ) {
          this.pop();
          return;
        }
      }

      if (this.state === TokenParserState.SEPARATOR) {
        if (token === TokenType.SEPARATOR && value === this.separator) {
          this.state = TokenParserState.VALUE;
          return;
        }
      }

      // Edge case in which the separator is just whitespace and it's found in the middle of the JSON
      if (
        token === TokenType.SEPARATOR &&
        this.state !== TokenParserState.SEPARATOR &&
        Array.from(value as string)
          .map((n) => n.charCodeAt(0))
          .every(
            (n) =>
              n === charset.SPACE ||
              n === charset.NEWLINE ||
              n === charset.CARRIAGE_RETURN ||
              n === charset.TAB,
          )
      ) {
        // whitespace
        return;
      }

      throw new TokenParserError(
        `Unexpected ${TokenType[token]} (${JSON.stringify(
          value,
        )}) in state ${TokenParserStateToString(this.state)}`,
      );
    } catch (err: unknown) {
      this.error(err as Error);
    }
  }

  public error(err: Error): void {
    if (this.state !== TokenParserState.ENDED) {
      this.state = TokenParserState.ERROR;
    }

    this.onError(err);
  }

  public end(): void {
    if (
      (this.state !== TokenParserState.VALUE &&
        this.state !== TokenParserState.SEPARATOR) ||
      this.stack.length > 0
    ) {
      this.error(
        new Error(
          `Parser ended in mid-parsing (state: ${TokenParserStateToString(
            this.state,
          )}). Either not all the data was received or the data was invalid.`,
        ),
      );
    } else {
      this.state = TokenParserState.ENDED;
      this.onEnd();
    }
  }

  /* eslint-disable-next-line @typescript-eslint/no-unused-vars */
  public onValue(parsedElementInfo: ParsedElementInfo): void {
    // Override me
    throw new TokenParserError(
      'Can\'t emit data before the "onValue" callback has been set up.',
    );
  }

  public onError(err: Error): void {
    // Override me
    throw err;
  }

  public onEnd(): void {
    // Override me
  }
}
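The `paths` option handled in the constructor and `shouldEmit` above supports a small selector language: `$` is the document root, `*` matches any key at one level, and `$*` (or an omitted path) matches everything. A minimal standalone sketch of that matching logic, with illustrative names that are not part of the library's public API:

```typescript
// Compile a "$.-style" selector into path parts, mirroring the validation
// in TokenParser's constructor. `undefined` means "match everything".
function compileSelector(path: string): string[] | undefined {
  if (path === "$*") return undefined;
  if (!path.startsWith("$")) throw new Error(`Invalid selector "${path}".`);
  const parts = path.split(".").slice(1);
  if (parts.includes("")) throw new Error(`".." syntax not supported.`);
  return parts;
}

// Match a compiled selector against the keys leading to the current value,
// as shouldEmit does against its stack: "*" matches any key at that depth.
function matches(
  selector: string[] | undefined,
  keys: (string | number | undefined)[],
): boolean {
  if (selector === undefined) return true;
  if (selector.length !== keys.length) return false;
  return selector.every((s, i) => s === "*" || s === keys[i]?.toString());
}

console.log(matches(compileSelector("$.a.*"), ["a", 0])); // true
console.log(matches(compileSelector("$.a.*"), ["b", 0])); // false
```

Depth must match exactly: `$.a.*` only selects values nested two levels deep under a top-level `a`.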
75
dev/env/node_modules/@streamparser/json/dist/deno/utils/bufferedString.ts
generated
vendored
Executable file
@@ -0,0 +1,75 @@
export interface StringBuilder {
  byteLength: number;
  appendChar: (char: number) => void;
  appendBuf: (buf: Uint8Array, start?: number, end?: number) => void;
  reset: () => void;
  toString: () => string;
}

export class NonBufferedString implements StringBuilder {
  private decoder = new TextDecoder("utf-8");
  private strings: string[] = [];
  public byteLength = 0;

  public appendChar(char: number): void {
    this.strings.push(String.fromCharCode(char));
    this.byteLength += 1;
  }

  public appendBuf(buf: Uint8Array, start = 0, end: number = buf.length): void {
    this.strings.push(this.decoder.decode(buf.subarray(start, end)));
    this.byteLength += end - start;
  }

  public reset(): void {
    this.strings = [];
    this.byteLength = 0;
  }

  public toString(): string {
    return this.strings.join("");
  }
}

export class BufferedString implements StringBuilder {
  private decoder = new TextDecoder("utf-8");
  private buffer: Uint8Array;
  private bufferOffset = 0;
  private string = "";
  public byteLength = 0;

  public constructor(bufferSize: number) {
    this.buffer = new Uint8Array(bufferSize);
  }

  public appendChar(char: number): void {
    if (this.bufferOffset >= this.buffer.length) this.flushStringBuffer();
    this.buffer[this.bufferOffset++] = char;
    this.byteLength += 1;
  }

  public appendBuf(buf: Uint8Array, start = 0, end: number = buf.length): void {
    const size = end - start;
    if (this.bufferOffset + size > this.buffer.length) this.flushStringBuffer();
    this.buffer.set(buf.subarray(start, end), this.bufferOffset);
    this.bufferOffset += size;
    this.byteLength += size;
  }

  private flushStringBuffer(): void {
    this.string += this.decoder.decode(
      this.buffer.subarray(0, this.bufferOffset),
    );
    this.bufferOffset = 0;
  }

  public reset(): void {
    this.string = "";
    this.bufferOffset = 0;
    this.byteLength = 0;
  }

  public toString(): string {
    this.flushStringBuffer();
    return this.string;
  }
}
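BufferedString above amortizes UTF-8 decoding: bytes accumulate in a fixed-size buffer and are decoded in one batch when the buffer fills or `toString()` is called, rather than once per byte. A minimal sketch of that flush pattern, assuming a global `TextDecoder` (present in modern Node and browsers); the class and names here are illustrative, not the library's:

```typescript
const decoder = new TextDecoder("utf-8");

class FlushingBuffer {
  private buf = new Uint8Array(4); // tiny buffer so the demo forces a flush
  private offset = 0;
  private out = "";

  append(byte: number): void {
    if (this.offset >= this.buf.length) this.flush(); // decode batch when full
    this.buf[this.offset++] = byte;
  }

  private flush(): void {
    this.out += decoder.decode(this.buf.subarray(0, this.offset));
    this.offset = 0;
  }

  toString(): string {
    this.flush(); // decode whatever remains buffered
    return this.out;
  }
}

const b = new FlushingBuffer();
for (const ch of "hello") b.append(ch.charCodeAt(0));
console.log(b.toString()); // "hello" (decoded in two batches: "hell" + "o")
```

The real class also tracks `byteLength` and supports bulk `appendBuf`, but the buffer-then-flush idea is the same.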
5
dev/env/node_modules/@streamparser/json/dist/deno/utils/types/jsonTypes.ts
generated
vendored
Executable file
@@ -0,0 +1,5 @@
export type JsonPrimitive = string | number | boolean | null;
export type JsonKey = string | number | undefined;
export type JsonObject = { [key: string]: JsonPrimitive | JsonStruct };
export type JsonArray = (JsonPrimitive | JsonStruct)[];
export type JsonStruct = JsonObject | JsonArray;
37
dev/env/node_modules/@streamparser/json/dist/deno/utils/types/parsedElementInfo.ts
generated
vendored
Executable file
@@ -0,0 +1,37 @@
import type { StackElement } from "./stackElement.ts";
import type {
  JsonPrimitive,
  JsonKey,
  JsonObject,
  JsonArray,
  JsonStruct,
} from "./jsonTypes.ts";

export interface ParsedElementInfo {
  value?: JsonPrimitive | JsonStruct;
  parent?: JsonStruct;
  key?: JsonKey;
  stack: StackElement[];
  partial?: boolean;
}

export interface ParsedArrayElement extends ParsedElementInfo {
  value: JsonPrimitive | JsonStruct;
  parent: JsonArray;
  key: number;
  stack: StackElement[];
}

export interface ParsedObjectProperty extends ParsedElementInfo {
  value: JsonPrimitive | JsonStruct;
  parent: JsonObject;
  key: string;
  stack: StackElement[];
}

export interface ParsedTopLevelElement extends ParsedElementInfo {
  value: JsonPrimitive | JsonStruct;
  parent: undefined;
  key: undefined;
  stack: [];
}
58
dev/env/node_modules/@streamparser/json/dist/deno/utils/types/parsedTokenInfo.ts
generated
vendored
Executable file
@@ -0,0 +1,58 @@
import TokenType from "./tokenType.ts";
import type { JsonPrimitive } from "./jsonTypes.ts";

export interface ParsedTokenInfo {
  token: TokenType;
  value: JsonPrimitive;
  offset: number;
  partial?: boolean;
}

export interface ParsedLeftBraceTokenInfo extends ParsedTokenInfo {
  token: TokenType.LEFT_BRACE;
  value: "{";
}
export interface ParsedRightBraceTokenInfo extends ParsedTokenInfo {
  token: TokenType.RIGHT_BRACE;
  value: "}";
}
export interface ParsedLeftBracketTokenInfo extends ParsedTokenInfo {
  token: TokenType.LEFT_BRACKET;
  value: "[";
}
export interface ParsedRighBracketTokenInfo extends ParsedTokenInfo {
  token: TokenType.RIGHT_BRACKET;
  value: "]";
}
export interface ParsedColonTokenInfo extends ParsedTokenInfo {
  token: TokenType.COLON;
  value: ":";
}
export interface ParsedCommaTokenInfo extends ParsedTokenInfo {
  token: TokenType.COMMA;
  value: ",";
}
export interface ParsedTrueTokenInfo extends ParsedTokenInfo {
  token: TokenType.TRUE;
  value: true;
}
export interface ParsedFalseTokenInfo extends ParsedTokenInfo {
  token: TokenType.FALSE;
  value: false;
}
export interface ParsedNullTokenInfo extends ParsedTokenInfo {
  token: TokenType.NULL;
  value: null;
}
export interface ParsedStringTokenInfo extends ParsedTokenInfo {
  token: TokenType.STRING;
  value: string;
}
export interface ParsedNumberTokenInfo extends ParsedTokenInfo {
  token: TokenType.NUMBER;
  value: number;
}
export interface ParsedSeparatorTokenInfo extends ParsedTokenInfo {
  token: TokenType.SEPARATOR;
  value: string;
}
13
dev/env/node_modules/@streamparser/json/dist/deno/utils/types/stackElement.ts
generated
vendored
Executable file
@@ -0,0 +1,13 @@
import type { JsonKey, JsonStruct } from "./jsonTypes.ts";

export const enum TokenParserMode {
  OBJECT,
  ARRAY,
}

export interface StackElement {
  key: JsonKey;
  value: JsonStruct;
  mode?: TokenParserMode;
  emit: boolean;
}
16
dev/env/node_modules/@streamparser/json/dist/deno/utils/types/tokenType.ts
generated
vendored
Executable file
@@ -0,0 +1,16 @@
enum TokenType {
  LEFT_BRACE,
  RIGHT_BRACE,
  LEFT_BRACKET,
  RIGHT_BRACKET,
  COLON,
  COMMA,
  TRUE,
  FALSE,
  NULL,
  STRING,
  NUMBER,
  SEPARATOR,
}

export default TokenType;
113
dev/env/node_modules/@streamparser/json/dist/deno/utils/utf-8.ts
generated
vendored
Executable file
@@ -0,0 +1,113 @@
export const enum charset {
  BACKSPACE = 0x8, // "\b"
  FORM_FEED = 0xc, // "\f"
  NEWLINE = 0xa, // "\n"
  CARRIAGE_RETURN = 0xd, // "\r"
  TAB = 0x9, // "\t"
  SPACE = 0x20, //
  EXCLAMATION_MARK = 0x21, // !
  QUOTATION_MARK = 0x22, // "
  NUMBER_SIGN = 0x23, // #
  DOLLAR_SIGN = 0x24, // $
  PERCENT_SIGN = 0x25, // %
  AMPERSAND = 0x26, // &
  APOSTROPHE = 0x27, // '
  LEFT_PARENTHESIS = 0x28, // (
  RIGHT_PARENTHESIS = 0x29, // )
  ASTERISK = 0x2a, // *
  PLUS_SIGN = 0x2b, // +
  COMMA = 0x2c, // ,
  HYPHEN_MINUS = 0x2d, // -
  FULL_STOP = 0x2e, // .
  SOLIDUS = 0x2f, // /
  DIGIT_ZERO = 0x30, // 0
  DIGIT_ONE = 0x31, // 1
  DIGIT_TWO = 0x32, // 2
  DIGIT_THREE = 0x33, // 3
  DIGIT_FOUR = 0x34, // 4
  DIGIT_FIVE = 0x35, // 5
  DIGIT_SIX = 0x36, // 6
  DIGIT_SEVEN = 0x37, // 7
  DIGIT_EIGHT = 0x38, // 8
  DIGIT_NINE = 0x39, // 9
  COLON = 0x3a, // :
  SEMICOLON = 0x3b, // ;
  LESS_THAN_SIGN = 0x3c, // <
  EQUALS_SIGN = 0x3d, // =
  GREATER_THAN_SIGN = 0x3e, // >
  QUESTION_MARK = 0x3f, // ?
  COMMERCIAL_AT = 0x40, // @
  LATIN_CAPITAL_LETTER_A = 0x41, // A
  LATIN_CAPITAL_LETTER_B = 0x42, // B
  LATIN_CAPITAL_LETTER_C = 0x43, // C
  LATIN_CAPITAL_LETTER_D = 0x44, // D
  LATIN_CAPITAL_LETTER_E = 0x45, // E
  LATIN_CAPITAL_LETTER_F = 0x46, // F
  LATIN_CAPITAL_LETTER_G = 0x47, // G
  LATIN_CAPITAL_LETTER_H = 0x48, // H
  LATIN_CAPITAL_LETTER_I = 0x49, // I
  LATIN_CAPITAL_LETTER_J = 0x4a, // J
  LATIN_CAPITAL_LETTER_K = 0x4b, // K
  LATIN_CAPITAL_LETTER_L = 0x4c, // L
  LATIN_CAPITAL_LETTER_M = 0x4d, // M
  LATIN_CAPITAL_LETTER_N = 0x4e, // N
  LATIN_CAPITAL_LETTER_O = 0x4f, // O
  LATIN_CAPITAL_LETTER_P = 0x50, // P
  LATIN_CAPITAL_LETTER_Q = 0x51, // Q
  LATIN_CAPITAL_LETTER_R = 0x52, // R
  LATIN_CAPITAL_LETTER_S = 0x53, // S
  LATIN_CAPITAL_LETTER_T = 0x54, // T
  LATIN_CAPITAL_LETTER_U = 0x55, // U
  LATIN_CAPITAL_LETTER_V = 0x56, // V
  LATIN_CAPITAL_LETTER_W = 0x57, // W
  LATIN_CAPITAL_LETTER_X = 0x58, // X
  LATIN_CAPITAL_LETTER_Y = 0x59, // Y
  LATIN_CAPITAL_LETTER_Z = 0x5a, // Z
  LEFT_SQUARE_BRACKET = 0x5b, // [
  REVERSE_SOLIDUS = 0x5c, // \
  RIGHT_SQUARE_BRACKET = 0x5d, // ]
  CIRCUMFLEX_ACCENT = 0x5e, // ^
  LOW_LINE = 0x5f, // _
  GRAVE_ACCENT = 0x60, // `
  LATIN_SMALL_LETTER_A = 0x61, // a
  LATIN_SMALL_LETTER_B = 0x62, // b
  LATIN_SMALL_LETTER_C = 0x63, // c
  LATIN_SMALL_LETTER_D = 0x64, // d
  LATIN_SMALL_LETTER_E = 0x65, // e
  LATIN_SMALL_LETTER_F = 0x66, // f
  LATIN_SMALL_LETTER_G = 0x67, // g
  LATIN_SMALL_LETTER_H = 0x68, // h
  LATIN_SMALL_LETTER_I = 0x69, // i
  LATIN_SMALL_LETTER_J = 0x6a, // j
  LATIN_SMALL_LETTER_K = 0x6b, // k
  LATIN_SMALL_LETTER_L = 0x6c, // l
  LATIN_SMALL_LETTER_M = 0x6d, // m
  LATIN_SMALL_LETTER_N = 0x6e, // n
  LATIN_SMALL_LETTER_O = 0x6f, // o
  LATIN_SMALL_LETTER_P = 0x70, // p
  LATIN_SMALL_LETTER_Q = 0x71, // q
  LATIN_SMALL_LETTER_R = 0x72, // r
  LATIN_SMALL_LETTER_S = 0x73, // s
  LATIN_SMALL_LETTER_T = 0x74, // t
  LATIN_SMALL_LETTER_U = 0x75, // u
  LATIN_SMALL_LETTER_V = 0x76, // v
  LATIN_SMALL_LETTER_W = 0x77, // w
  LATIN_SMALL_LETTER_X = 0x78, // x
  LATIN_SMALL_LETTER_Y = 0x79, // y
  LATIN_SMALL_LETTER_Z = 0x7a, // z
  LEFT_CURLY_BRACKET = 0x7b, // {
  VERTICAL_LINE = 0x7c, // |
  RIGHT_CURLY_BRACKET = 0x7d, // }
  TILDE = 0x7e, // ~
}

export const escapedSequences: { [key: number]: number } = {
  [charset.QUOTATION_MARK]: charset.QUOTATION_MARK,
  [charset.REVERSE_SOLIDUS]: charset.REVERSE_SOLIDUS,
  [charset.SOLIDUS]: charset.SOLIDUS,
  [charset.LATIN_SMALL_LETTER_B]: charset.BACKSPACE,
  [charset.LATIN_SMALL_LETTER_F]: charset.FORM_FEED,
  [charset.LATIN_SMALL_LETTER_N]: charset.NEWLINE,
  [charset.LATIN_SMALL_LETTER_R]: charset.CARRIAGE_RETURN,
  [charset.LATIN_SMALL_LETTER_T]: charset.TAB,
};
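The `escapedSequences` table above maps the byte that follows a backslash in a JSON string to the control character it denotes, so the tokenizer resolves escapes with a single lookup instead of a switch. A standalone sketch of that lookup (the byte values match the charset enum above; the variable names are illustrative):

```typescript
// Byte after "\" -> resolved character code, as in escapedSequences.
const escaped: Record<number, number> = {
  0x22: 0x22, // '"'  -> '"'
  0x6e: 0x0a, // "n"  -> "\n"
  0x72: 0x0d, // "r"  -> "\r"
  0x74: 0x09, // "t"  -> "\t"
};

function resolveEscape(byte: number): string {
  const code = escaped[byte];
  if (code === undefined) throw new Error(`Unknown escape 0x${byte.toString(16)}`);
  return String.fromCharCode(code);
}

console.log(JSON.stringify(resolveEscape("n".charCodeAt(0)))); // "\n"
```

Note that `\uXXXX` escapes are not in the table; the tokenizer handles those separately since they need four hex digits of lookahead.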
10
dev/env/node_modules/@streamparser/json/dist/mjs/index.d.ts
generated
vendored
Executable file
@@ -0,0 +1,10 @@
export { default as JSONParser, type JSONParserOptions } from "./jsonparser.js";
export { default as Tokenizer, type TokenizerOptions, TokenizerError, } from "./tokenizer.js";
export { default as TokenParser, type TokenParserOptions, TokenParserError, } from "./tokenparser.js";
export * as utf8 from "./utils/utf-8.js";
export * as JsonTypes from "./utils/types/jsonTypes.js";
export * as ParsedTokenInfo from "./utils/types/parsedTokenInfo.js";
export * as ParsedElementInfo from "./utils/types/parsedElementInfo.js";
export { TokenParserMode, type StackElement, } from "./utils/types/stackElement.js";
export { default as TokenType } from "./utils/types/tokenType.js";
//# sourceMappingURL=index.d.ts.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/index.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"index.d.ts","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,OAAO,IAAI,UAAU,EAAE,KAAK,iBAAiB,EAAE,MAAM,iBAAiB,CAAC;AAChF,OAAO,EACL,OAAO,IAAI,SAAS,EACpB,KAAK,gBAAgB,EACrB,cAAc,GACf,MAAM,gBAAgB,CAAC;AACxB,OAAO,EACL,OAAO,IAAI,WAAW,EACtB,KAAK,kBAAkB,EACvB,gBAAgB,GACjB,MAAM,kBAAkB,CAAC;AAE1B,OAAO,KAAK,IAAI,MAAM,kBAAkB,CAAC;AACzC,OAAO,KAAK,SAAS,MAAM,4BAA4B,CAAC;AACxD,OAAO,KAAK,eAAe,MAAM,kCAAkC,CAAC;AACpE,OAAO,KAAK,iBAAiB,MAAM,oCAAoC,CAAC;AACxE,OAAO,EACL,eAAe,EACf,KAAK,YAAY,GAClB,MAAM,+BAA+B,CAAC;AACvC,OAAO,EAAE,OAAO,IAAI,SAAS,EAAE,MAAM,4BAA4B,CAAC"}
10
dev/env/node_modules/@streamparser/json/dist/mjs/index.js
generated
vendored
Executable file
@@ -0,0 +1,10 @@
export { default as JSONParser } from "./jsonparser.js";
export { default as Tokenizer, TokenizerError, } from "./tokenizer.js";
export { default as TokenParser, TokenParserError, } from "./tokenparser.js";
export * as utf8 from "./utils/utf-8.js";
export * as JsonTypes from "./utils/types/jsonTypes.js";
export * as ParsedTokenInfo from "./utils/types/parsedTokenInfo.js";
export * as ParsedElementInfo from "./utils/types/parsedElementInfo.js";
export { TokenParserMode, } from "./utils/types/stackElement.js";
export { default as TokenType } from "./utils/types/tokenType.js";
//# sourceMappingURL=index.js.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/index.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,OAAO,IAAI,UAAU,EAA0B,MAAM,iBAAiB,CAAC;AAChF,OAAO,EACL,OAAO,IAAI,SAAS,EAEpB,cAAc,GACf,MAAM,gBAAgB,CAAC;AACxB,OAAO,EACL,OAAO,IAAI,WAAW,EAEtB,gBAAgB,GACjB,MAAM,kBAAkB,CAAC;AAE1B,OAAO,KAAK,IAAI,MAAM,kBAAkB,CAAC;AACzC,OAAO,KAAK,SAAS,MAAM,4BAA4B,CAAC;AACxD,OAAO,KAAK,eAAe,MAAM,kCAAkC,CAAC;AACpE,OAAO,KAAK,iBAAiB,MAAM,oCAAoC,CAAC;AACxE,OAAO,EACL,eAAe,GAEhB,MAAM,+BAA+B,CAAC;AACvC,OAAO,EAAE,OAAO,IAAI,SAAS,EAAE,MAAM,4BAA4B,CAAC"}
19
dev/env/node_modules/@streamparser/json/dist/mjs/jsonparser.d.ts
generated
vendored
Executable file
@@ -0,0 +1,19 @@
import { type TokenizerOptions } from "./tokenizer.js";
import { type TokenParserOptions } from "./tokenparser.js";
import type { ParsedElementInfo } from "./utils/types/parsedElementInfo.js";
import type { ParsedTokenInfo } from "./utils/types/parsedTokenInfo.js";
export interface JSONParserOptions extends TokenizerOptions, TokenParserOptions {
}
export default class JSONParser {
    private tokenizer;
    private tokenParser;
    constructor(opts?: JSONParserOptions);
    get isEnded(): boolean;
    write(input: Iterable<number> | string): void;
    end(): void;
    set onToken(cb: (parsedTokenInfo: ParsedTokenInfo) => void);
    set onValue(cb: (parsedElementInfo: ParsedElementInfo) => void);
    set onError(cb: (err: Error) => void);
    set onEnd(cb: () => void);
}
//# sourceMappingURL=jsonparser.d.ts.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/jsonparser.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"jsonparser.d.ts","sourceRoot":"","sources":["../../src/jsonparser.ts"],"names":[],"mappings":"AAAA,OAAkB,EAAE,KAAK,gBAAgB,EAAE,MAAM,gBAAgB,CAAC;AAClE,OAAoB,EAAE,KAAK,kBAAkB,EAAE,MAAM,kBAAkB,CAAC;AACxE,OAAO,KAAK,EAAE,iBAAiB,EAAE,MAAM,oCAAoC,CAAC;AAC5E,OAAO,KAAK,EAAE,eAAe,EAAE,MAAM,kCAAkC,CAAC;AAExE,MAAM,WAAW,iBACf,SAAQ,gBAAgB,EACtB,kBAAkB;CAAG;AAEzB,MAAM,CAAC,OAAO,OAAO,UAAU;IAC7B,OAAO,CAAC,SAAS,CAAY;IAC7B,OAAO,CAAC,WAAW,CAAc;gBAErB,IAAI,GAAE,iBAAsB;IAexC,IAAW,OAAO,IAAI,OAAO,CAE5B;IAEM,KAAK,CAAC,KAAK,EAAE,QAAQ,CAAC,MAAM,CAAC,GAAG,MAAM,GAAG,IAAI;IAI7C,GAAG,IAAI,IAAI;IAIlB,IAAW,OAAO,CAAC,EAAE,EAAE,CAAC,eAAe,EAAE,eAAe,KAAK,IAAI,EAKhE;IAED,IAAW,OAAO,CAAC,EAAE,EAAE,CAAC,iBAAiB,EAAE,iBAAiB,KAAK,IAAI,EAEpE;IAED,IAAW,OAAO,CAAC,EAAE,EAAE,CAAC,GAAG,EAAE,KAAK,KAAK,IAAI,EAE1C;IAED,IAAW,KAAK,CAAC,EAAE,EAAE,MAAM,IAAI,EAK9B;CACF"}
47
dev/env/node_modules/@streamparser/json/dist/mjs/jsonparser.js
generated
vendored
Executable file
@@ -0,0 +1,47 @@
import Tokenizer, {} from "./tokenizer.js";
import TokenParser, {} from "./tokenparser.js";
export default class JSONParser {
    constructor(opts = {}) {
        this.tokenizer = new Tokenizer(opts);
        this.tokenParser = new TokenParser(opts);
        this.tokenizer.onToken = this.tokenParser.write.bind(this.tokenParser);
        this.tokenizer.onEnd = () => {
            if (!this.tokenParser.isEnded)
                this.tokenParser.end();
        };
        this.tokenParser.onError = this.tokenizer.error.bind(this.tokenizer);
        this.tokenParser.onEnd = () => {
            if (!this.tokenizer.isEnded)
                this.tokenizer.end();
        };
    }
    get isEnded() {
        return this.tokenizer.isEnded && this.tokenParser.isEnded;
    }
    write(input) {
        this.tokenizer.write(input);
    }
    end() {
        this.tokenizer.end();
    }
    set onToken(cb) {
        this.tokenizer.onToken = (parsedToken) => {
            cb(parsedToken);
            this.tokenParser.write(parsedToken);
        };
    }
    set onValue(cb) {
        this.tokenParser.onValue = cb;
    }
    set onError(cb) {
        this.tokenizer.onError = cb;
    }
    set onEnd(cb) {
        this.tokenParser.onEnd = () => {
            if (!this.tokenizer.isEnded)
                this.tokenizer.end();
            cb.call(this.tokenParser);
        };
    }
}
//# sourceMappingURL=jsonparser.js.map
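JSONParser above is just wiring: the tokenizer's `onToken` feeds the token parser, and end-of-input and errors are forwarded in both directions by assigning callbacks. A self-contained sketch of that callback-wiring pattern, using stub classes in place of the real tokenizer and token parser (the stubs treat each character as a "token" purely for illustration):

```typescript
class StubTokenizer {
  public onToken: (t: string) => void = () => {};
  write(input: string): void {
    for (const ch of input) this.onToken(ch); // emit one "token" per char
  }
}

class StubTokenParser {
  public onValue: (v: string) => void = () => {};
  private seen = "";
  write(token: string): void {
    this.seen += token;
    this.onValue(this.seen); // emit the accumulated "value" so far
  }
}

const tokenizer = new StubTokenizer();
const tokenParser = new StubTokenParser();
// Same shape as the constructor above: tokenizer output drives parser input.
tokenizer.onToken = tokenParser.write.bind(tokenParser);

const values: string[] = [];
tokenParser.onValue = (v) => values.push(v);
tokenizer.write("ab");
console.log(values); // ["a", "ab"]
```

Binding `write` once keeps the hot path allocation-free; the real class only inserts a wrapper closure when the user installs an `onToken` tap of their own.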
1
dev/env/node_modules/@streamparser/json/dist/mjs/jsonparser.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"jsonparser.js","sourceRoot":"","sources":["../../src/jsonparser.ts"],"names":[],"mappings":"AAAA,OAAO,SAAS,EAAE,EAAyB,MAAM,gBAAgB,CAAC;AAClE,OAAO,WAAW,EAAE,EAA2B,MAAM,kBAAkB,CAAC;AAQxE,MAAM,CAAC,OAAO,OAAO,UAAU;IAI7B,YAAY,OAA0B,EAAE;QACtC,IAAI,CAAC,SAAS,GAAG,IAAI,SAAS,CAAC,IAAI,CAAC,CAAC;QACrC,IAAI,CAAC,WAAW,GAAG,IAAI,WAAW,CAAC,IAAI,CAAC,CAAC;QAEzC,IAAI,CAAC,SAAS,CAAC,OAAO,GAAG,IAAI,CAAC,WAAW,CAAC,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,WAAW,CAAC,CAAC;QACvE,IAAI,CAAC,SAAS,CAAC,KAAK,GAAG,GAAG,EAAE;YAC1B,IAAI,CAAC,IAAI,CAAC,WAAW,CAAC,OAAO;gBAAE,IAAI,CAAC,WAAW,CAAC,GAAG,EAAE,CAAC;QACxD,CAAC,CAAC;QAEF,IAAI,CAAC,WAAW,CAAC,OAAO,GAAG,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;QACrE,IAAI,CAAC,WAAW,CAAC,KAAK,GAAG,GAAG,EAAE;YAC5B,IAAI,CAAC,IAAI,CAAC,SAAS,CAAC,OAAO;gBAAE,IAAI,CAAC,SAAS,CAAC,GAAG,EAAE,CAAC;QACpD,CAAC,CAAC;IACJ,CAAC;IAED,IAAW,OAAO;QAChB,OAAO,IAAI,CAAC,SAAS,CAAC,OAAO,IAAI,IAAI,CAAC,WAAW,CAAC,OAAO,CAAC;IAC5D,CAAC;IAEM,KAAK,CAAC,KAAgC;QAC3C,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC;IAC9B,CAAC;IAEM,GAAG;QACR,IAAI,CAAC,SAAS,CAAC,GAAG,EAAE,CAAC;IACvB,CAAC;IAED,IAAW,OAAO,CAAC,EAA8C;QAC/D,IAAI,CAAC,SAAS,CAAC,OAAO,GAAG,CAAC,WAAW,EAAE,EAAE;YACvC,EAAE,CAAC,WAAW,CAAC,CAAC;YAChB,IAAI,CAAC,WAAW,CAAC,KAAK,CAAC,WAAW,CAAC,CAAC;QACtC,CAAC,CAAC;IACJ,CAAC;IAED,IAAW,OAAO,CAAC,EAAkD;QACnE,IAAI,CAAC,WAAW,CAAC,OAAO,GAAG,EAAE,CAAC;IAChC,CAAC;IAED,IAAW,OAAO,CAAC,EAAwB;QACzC,IAAI,CAAC,SAAS,CAAC,OAAO,GAAG,EAAE,CAAC;IAC9B,CAAC;IAED,IAAW,KAAK,CAAC,EAAc;QAC7B,IAAI,CAAC,WAAW,CAAC,KAAK,GAAG,GAAG,EAAE;YAC5B,IAAI,CAAC,IAAI,CAAC,SAAS,CAAC,OAAO;gBAAE,IAAI,CAAC,SAAS,CAAC,GAAG,EAAE,CAAC;YAClD,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,WAAW,CAAC,CAAC;QAC5B,CAAC,CAAC;IACJ,CAAC;CACF"}
40
dev/env/node_modules/@streamparser/json/dist/mjs/tokenizer.d.ts
generated
vendored
Executable file
@@ -0,0 +1,40 @@
import type { ParsedTokenInfo } from "./utils/types/parsedTokenInfo.js";
export interface TokenizerOptions {
    stringBufferSize?: number;
    numberBufferSize?: number;
    separator?: string;
    emitPartialTokens?: boolean;
}
export declare class TokenizerError extends Error {
    constructor(message: string);
}
export default class Tokenizer {
    private state;
    private bom?;
    private bomIndex;
    private emitPartialTokens;
    private separator?;
    private separatorBytes?;
    private separatorIndex;
    private escapedCharsByteLength;
    private bufferedString;
    private bufferedNumber;
    private unicode?;
    private highSurrogate?;
    private bytes_remaining;
    private bytes_in_sequence;
    private char_split_buffer;
    private encoder;
    private offset;
    constructor(opts?: TokenizerOptions);
    get isEnded(): boolean;
    write(input: Iterable<number> | string): void;
    private emitNumber;
    protected parseNumber(numberStr: string): number;
    error(err: Error): void;
    end(): void;
    onToken(parsedToken: ParsedTokenInfo): void;
    onError(err: Error): void;
    onEnd(): void;
}
//# sourceMappingURL=tokenizer.d.ts.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/tokenizer.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenizer.d.ts","sourceRoot":"","sources":["../../src/tokenizer.ts"],"names":[],"mappings":"AAOA,OAAO,KAAK,EAAE,eAAe,EAAE,MAAM,kCAAkC,CAAC;AAyExE,MAAM,WAAW,gBAAgB;IAC/B,gBAAgB,CAAC,EAAE,MAAM,CAAC;IAC1B,gBAAgB,CAAC,EAAE,MAAM,CAAC;IAC1B,SAAS,CAAC,EAAE,MAAM,CAAC;IACnB,iBAAiB,CAAC,EAAE,OAAO,CAAC;CAC7B;AASD,qBAAa,cAAe,SAAQ,KAAK;gBAC3B,OAAO,EAAE,MAAM;CAK5B;AAED,MAAM,CAAC,OAAO,OAAO,SAAS;IAC5B,OAAO,CAAC,KAAK,CAAgC;IAE7C,OAAO,CAAC,GAAG,CAAC,CAAW;IACvB,OAAO,CAAC,QAAQ,CAAK;IAErB,OAAO,CAAC,iBAAiB,CAAU;IACnC,OAAO,CAAC,SAAS,CAAC,CAAS;IAC3B,OAAO,CAAC,cAAc,CAAC,CAAa;IACpC,OAAO,CAAC,cAAc,CAAK;IAC3B,OAAO,CAAC,sBAAsB,CAAK;IACnC,OAAO,CAAC,cAAc,CAAgB;IACtC,OAAO,CAAC,cAAc,CAAgB;IAEtC,OAAO,CAAC,OAAO,CAAC,CAAS;IACzB,OAAO,CAAC,aAAa,CAAC,CAAS;IAC/B,OAAO,CAAC,eAAe,CAAK;IAC5B,OAAO,CAAC,iBAAiB,CAAK;IAC9B,OAAO,CAAC,iBAAiB,CAAqB;IAC9C,OAAO,CAAC,OAAO,CAAqB;IACpC,OAAO,CAAC,MAAM,CAAM;gBAER,IAAI,CAAC,EAAE,gBAAgB;IAmBnC,IAAW,OAAO,IAAI,OAAO,CAE5B;IAEM,KAAK,CAAC,KAAK,EAAE,QAAQ,CAAC,MAAM,CAAC,GAAG,MAAM,GAAG,IAAI;IA8nBpD,OAAO,CAAC,UAAU;IASlB,SAAS,CAAC,WAAW,CAAC,SAAS,EAAE,MAAM,GAAG,MAAM;IAIzC,KAAK,CAAC,GAAG,EAAE,KAAK,GAAG,IAAI;IAQvB,GAAG,IAAI,IAAI;IA6BX,OAAO,CAAC,WAAW,EAAE,eAAe,GAAG,IAAI;IAO3C,OAAO,CAAC,GAAG,EAAE,KAAK,GAAG,IAAI;IAKzB,KAAK,IAAI,IAAI;CAGrB"}
734
dev/env/node_modules/@streamparser/json/dist/mjs/tokenizer.js
generated
vendored
Executable file
@@ -0,0 +1,734 @@
import { charset, escapedSequences } from "./utils/utf-8.js";
import { NonBufferedString, BufferedString, } from "./utils/bufferedString.js";
import TokenType from "./utils/types/tokenType.js";
// Tokenizer States
var TokenizerStates;
(function (TokenizerStates) {
    TokenizerStates[TokenizerStates["START"] = 0] = "START";
    TokenizerStates[TokenizerStates["ENDED"] = 1] = "ENDED";
    TokenizerStates[TokenizerStates["ERROR"] = 2] = "ERROR";
    TokenizerStates[TokenizerStates["TRUE1"] = 3] = "TRUE1";
    TokenizerStates[TokenizerStates["TRUE2"] = 4] = "TRUE2";
    TokenizerStates[TokenizerStates["TRUE3"] = 5] = "TRUE3";
    TokenizerStates[TokenizerStates["FALSE1"] = 6] = "FALSE1";
    TokenizerStates[TokenizerStates["FALSE2"] = 7] = "FALSE2";
    TokenizerStates[TokenizerStates["FALSE3"] = 8] = "FALSE3";
    TokenizerStates[TokenizerStates["FALSE4"] = 9] = "FALSE4";
    TokenizerStates[TokenizerStates["NULL1"] = 10] = "NULL1";
    TokenizerStates[TokenizerStates["NULL2"] = 11] = "NULL2";
    TokenizerStates[TokenizerStates["NULL3"] = 12] = "NULL3";
    TokenizerStates[TokenizerStates["STRING_DEFAULT"] = 13] = "STRING_DEFAULT";
    TokenizerStates[TokenizerStates["STRING_AFTER_BACKSLASH"] = 14] = "STRING_AFTER_BACKSLASH";
    TokenizerStates[TokenizerStates["STRING_UNICODE_DIGIT_1"] = 15] = "STRING_UNICODE_DIGIT_1";
    TokenizerStates[TokenizerStates["STRING_UNICODE_DIGIT_2"] = 16] = "STRING_UNICODE_DIGIT_2";
    TokenizerStates[TokenizerStates["STRING_UNICODE_DIGIT_3"] = 17] = "STRING_UNICODE_DIGIT_3";
    TokenizerStates[TokenizerStates["STRING_UNICODE_DIGIT_4"] = 18] = "STRING_UNICODE_DIGIT_4";
    TokenizerStates[TokenizerStates["STRING_INCOMPLETE_CHAR"] = 19] = "STRING_INCOMPLETE_CHAR";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_INITIAL_MINUS"] = 20] = "NUMBER_AFTER_INITIAL_MINUS";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_INITIAL_ZERO"] = 21] = "NUMBER_AFTER_INITIAL_ZERO";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_INITIAL_NON_ZERO"] = 22] = "NUMBER_AFTER_INITIAL_NON_ZERO";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_FULL_STOP"] = 23] = "NUMBER_AFTER_FULL_STOP";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_DECIMAL"] = 24] = "NUMBER_AFTER_DECIMAL";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_E"] = 25] = "NUMBER_AFTER_E";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_E_AND_SIGN"] = 26] = "NUMBER_AFTER_E_AND_SIGN";
    TokenizerStates[TokenizerStates["NUMBER_AFTER_E_AND_DIGIT"] = 27] = "NUMBER_AFTER_E_AND_DIGIT";
    TokenizerStates[TokenizerStates["SEPARATOR"] = 28] = "SEPARATOR";
    TokenizerStates[TokenizerStates["BOM_OR_START"] = 29] = "BOM_OR_START";
    TokenizerStates[TokenizerStates["BOM"] = 30] = "BOM";
})(TokenizerStates || (TokenizerStates = {}));
function TokenizerStateToString(tokenizerState) {
    return [
        "START",
        "ENDED",
        "ERROR",
        "TRUE1",
        "TRUE2",
        "TRUE3",
        "FALSE1",
        "FALSE2",
        "FALSE3",
        "FALSE4",
        "NULL1",
        "NULL2",
        "NULL3",
        "STRING_DEFAULT",
        "STRING_AFTER_BACKSLASH",
        "STRING_UNICODE_DIGIT_1",
        "STRING_UNICODE_DIGIT_2",
        "STRING_UNICODE_DIGIT_3",
        "STRING_UNICODE_DIGIT_4",
        "STRING_INCOMPLETE_CHAR",
        "NUMBER_AFTER_INITIAL_MINUS",
        "NUMBER_AFTER_INITIAL_ZERO",
        "NUMBER_AFTER_INITIAL_NON_ZERO",
        "NUMBER_AFTER_FULL_STOP",
        "NUMBER_AFTER_DECIMAL",
        "NUMBER_AFTER_E",
        "NUMBER_AFTER_E_AND_SIGN",
        "NUMBER_AFTER_E_AND_DIGIT",
        "SEPARATOR",
        "BOM_OR_START",
        "BOM",
    ][tokenizerState];
}
const defaultOpts = {
    stringBufferSize: 0,
    numberBufferSize: 0,
    separator: undefined,
    emitPartialTokens: false,
};
export class TokenizerError extends Error {
    constructor(message) {
        super(message);
        // Typescript is broken. This is a workaround
        Object.setPrototypeOf(this, TokenizerError.prototype);
    }
}
export default class Tokenizer {
    constructor(opts) {
        this.state = TokenizerStates.BOM_OR_START;
        this.bomIndex = 0;
        this.separatorIndex = 0;
        this.escapedCharsByteLength = 0;
        this.bytes_remaining = 0; // number of bytes remaining in multi byte utf8 char to read after split boundary
        this.bytes_in_sequence = 0; // bytes in multi byte utf8 char to read
        this.char_split_buffer = new Uint8Array(4); // for rebuilding chars split before boundary is reached
        this.encoder = new TextEncoder();
        this.offset = -1;
        opts = Object.assign(Object.assign({}, defaultOpts), opts);
        this.emitPartialTokens = opts.emitPartialTokens === true;
        this.bufferedString =
            opts.stringBufferSize && opts.stringBufferSize > 4
                ? new BufferedString(opts.stringBufferSize)
                : new NonBufferedString();
        this.bufferedNumber =
            opts.numberBufferSize && opts.numberBufferSize > 0
                ? new BufferedString(opts.numberBufferSize)
                : new NonBufferedString();
        this.separator = opts.separator;
        this.separatorBytes = opts.separator
            ? this.encoder.encode(opts.separator)
            : undefined;
    }
    get isEnded() {
        return this.state === TokenizerStates.ENDED;
    }
    write(input) {
        try {
            let buffer;
            if (input instanceof Uint8Array) {
                buffer = input;
            }
            else if (typeof input === "string") {
                buffer = this.encoder.encode(input);
            }
            else if (Array.isArray(input)) {
                buffer = Uint8Array.from(input);
            }
            else if (ArrayBuffer.isView(input)) {
                buffer = new Uint8Array(input.buffer, input.byteOffset, input.byteLength);
            }
            else {
                throw new TypeError("Unexpected type. The `write` function only accepts Arrays, TypedArrays and Strings.");
            }
            for (let i = 0; i < buffer.length; i += 1) {
                const n = buffer[i]; // get current byte from buffer
                switch (this.state) {
                    // @ts-expect-error fall through case
                    case TokenizerStates.BOM_OR_START:
                        if (input instanceof Uint8Array && n === 0xef) {
                            this.bom = [0xef, 0xbb, 0xbf];
                            this.bomIndex += 1;
                            this.state = TokenizerStates.BOM;
                            continue;
                        }
                        if (input instanceof Uint16Array) {
                            if (n === 0xfe) {
                                this.bom = [0xfe, 0xff];
                                this.bomIndex += 1;
                                this.state = TokenizerStates.BOM;
                                continue;
                            }
                            if (n === 0xff) {
                                this.bom = [0xff, 0xfe];
                                this.bomIndex += 1;
                                this.state = TokenizerStates.BOM;
                                continue;
                            }
                        }
                        if (input instanceof Uint32Array) {
                            if (n === 0x00) {
                                this.bom = [0x00, 0x00, 0xfe, 0xff];
                                this.bomIndex += 1;
                                this.state = TokenizerStates.BOM;
                                continue;
                            }
                            if (n === 0xff) {
                                this.bom = [0xff, 0xfe, 0x00, 0x00];
                                this.bomIndex += 1;
                                this.state = TokenizerStates.BOM;
                                continue;
                            }
                        }
                    // eslint-disable-next-line no-fallthrough
                    case TokenizerStates.START:
                        this.offset += 1;
                        if (this.separatorBytes && n === this.separatorBytes[0]) {
                            if (this.separatorBytes.length === 1) {
                                this.state = TokenizerStates.START;
                                this.onToken({
                                    token: TokenType.SEPARATOR,
                                    value: this.separator,
                                    offset: this.offset + this.separatorBytes.length - 1,
                                });
                                continue;
                            }
                            this.state = TokenizerStates.SEPARATOR;
                            continue;
                        }
                        if (n === charset.SPACE ||
                            n === charset.NEWLINE ||
                            n === charset.CARRIAGE_RETURN ||
                            n === charset.TAB) {
                            // whitespace
                            continue;
                        }
                        if (n === charset.LEFT_CURLY_BRACKET) {
                            this.onToken({
                                token: TokenType.LEFT_BRACE,
                                value: "{",
                                offset: this.offset,
                            });
                            continue;
                        }
                        if (n === charset.RIGHT_CURLY_BRACKET) {
                            this.onToken({
                                token: TokenType.RIGHT_BRACE,
                                value: "}",
                                offset: this.offset,
                            });
                            continue;
                        }
                        if (n === charset.LEFT_SQUARE_BRACKET) {
                            this.onToken({
                                token: TokenType.LEFT_BRACKET,
                                value: "[",
                                offset: this.offset,
                            });
                            continue;
                        }
                        if (n === charset.RIGHT_SQUARE_BRACKET) {
                            this.onToken({
                                token: TokenType.RIGHT_BRACKET,
                                value: "]",
                                offset: this.offset,
                            });
                            continue;
                        }
                        if (n === charset.COLON) {
                            this.onToken({
                                token: TokenType.COLON,
                                value: ":",
                                offset: this.offset,
                            });
                            continue;
                        }
                        if (n === charset.COMMA) {
                            this.onToken({
                                token: TokenType.COMMA,
                                value: ",",
                                offset: this.offset,
                            });
                            continue;
                        }
                        if (n === charset.LATIN_SMALL_LETTER_T) {
                            this.state = TokenizerStates.TRUE1;
                            continue;
                        }
                        if (n === charset.LATIN_SMALL_LETTER_F) {
                            this.state = TokenizerStates.FALSE1;
                            continue;
                        }
                        if (n === charset.LATIN_SMALL_LETTER_N) {
                            this.state = TokenizerStates.NULL1;
                            continue;
                        }
                        if (n === charset.QUOTATION_MARK) {
                            this.bufferedString.reset();
                            this.escapedCharsByteLength = 0;
                            this.state = TokenizerStates.STRING_DEFAULT;
                            continue;
                        }
                        if (n >= charset.DIGIT_ONE && n <= charset.DIGIT_NINE) {
                            this.bufferedNumber.reset();
                            this.bufferedNumber.appendChar(n);
                            this.state = TokenizerStates.NUMBER_AFTER_INITIAL_NON_ZERO;
                            continue;
                        }
                        if (n === charset.DIGIT_ZERO) {
                            this.bufferedNumber.reset();
                            this.bufferedNumber.appendChar(n);
                            this.state = TokenizerStates.NUMBER_AFTER_INITIAL_ZERO;
                            continue;
                        }
                        if (n === charset.HYPHEN_MINUS) {
                            this.bufferedNumber.reset();
                            this.bufferedNumber.appendChar(n);
                            this.state = TokenizerStates.NUMBER_AFTER_INITIAL_MINUS;
                            continue;
                        }
                        break;
                    // STRING
                    case TokenizerStates.STRING_DEFAULT:
                        if (n === charset.QUOTATION_MARK) {
                            const string = this.bufferedString.toString();
                            this.state = TokenizerStates.START;
                            this.onToken({
                                token: TokenType.STRING,
                                value: string,
                                offset: this.offset,
                            });
                            this.offset +=
                                this.escapedCharsByteLength +
                                    this.bufferedString.byteLength +
                                    1;
                            continue;
                        }
                        if (n === charset.REVERSE_SOLIDUS) {
                            this.state = TokenizerStates.STRING_AFTER_BACKSLASH;
                            continue;
                        }
                        if (n >= 128) {
                            // Parse multi byte (>=128) chars one at a time
                            if (n >= 194 && n <= 223) {
                                this.bytes_in_sequence = 2;
                            }
                            else if (n <= 239) {
                                this.bytes_in_sequence = 3;
                            }
                            else {
                                this.bytes_in_sequence = 4;
                            }
                            if (this.bytes_in_sequence <= buffer.length - i) {
                                // if bytes needed to complete char fall outside buffer length, we have a boundary split
                                this.bufferedString.appendBuf(buffer, i, i + this.bytes_in_sequence);
                                i += this.bytes_in_sequence - 1;
                                continue;
                            }
                            this.bytes_remaining = i + this.bytes_in_sequence - buffer.length;
                            this.char_split_buffer.set(buffer.subarray(i));
                            i = buffer.length - 1;
                            this.state = TokenizerStates.STRING_INCOMPLETE_CHAR;
                            continue;
                        }
                        if (n >= charset.SPACE) {
                            this.bufferedString.appendChar(n);
                            continue;
                        }
                        break;
                    case TokenizerStates.STRING_INCOMPLETE_CHAR:
                        // check for carry over of a multi byte char split between data chunks
                        // & fill temp buffer it with start of this data chunk up to the boundary limit set in the last iteration
                        this.char_split_buffer.set(buffer.subarray(i, i + this.bytes_remaining), this.bytes_in_sequence - this.bytes_remaining);
                        this.bufferedString.appendBuf(this.char_split_buffer, 0, this.bytes_in_sequence);
                        i = this.bytes_remaining - 1;
                        this.state = TokenizerStates.STRING_DEFAULT;
                        continue;
                    case TokenizerStates.STRING_AFTER_BACKSLASH:
                        // eslint-disable-next-line no-case-declarations
                        const controlChar = escapedSequences[n];
                        if (controlChar) {
                            this.bufferedString.appendChar(controlChar);
                            this.escapedCharsByteLength += 1; // len(\")=2 minus the fact you're appending len(controlChar)=1
                            this.state = TokenizerStates.STRING_DEFAULT;
                            continue;
                        }
                        if (n === charset.LATIN_SMALL_LETTER_U) {
                            this.unicode = "";
                            this.state = TokenizerStates.STRING_UNICODE_DIGIT_1;
                            continue;
                        }
                        break;
                    case TokenizerStates.STRING_UNICODE_DIGIT_1:
                    case TokenizerStates.STRING_UNICODE_DIGIT_2:
                    case TokenizerStates.STRING_UNICODE_DIGIT_3:
                        if ((n >= charset.DIGIT_ZERO && n <= charset.DIGIT_NINE) ||
                            (n >= charset.LATIN_CAPITAL_LETTER_A &&
                                n <= charset.LATIN_CAPITAL_LETTER_F) ||
                            (n >= charset.LATIN_SMALL_LETTER_A &&
                                n <= charset.LATIN_SMALL_LETTER_F)) {
                            this.unicode += String.fromCharCode(n);
                            this.state += 1;
                            continue;
                        }
                        break;
                    case TokenizerStates.STRING_UNICODE_DIGIT_4:
                        if ((n >= charset.DIGIT_ZERO && n <= charset.DIGIT_NINE) ||
                            (n >= charset.LATIN_CAPITAL_LETTER_A &&
                                n <= charset.LATIN_CAPITAL_LETTER_F) ||
                            (n >= charset.LATIN_SMALL_LETTER_A &&
                                n <= charset.LATIN_SMALL_LETTER_F)) {
                            const intVal = parseInt(this.unicode + String.fromCharCode(n), 16);
                            let unicodeString;
                            if (this.highSurrogate === undefined) {
                                if (intVal >= 0xd800 && intVal <= 0xdbff) {
                                    //<55296,56319> - highSurrogate
                                    this.highSurrogate = intVal;
                                    this.state = TokenizerStates.STRING_DEFAULT;
                                    continue;
                                }
                                else {
                                    unicodeString = String.fromCharCode(intVal);
                                }
                            }
                            else {
                                if (intVal >= 0xdc00 && intVal <= 0xdfff) {
                                    //<56320,57343> - lowSurrogate
                                    unicodeString = String.fromCharCode(this.highSurrogate, intVal);
                                }
                                else {
                                    unicodeString = String.fromCharCode(this.highSurrogate);
                                }
                                this.highSurrogate = undefined;
                            }
                            const unicodeBuffer = this.encoder.encode(unicodeString);
                            this.bufferedString.appendBuf(unicodeBuffer);
                            // len(\u0000)=6 minus the fact you're appending len(buf)
                            this.escapedCharsByteLength += 6 - unicodeBuffer.byteLength;
                            this.state = TokenizerStates.STRING_DEFAULT;
                            continue;
                        }
                        break;
                    // Number
                    case TokenizerStates.NUMBER_AFTER_INITIAL_MINUS:
                        if (n === charset.DIGIT_ZERO) {
                            this.bufferedNumber.appendChar(n);
                            this.state = TokenizerStates.NUMBER_AFTER_INITIAL_ZERO;
                            continue;
                        }
                        if (n >= charset.DIGIT_ONE && n <= charset.DIGIT_NINE) {
                            this.bufferedNumber.appendChar(n);
                            this.state = TokenizerStates.NUMBER_AFTER_INITIAL_NON_ZERO;
                            continue;
                        }
                        break;
                    case TokenizerStates.NUMBER_AFTER_INITIAL_ZERO:
                        if (n === charset.FULL_STOP) {
                            this.bufferedNumber.appendChar(n);
                            this.state = TokenizerStates.NUMBER_AFTER_FULL_STOP;
                            continue;
                        }
                        if (n === charset.LATIN_SMALL_LETTER_E ||
                            n === charset.LATIN_CAPITAL_LETTER_E) {
                            this.bufferedNumber.appendChar(n);
                            this.state = TokenizerStates.NUMBER_AFTER_E;
                            continue;
                        }
                        i -= 1;
                        this.state = TokenizerStates.START;
                        this.emitNumber();
                        continue;
                    case TokenizerStates.NUMBER_AFTER_INITIAL_NON_ZERO:
                        if (n >= charset.DIGIT_ZERO && n <= charset.DIGIT_NINE) {
                            this.bufferedNumber.appendChar(n);
                            continue;
                        }
                        if (n === charset.FULL_STOP) {
                            this.bufferedNumber.appendChar(n);
                            this.state = TokenizerStates.NUMBER_AFTER_FULL_STOP;
                            continue;
                        }
                        if (n === charset.LATIN_SMALL_LETTER_E ||
                            n === charset.LATIN_CAPITAL_LETTER_E) {
                            this.bufferedNumber.appendChar(n);
                            this.state = TokenizerStates.NUMBER_AFTER_E;
                            continue;
                        }
                        i -= 1;
                        this.state = TokenizerStates.START;
                        this.emitNumber();
                        continue;
                    case TokenizerStates.NUMBER_AFTER_FULL_STOP:
                        if (n >= charset.DIGIT_ZERO && n <= charset.DIGIT_NINE) {
                            this.bufferedNumber.appendChar(n);
                            this.state = TokenizerStates.NUMBER_AFTER_DECIMAL;
                            continue;
                        }
                        break;
                    case TokenizerStates.NUMBER_AFTER_DECIMAL:
                        if (n >= charset.DIGIT_ZERO && n <= charset.DIGIT_NINE) {
                            this.bufferedNumber.appendChar(n);
                            continue;
                        }
                        if (n === charset.LATIN_SMALL_LETTER_E ||
                            n === charset.LATIN_CAPITAL_LETTER_E) {
                            this.bufferedNumber.appendChar(n);
                            this.state = TokenizerStates.NUMBER_AFTER_E;
                            continue;
                        }
                        i -= 1;
                        this.state = TokenizerStates.START;
                        this.emitNumber();
                        continue;
                    // @ts-expect-error fall through case
                    case TokenizerStates.NUMBER_AFTER_E:
                        if (n === charset.PLUS_SIGN || n === charset.HYPHEN_MINUS) {
                            this.bufferedNumber.appendChar(n);
                            this.state = TokenizerStates.NUMBER_AFTER_E_AND_SIGN;
                            continue;
                        }
                    // eslint-disable-next-line no-fallthrough
                    case TokenizerStates.NUMBER_AFTER_E_AND_SIGN:
                        if (n >= charset.DIGIT_ZERO && n <= charset.DIGIT_NINE) {
                            this.bufferedNumber.appendChar(n);
                            this.state = TokenizerStates.NUMBER_AFTER_E_AND_DIGIT;
                            continue;
                        }
                        break;
                    case TokenizerStates.NUMBER_AFTER_E_AND_DIGIT:
                        if (n >= charset.DIGIT_ZERO && n <= charset.DIGIT_NINE) {
                            this.bufferedNumber.appendChar(n);
                            continue;
                        }
                        i -= 1;
                        this.state = TokenizerStates.START;
                        this.emitNumber();
                        continue;
                    // TRUE
                    case TokenizerStates.TRUE1:
                        if (n === charset.LATIN_SMALL_LETTER_R) {
                            this.state = TokenizerStates.TRUE2;
                            continue;
                        }
                        break;
                    case TokenizerStates.TRUE2:
                        if (n === charset.LATIN_SMALL_LETTER_U) {
                            this.state = TokenizerStates.TRUE3;
                            continue;
                        }
                        break;
                    case TokenizerStates.TRUE3:
                        if (n === charset.LATIN_SMALL_LETTER_E) {
                            this.state = TokenizerStates.START;
                            this.onToken({
                                token: TokenType.TRUE,
                                value: true,
                                offset: this.offset,
                            });
                            this.offset += 3;
                            continue;
                        }
                        break;
                    // FALSE
                    case TokenizerStates.FALSE1:
                        if (n === charset.LATIN_SMALL_LETTER_A) {
                            this.state = TokenizerStates.FALSE2;
                            continue;
                        }
                        break;
                    case TokenizerStates.FALSE2:
                        if (n === charset.LATIN_SMALL_LETTER_L) {
                            this.state = TokenizerStates.FALSE3;
                            continue;
                        }
                        break;
                    case TokenizerStates.FALSE3:
                        if (n === charset.LATIN_SMALL_LETTER_S) {
                            this.state = TokenizerStates.FALSE4;
                            continue;
                        }
                        break;
                    case TokenizerStates.FALSE4:
                        if (n === charset.LATIN_SMALL_LETTER_E) {
                            this.state = TokenizerStates.START;
                            this.onToken({
                                token: TokenType.FALSE,
                                value: false,
                                offset: this.offset,
                            });
                            this.offset += 4;
                            continue;
                        }
                        break;
                    // NULL
                    case TokenizerStates.NULL1:
                        if (n === charset.LATIN_SMALL_LETTER_U) {
                            this.state = TokenizerStates.NULL2;
                            continue;
                        }
                        break;
                    case TokenizerStates.NULL2:
                        if (n === charset.LATIN_SMALL_LETTER_L) {
                            this.state = TokenizerStates.NULL3;
                            continue;
                        }
                        break;
                    case TokenizerStates.NULL3:
                        if (n === charset.LATIN_SMALL_LETTER_L) {
                            this.state = TokenizerStates.START;
                            this.onToken({
                                token: TokenType.NULL,
                                value: null,
                                offset: this.offset,
                            });
                            this.offset += 3;
                            continue;
                        }
                        break;
                    case TokenizerStates.SEPARATOR:
                        this.separatorIndex += 1;
                        if (!this.separatorBytes ||
                            n !== this.separatorBytes[this.separatorIndex]) {
                            break;
                        }
                        if (this.separatorIndex === this.separatorBytes.length - 1) {
                            this.state = TokenizerStates.START;
                            this.onToken({
                                token: TokenType.SEPARATOR,
                                value: this.separator,
                                offset: this.offset + this.separatorIndex,
                            });
                            this.separatorIndex = 0;
                        }
                        continue;
                    // BOM support
                    case TokenizerStates.BOM:
                        if (n === this.bom[this.bomIndex]) {
                            if (this.bomIndex === this.bom.length - 1) {
                                this.state = TokenizerStates.START;
                                this.bom = undefined;
                                this.bomIndex = 0;
                                continue;
                            }
                            this.bomIndex += 1;
                            continue;
                        }
                        break;
                    case TokenizerStates.ENDED:
                        if (n === charset.SPACE ||
                            n === charset.NEWLINE ||
                            n === charset.CARRIAGE_RETURN ||
                            n === charset.TAB) {
                            // whitespace
                            continue;
                        }
                }
                throw new TokenizerError(`Unexpected "${String.fromCharCode(n)}" at position "${i}" in state ${TokenizerStateToString(this.state)}`);
            }
            if (this.emitPartialTokens) {
                switch (this.state) {
                    case TokenizerStates.TRUE1:
                    case TokenizerStates.TRUE2:
                    case TokenizerStates.TRUE3:
                        this.onToken({
                            token: TokenType.TRUE,
                            value: true,
                            offset: this.offset,
                            partial: true,
                        });
                        break;
                    case TokenizerStates.FALSE1:
                    case TokenizerStates.FALSE2:
                    case TokenizerStates.FALSE3:
                    case TokenizerStates.FALSE4:
                        this.onToken({
                            token: TokenType.FALSE,
                            value: false,
                            offset: this.offset,
                            partial: true,
                        });
                        break;
                    case TokenizerStates.NULL1:
                    case TokenizerStates.NULL2:
                    case TokenizerStates.NULL3:
                        this.onToken({
                            token: TokenType.NULL,
                            value: null,
                            offset: this.offset,
                            partial: true,
                        });
                        break;
                    case TokenizerStates.STRING_DEFAULT: {
                        const string = this.bufferedString.toString();
                        this.onToken({
                            token: TokenType.STRING,
                            value: string,
                            offset: this.offset,
                            partial: true,
                        });
                        break;
                    }
                    case TokenizerStates.NUMBER_AFTER_INITIAL_ZERO:
                    case TokenizerStates.NUMBER_AFTER_INITIAL_NON_ZERO:
                    case TokenizerStates.NUMBER_AFTER_DECIMAL:
                    case TokenizerStates.NUMBER_AFTER_E_AND_DIGIT:
                        try {
                            this.onToken({
                                token: TokenType.NUMBER,
                                value: this.parseNumber(this.bufferedNumber.toString()),
                                offset: this.offset,
                                partial: true,
                            });
                        }
                        catch (_a) {
                            // Number couldn't be parsed. Do nothing.
                        }
                }
            }
        }
        catch (err) {
            this.error(err);
        }
    }
    emitNumber() {
        this.onToken({
            token: TokenType.NUMBER,
            value: this.parseNumber(this.bufferedNumber.toString()),
            offset: this.offset,
        });
        this.offset += this.bufferedNumber.byteLength - 1;
    }
    parseNumber(numberStr) {
        return Number(numberStr);
    }
    error(err) {
        if (this.state !== TokenizerStates.ENDED) {
            this.state = TokenizerStates.ERROR;
        }
        this.onError(err);
    }
    end() {
        switch (this.state) {
            case TokenizerStates.NUMBER_AFTER_INITIAL_ZERO:
            case TokenizerStates.NUMBER_AFTER_INITIAL_NON_ZERO:
            case TokenizerStates.NUMBER_AFTER_DECIMAL:
            case TokenizerStates.NUMBER_AFTER_E_AND_DIGIT:
                this.state = TokenizerStates.ENDED;
                this.emitNumber();
                this.onEnd();
                break;
            case TokenizerStates.BOM_OR_START:
            case TokenizerStates.START:
            case TokenizerStates.ERROR:
            case TokenizerStates.SEPARATOR:
                this.state = TokenizerStates.ENDED;
                this.onEnd();
                break;
            default:
                this.error(new TokenizerError(`Tokenizer ended in the middle of a token (state: ${TokenizerStateToString(this.state)}). Either not all the data was received or the data was invalid.`));
        }
    }
    // eslint-disable-next-line @typescript-eslint/no-unused-vars
    onToken(parsedToken) {
        // Override me
        throw new TokenizerError('Can\'t emit tokens before the "onToken" callback has been set up.');
    }
    onError(err) {
        // Override me
        throw err;
    }
    onEnd() {
        // Override me
    }
}
//# sourceMappingURL=tokenizer.js.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/tokenizer.js.map
generated
vendored
Executable file
File diff suppressed because one or more lines are too long
35
dev/env/node_modules/@streamparser/json/dist/mjs/tokenparser.d.ts
generated
vendored
Executable file
@@ -0,0 +1,35 @@
import type { ParsedTokenInfo } from "./utils/types/parsedTokenInfo.js";
import type { ParsedElementInfo } from "./utils/types/parsedElementInfo.js";
export interface TokenParserOptions {
    paths?: string[];
    keepStack?: boolean;
    separator?: string;
    emitPartialValues?: boolean;
}
export declare class TokenParserError extends Error {
    constructor(message: string);
}
export default class TokenParser {
    private readonly paths?;
    private readonly keepStack;
    private readonly separator?;
    private state;
    private mode;
    private key;
    private value;
    private stack;
    constructor(opts?: TokenParserOptions);
    private shouldEmit;
    private push;
    private pop;
    private emit;
    private emitPartial;
    get isEnded(): boolean;
    write({ token, value, partial, }: Omit<ParsedTokenInfo, "offset">): void;
    error(err: Error): void;
    end(): void;
    onValue(parsedElementInfo: ParsedElementInfo): void;
    onError(err: Error): void;
    onEnd(): void;
}
//# sourceMappingURL=tokenparser.d.ts.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/tokenparser.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenparser.d.ts","sourceRoot":"","sources":["../../src/tokenparser.ts"],"names":[],"mappings":"AAaA,OAAO,KAAK,EAAE,eAAe,EAAE,MAAM,kCAAkC,CAAC;AACxE,OAAO,KAAK,EAAE,iBAAiB,EAAE,MAAM,oCAAoC,CAAC;AAmB5E,MAAM,WAAW,kBAAkB;IACjC,KAAK,CAAC,EAAE,MAAM,EAAE,CAAC;IACjB,SAAS,CAAC,EAAE,OAAO,CAAC;IACpB,SAAS,CAAC,EAAE,MAAM,CAAC;IACnB,iBAAiB,CAAC,EAAE,OAAO,CAAC;CAC7B;AASD,qBAAa,gBAAiB,SAAQ,KAAK;gBAC7B,OAAO,EAAE,MAAM;CAK5B;AAED,MAAM,CAAC,OAAO,OAAO,WAAW;IAC9B,OAAO,CAAC,QAAQ,CAAC,KAAK,CAAC,CAA2B;IAClD,OAAO,CAAC,QAAQ,CAAC,SAAS,CAAU;IACpC,OAAO,CAAC,QAAQ,CAAC,SAAS,CAAC,CAAS;IACpC,OAAO,CAAC,KAAK,CAA4C;IACzD,OAAO,CAAC,IAAI,CAA0C;IACtD,OAAO,CAAC,GAAG,CAAsB;IACjC,OAAO,CAAC,KAAK,CAAqC;IAClD,OAAO,CAAC,KAAK,CAAsB;gBAEvB,IAAI,CAAC,EAAE,kBAAkB;IA2BrC,OAAO,CAAC,UAAU;IAoBlB,OAAO,CAAC,IAAI;IASZ,OAAO,CAAC,GAAG;IAiBX,OAAO,CAAC,IAAI;IA6BZ,OAAO,CAAC,WAAW;IAuBnB,IAAW,OAAO,IAAI,OAAO,CAE5B;IAEM,KAAK,CAAC,EACX,KAAK,EACL,KAAK,EACL,OAAO,GACR,EAAE,IAAI,CAAC,eAAe,EAAE,QAAQ,CAAC,GAAG,IAAI;IA8JlC,KAAK,CAAC,GAAG,EAAE,KAAK,GAAG,IAAI;IAQvB,GAAG,IAAI,IAAI;IAoBX,OAAO,CAAC,iBAAiB,EAAE,iBAAiB,GAAG,IAAI;IAOnD,OAAO,CAAC,GAAG,EAAE,KAAK,GAAG,IAAI;IAKzB,KAAK,IAAI,IAAI;CAGrB"}
312
dev/env/node_modules/@streamparser/json/dist/mjs/tokenparser.js
generated
vendored
Executable file
@@ -0,0 +1,312 @@
import { charset } from "./utils/utf-8.js";
import TokenType from "./utils/types/tokenType.js";
import { TokenParserMode, } from "./utils/types/stackElement.js";
// Parser States
var TokenParserState;
(function (TokenParserState) {
    TokenParserState[TokenParserState["VALUE"] = 0] = "VALUE";
    TokenParserState[TokenParserState["KEY"] = 1] = "KEY";
    TokenParserState[TokenParserState["COLON"] = 2] = "COLON";
    TokenParserState[TokenParserState["COMMA"] = 3] = "COMMA";
    TokenParserState[TokenParserState["ENDED"] = 4] = "ENDED";
    TokenParserState[TokenParserState["ERROR"] = 5] = "ERROR";
    TokenParserState[TokenParserState["SEPARATOR"] = 6] = "SEPARATOR";
})(TokenParserState || (TokenParserState = {}));
function TokenParserStateToString(state) {
    return ["VALUE", "KEY", "COLON", "COMMA", "ENDED", "ERROR", "SEPARATOR"][state];
}
const defaultOpts = {
    paths: undefined,
    keepStack: true,
    separator: undefined,
    emitPartialValues: false,
};
export class TokenParserError extends Error {
    constructor(message) {
        super(message);
        // Typescript is broken. This is a workaround
        Object.setPrototypeOf(this, TokenParserError.prototype);
    }
}
export default class TokenParser {
    constructor(opts) {
        this.state = TokenParserState.VALUE;
        this.mode = undefined;
        this.key = undefined;
        this.value = undefined;
        this.stack = [];
        opts = Object.assign(Object.assign({}, defaultOpts), opts);
        if (opts.paths) {
            this.paths = opts.paths.map((path) => {
                if (path === undefined || path === "$*")
                    return undefined;
                if (!path.startsWith("$"))
                    throw new TokenParserError(`Invalid selector "${path}". Should start with "$".`);
                const pathParts = path.split(".").slice(1);
                if (pathParts.includes(""))
                    throw new TokenParserError(`Invalid selector "${path}". ".." syntax not supported.`);
                return pathParts;
            });
        }
        this.keepStack = opts.keepStack || false;
        this.separator = opts.separator;
        if (!opts.emitPartialValues) {
            this.emitPartial = () => { };
        }
    }
    shouldEmit() {
        if (!this.paths)
            return true;
        return this.paths.some((path) => {
            var _a;
            if (path === undefined)
                return true;
            if (path.length !== this.stack.length)
                return false;
            for (let i = 0; i < path.length - 1; i++) {
                const selector = path[i];
                const key = this.stack[i + 1].key;
                if (selector === "*")
                    continue;
                if (selector !== (key === null || key === void 0 ? void 0 : key.toString()))
                    return false;
            }
            const selector = path[path.length - 1];
            if (selector === "*")
                return true;
            return selector === ((_a = this.key) === null || _a === void 0 ? void 0 : _a.toString());
        });
    }
    push() {
        this.stack.push({
            key: this.key,
            value: this.value,
            mode: this.mode,
            emit: this.shouldEmit(),
        });
    }
    pop() {
        const value = this.value;
        let emit;
        ({
            key: this.key,
            value: this.value,
            mode: this.mode,
            emit,
        } = this.stack.pop());
        this.state =
            this.mode !== undefined ? TokenParserState.COMMA : TokenParserState.VALUE;
        this.emit(value, emit);
    }
    emit(value, emit) {
        if (!this.keepStack &&
            this.value &&
            this.stack.every((item) => !item.emit)) {
            // eslint-disable-next-line @typescript-eslint/no-explicit-any
            delete this.value[this.key];
        }
        if (emit) {
            this.onValue({
                value: value,
                key: this.key,
                parent: this.value,
                stack: this.stack,
            });
        }
        if (this.stack.length === 0) {
            if (this.separator) {
                this.state = TokenParserState.SEPARATOR;
            }
            else if (this.separator === undefined) {
                this.end();
            }
            // else if separator === '', expect next JSON object.
        }
    }
    emitPartial(value) {
        if (!this.shouldEmit())
            return;
        if (this.state === TokenParserState.KEY) {
            this.onValue({
                value: undefined,
                key: value,
                parent: this.value,
                stack: this.stack,
                partial: true,
            });
            return;
        }
        this.onValue({
            value: value,
            key: this.key,
            parent: this.value,
            stack: this.stack,
            partial: true,
        });
    }
    get isEnded() {
        return this.state === TokenParserState.ENDED;
    }
    write({ token, value, partial, }) {
        try {
            if (partial) {
                this.emitPartial(value);
                return;
            }
            if (this.state === TokenParserState.VALUE) {
                if (token === TokenType.STRING ||
                    token === TokenType.NUMBER ||
                    token === TokenType.TRUE ||
                    token === TokenType.FALSE ||
                    token === TokenType.NULL) {
                    if (this.mode === TokenParserMode.OBJECT) {
                        this.value[this.key] = value;
                        this.state = TokenParserState.COMMA;
                    }
                    else if (this.mode === TokenParserMode.ARRAY) {
                        this.value.push(value);
                        this.state = TokenParserState.COMMA;
                    }
                    this.emit(value, this.shouldEmit());
                    return;
                }
                if (token === TokenType.LEFT_BRACE) {
                    this.push();
                    if (this.mode === TokenParserMode.OBJECT) {
                        this.value = this.value[this.key] = {};
                    }
                    else if (this.mode === TokenParserMode.ARRAY) {
                        const val = {};
                        this.value.push(val);
                        this.value = val;
                    }
                    else {
                        this.value = {};
                    }
                    this.mode = TokenParserMode.OBJECT;
                    this.state = TokenParserState.KEY;
                    this.key = undefined;
                    this.emitPartial();
                    return;
                }
                if (token === TokenType.LEFT_BRACKET) {
                    this.push();
                    if (this.mode === TokenParserMode.OBJECT) {
                        this.value = this.value[this.key] = [];
                    }
                    else if (this.mode === TokenParserMode.ARRAY) {
                        const val = [];
                        this.value.push(val);
                        this.value = val;
                    }
                    else {
                        this.value = [];
                    }
                    this.mode = TokenParserMode.ARRAY;
                    this.state = TokenParserState.VALUE;
                    this.key = 0;
                    this.emitPartial();
                    return;
                }
                if (this.mode === TokenParserMode.ARRAY &&
                    token === TokenType.RIGHT_BRACKET &&
                    this.value.length === 0) {
                    this.pop();
                    return;
                }
            }
            if (this.state === TokenParserState.KEY) {
                if (token === TokenType.STRING) {
                    this.key = value;
                    this.state = TokenParserState.COLON;
                    this.emitPartial();
                    return;
                }
                if (token === TokenType.RIGHT_BRACE &&
                    Object.keys(this.value).length === 0) {
                    this.pop();
                    return;
                }
            }
            if (this.state === TokenParserState.COLON) {
                if (token === TokenType.COLON) {
                    this.state = TokenParserState.VALUE;
                    return;
                }
            }
            if (this.state === TokenParserState.COMMA) {
                if (token === TokenType.COMMA) {
                    if (this.mode === TokenParserMode.ARRAY) {
                        this.state = TokenParserState.VALUE;
                        this.key += 1;
                        return;
                    }
                    /* istanbul ignore else */
                    if (this.mode === TokenParserMode.OBJECT) {
                        this.state = TokenParserState.KEY;
                        return;
                    }
                }
                if ((token === TokenType.RIGHT_BRACE &&
                    this.mode === TokenParserMode.OBJECT) ||
                    (token === TokenType.RIGHT_BRACKET &&
                        this.mode === TokenParserMode.ARRAY)) {
                    this.pop();
                    return;
                }
            }
            if (this.state === TokenParserState.SEPARATOR) {
                if (token === TokenType.SEPARATOR && value === this.separator) {
                    this.state = TokenParserState.VALUE;
                    return;
                }
            }
            // Edge case in which the separator is just whitespace and it's found in the middle of the JSON
            if (token === TokenType.SEPARATOR &&
                this.state !== TokenParserState.SEPARATOR &&
                Array.from(value)
                    .map((n) => n.charCodeAt(0))
                    .every((n) => n === charset.SPACE ||
                    n === charset.NEWLINE ||
                    n === charset.CARRIAGE_RETURN ||
                    n === charset.TAB)) {
                // whitespace
                return;
            }
            throw new TokenParserError(`Unexpected ${TokenType[token]} (${JSON.stringify(value)}) in state ${TokenParserStateToString(this.state)}`);
        }
        catch (err) {
            this.error(err);
        }
    }
    error(err) {
        if (this.state !== TokenParserState.ENDED) {
            this.state = TokenParserState.ERROR;
        }
        this.onError(err);
    }
    end() {
        if ((this.state !== TokenParserState.VALUE &&
            this.state !== TokenParserState.SEPARATOR) ||
            this.stack.length > 0) {
            this.error(new Error(`Parser ended in mid-parsing (state: ${TokenParserStateToString(this.state)}). Either not all the data was received or the data was invalid.`));
        }
        else {
            this.state = TokenParserState.ENDED;
            this.onEnd();
        }
    }
    /* eslint-disable-next-line @typescript-eslint/no-unused-vars */
    onValue(parsedElementInfo) {
        // Override me
        throw new TokenParserError('Can\'t emit data before the "onValue" callback has been set up.');
    }
    onError(err) {
        // Override me
        throw err;
    }
    onEnd() {
        // Override me
    }
}
//# sourceMappingURL=tokenparser.js.map
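The `shouldEmit` method in the listing above matches the parser's current key stack against the configured JSONPath-like selectors (`"$*"`, `"$.a.*"`, etc.). A minimal self-contained sketch of that matching rule follows; `matchPath` and the plain-array inputs are illustrative names for this example, not part of the library's API:

```javascript
// Sketch of the selector matching shouldEmit performs: `path` is a parsed
// selector such as ["a", "*"] (from "$.a.*"), and `keys` is the chain of
// keys leading to the current value, e.g. ["a", "0"]. An undefined path
// (from "$*") matches everything; "*" matches any single key.
function matchPath(path, keys) {
  if (path === undefined) return true;
  if (path.length !== keys.length) return false;
  return path.every((selector, i) => selector === "*" || selector === keys[i]);
}

console.log(matchPath(["a", "*"], ["a", "0"])); // true
console.log(matchPath(["a", "b"], ["a", "c"])); // false
console.log(matchPath(undefined, ["x"]));       // true
```

The real implementation compares against `stack[i + 1].key` and stringifies numeric array indices, but the wildcard logic is the same.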
1
dev/env/node_modules/@streamparser/json/dist/mjs/tokenparser.js.map
generated
vendored
Executable file
File diff suppressed because one or more lines are too long
30
dev/env/node_modules/@streamparser/json/dist/mjs/utils/bufferedString.d.ts
generated
vendored
Executable file
@@ -0,0 +1,30 @@
export interface StringBuilder {
    byteLength: number;
    appendChar: (char: number) => void;
    appendBuf: (buf: Uint8Array, start?: number, end?: number) => void;
    reset: () => void;
    toString: () => string;
}
export declare class NonBufferedString implements StringBuilder {
    private decoder;
    private strings;
    byteLength: number;
    appendChar(char: number): void;
    appendBuf(buf: Uint8Array, start?: number, end?: number): void;
    reset(): void;
    toString(): string;
}
export declare class BufferedString implements StringBuilder {
    private decoder;
    private buffer;
    private bufferOffset;
    private string;
    byteLength: number;
    constructor(bufferSize: number);
    appendChar(char: number): void;
    appendBuf(buf: Uint8Array, start?: number, end?: number): void;
    private flushStringBuffer;
    reset(): void;
    toString(): string;
}
//# sourceMappingURL=bufferedString.d.ts.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/utils/bufferedString.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"bufferedString.d.ts","sourceRoot":"","sources":["../../../src/utils/bufferedString.ts"],"names":[],"mappings":"AAAA,MAAM,WAAW,aAAa;IAC5B,UAAU,EAAE,MAAM,CAAC;IACnB,UAAU,EAAE,CAAC,IAAI,EAAE,MAAM,KAAK,IAAI,CAAC;IACnC,SAAS,EAAE,CAAC,GAAG,EAAE,UAAU,EAAE,KAAK,CAAC,EAAE,MAAM,EAAE,GAAG,CAAC,EAAE,MAAM,KAAK,IAAI,CAAC;IACnE,KAAK,EAAE,MAAM,IAAI,CAAC;IAClB,QAAQ,EAAE,MAAM,MAAM,CAAC;CACxB;AAED,qBAAa,iBAAkB,YAAW,aAAa;IACrD,OAAO,CAAC,OAAO,CAA4B;IAC3C,OAAO,CAAC,OAAO,CAAgB;IACxB,UAAU,SAAK;IAEf,UAAU,CAAC,IAAI,EAAE,MAAM,GAAG,IAAI;IAK9B,SAAS,CAAC,GAAG,EAAE,UAAU,EAAE,KAAK,SAAI,EAAE,GAAG,GAAE,MAAmB,GAAG,IAAI;IAKrE,KAAK,IAAI,IAAI;IAKb,QAAQ,IAAI,MAAM;CAG1B;AAED,qBAAa,cAAe,YAAW,aAAa;IAClD,OAAO,CAAC,OAAO,CAA4B;IAC3C,OAAO,CAAC,MAAM,CAAa;IAC3B,OAAO,CAAC,YAAY,CAAK;IACzB,OAAO,CAAC,MAAM,CAAM;IACb,UAAU,SAAK;gBAEH,UAAU,EAAE,MAAM;IAI9B,UAAU,CAAC,IAAI,EAAE,MAAM,GAAG,IAAI;IAM9B,SAAS,CAAC,GAAG,EAAE,UAAU,EAAE,KAAK,SAAI,EAAE,GAAG,GAAE,MAAmB,GAAG,IAAI;IAQ5E,OAAO,CAAC,iBAAiB;IAOlB,KAAK,IAAI,IAAI;IAKb,QAAQ,IAAI,MAAM;CAI1B"}
59
dev/env/node_modules/@streamparser/json/dist/mjs/utils/bufferedString.js
generated
vendored
Executable file
@@ -0,0 +1,59 @@
export class NonBufferedString {
    constructor() {
        this.decoder = new TextDecoder("utf-8");
        this.strings = [];
        this.byteLength = 0;
    }
    appendChar(char) {
        this.strings.push(String.fromCharCode(char));
        this.byteLength += 1;
    }
    appendBuf(buf, start = 0, end = buf.length) {
        this.strings.push(this.decoder.decode(buf.subarray(start, end)));
        this.byteLength += end - start;
    }
    reset() {
        this.strings = [];
        this.byteLength = 0;
    }
    toString() {
        return this.strings.join("");
    }
}
export class BufferedString {
    constructor(bufferSize) {
        this.decoder = new TextDecoder("utf-8");
        this.bufferOffset = 0;
        this.string = "";
        this.byteLength = 0;
        this.buffer = new Uint8Array(bufferSize);
    }
    appendChar(char) {
        if (this.bufferOffset >= this.buffer.length)
            this.flushStringBuffer();
        this.buffer[this.bufferOffset++] = char;
        this.byteLength += 1;
    }
    appendBuf(buf, start = 0, end = buf.length) {
        const size = end - start;
        if (this.bufferOffset + size > this.buffer.length)
            this.flushStringBuffer();
        this.buffer.set(buf.subarray(start, end), this.bufferOffset);
        this.bufferOffset += size;
        this.byteLength += size;
    }
    flushStringBuffer() {
        this.string += this.decoder.decode(this.buffer.subarray(0, this.bufferOffset));
        this.bufferOffset = 0;
    }
    reset() {
        this.string = "";
        this.bufferOffset = 0;
        this.byteLength = 0;
    }
    toString() {
        this.flushStringBuffer();
        return this.string;
    }
}
//# sourceMappingURL=bufferedString.js.map
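`BufferedString` above batches raw UTF-8 bytes into a fixed-size `Uint8Array` and only decodes to a string when the buffer fills or the result is requested, avoiding one decode call per fragment. A condensed, self-contained sketch of the same buffer-and-flush pattern (the `ByteBatcher` name is illustrative, not from the library):

```javascript
// Condensed sketch of the buffering pattern BufferedString uses: bytes
// accumulate in a fixed Uint8Array; decoding happens only on flush.
class ByteBatcher {
  constructor(size) {
    this.decoder = new TextDecoder("utf-8");
    this.buffer = new Uint8Array(size);
    this.offset = 0;
    this.out = "";
  }
  append(bytes) {
    // Flush first if this chunk would overflow the fixed buffer.
    if (this.offset + bytes.length > this.buffer.length) this.flush();
    this.buffer.set(bytes, this.offset);
    this.offset += bytes.length;
  }
  flush() {
    this.out += this.decoder.decode(this.buffer.subarray(0, this.offset));
    this.offset = 0;
  }
  toString() {
    this.flush();
    return this.out;
  }
}

const b = new ByteBatcher(4);
b.append(new TextEncoder().encode("he"));
b.append(new TextEncoder().encode("llo")); // overflows, triggers a flush first
console.log(b.toString()); // "hello"
```

Like the original, this sketch assumes no single chunk exceeds the buffer size; the library pairs it with `NonBufferedString` for the unbuffered case.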
1
dev/env/node_modules/@streamparser/json/dist/mjs/utils/bufferedString.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"bufferedString.js","sourceRoot":"","sources":["../../../src/utils/bufferedString.ts"],"names":[],"mappings":"AAQA,MAAM,OAAO,iBAAiB;IAA9B;QACU,YAAO,GAAG,IAAI,WAAW,CAAC,OAAO,CAAC,CAAC;QACnC,YAAO,GAAa,EAAE,CAAC;QACxB,eAAU,GAAG,CAAC,CAAC;IAoBxB,CAAC;IAlBQ,UAAU,CAAC,IAAY;QAC5B,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,MAAM,CAAC,YAAY,CAAC,IAAI,CAAC,CAAC,CAAC;QAC7C,IAAI,CAAC,UAAU,IAAI,CAAC,CAAC;IACvB,CAAC;IAEM,SAAS,CAAC,GAAe,EAAE,KAAK,GAAG,CAAC,EAAE,MAAc,GAAG,CAAC,MAAM;QACnE,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,MAAM,CAAC,GAAG,CAAC,QAAQ,CAAC,KAAK,EAAE,GAAG,CAAC,CAAC,CAAC,CAAC;QACjE,IAAI,CAAC,UAAU,IAAI,GAAG,GAAG,KAAK,CAAC;IACjC,CAAC;IAEM,KAAK;QACV,IAAI,CAAC,OAAO,GAAG,EAAE,CAAC;QAClB,IAAI,CAAC,UAAU,GAAG,CAAC,CAAC;IACtB,CAAC;IAEM,QAAQ;QACb,OAAO,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC;IAC/B,CAAC;CACF;AAED,MAAM,OAAO,cAAc;IAOzB,YAAmB,UAAkB;QAN7B,YAAO,GAAG,IAAI,WAAW,CAAC,OAAO,CAAC,CAAC;QAEnC,iBAAY,GAAG,CAAC,CAAC;QACjB,WAAM,GAAG,EAAE,CAAC;QACb,eAAU,GAAG,CAAC,CAAC;QAGpB,IAAI,CAAC,MAAM,GAAG,IAAI,UAAU,CAAC,UAAU,CAAC,CAAC;IAC3C,CAAC;IAEM,UAAU,CAAC,IAAY;QAC5B,IAAI,IAAI,CAAC,YAAY,IAAI,IAAI,CAAC,MAAM,CAAC,MAAM;YAAE,IAAI,CAAC,iBAAiB,EAAE,CAAC;QACtE,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,YAAY,EAAE,CAAC,GAAG,IAAI,CAAC;QACxC,IAAI,CAAC,UAAU,IAAI,CAAC,CAAC;IACvB,CAAC;IAEM,SAAS,CAAC,GAAe,EAAE,KAAK,GAAG,CAAC,EAAE,MAAc,GAAG,CAAC,MAAM;QACnE,MAAM,IAAI,GAAG,GAAG,GAAG,KAAK,CAAC;QACzB,IAAI,IAAI,CAAC,YAAY,GAAG,IAAI,GAAG,IAAI,CAAC,MAAM,CAAC,MAAM;YAAE,IAAI,CAAC,iBAAiB,EAAE,CAAC;QAC5E,IAAI,CAAC,MAAM,CAAC,GAAG,CAAC,GAAG,CAAC,QAAQ,CAAC,KAAK,EAAE,GAAG,CAAC,EAAE,IAAI,CAAC,YAAY,CAAC,CAAC;QAC7D,IAAI,CAAC,YAAY,IAAI,IAAI,CAAC;QAC1B,IAAI,CAAC,UAAU,IAAI,IAAI,CAAC;IAC1B,CAAC;IAEO,iBAAiB;QACvB,IAAI,CAAC,MAAM,IAAI,IAAI,CAAC,OAAO,CAAC,MAAM,CAChC,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAC,EAAE,IAAI,CAAC,YAAY,CAAC,CAC3C,CAAC;QACF,IAAI,CAAC,YAAY,GAAG,CAAC,CAAC;IACxB,CAAC;IAEM,KAAK;QACV,IAAI,CAAC,MAAM,GAAG,EAAE,CAAC;QACjB,IAAI,CAAC,YAAY,GAAG,CAAC,CAAC;QACtB,IAAI,CAAC,UAAU,GAAG,CAAC,CAAC;IACtB,CAAC;IACM,QAAQ;QACb,IAAI,CAAC,iBAAiB,EAAE,CAAC;QACzB,OAAO,IAAI,CAAC,MAAM,CAAC;IACrB,CAAC;CACF"}
8
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/jsonTypes.d.ts
generated
vendored
Executable file
@@ -0,0 +1,8 @@
export type JsonPrimitive = string | number | boolean | null;
export type JsonKey = string | number | undefined;
export type JsonObject = {
    [key: string]: JsonPrimitive | JsonStruct;
};
export type JsonArray = (JsonPrimitive | JsonStruct)[];
export type JsonStruct = JsonObject | JsonArray;
//# sourceMappingURL=jsonTypes.d.ts.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/jsonTypes.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"jsonTypes.d.ts","sourceRoot":"","sources":["../../../../src/utils/types/jsonTypes.ts"],"names":[],"mappings":"AAAA,MAAM,MAAM,aAAa,GAAG,MAAM,GAAG,MAAM,GAAG,OAAO,GAAG,IAAI,CAAC;AAC7D,MAAM,MAAM,OAAO,GAAG,MAAM,GAAG,MAAM,GAAG,SAAS,CAAC;AAClD,MAAM,MAAM,UAAU,GAAG;IAAE,CAAC,GAAG,EAAE,MAAM,GAAG,aAAa,GAAG,UAAU,CAAA;CAAE,CAAC;AACvE,MAAM,MAAM,SAAS,GAAG,CAAC,aAAa,GAAG,UAAU,CAAC,EAAE,CAAC;AACvD,MAAM,MAAM,UAAU,GAAG,UAAU,GAAG,SAAS,CAAC"}
2
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/jsonTypes.js
generated
vendored
Executable file
@@ -0,0 +1,2 @@
export {};
//# sourceMappingURL=jsonTypes.js.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/jsonTypes.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"jsonTypes.js","sourceRoot":"","sources":["../../../../src/utils/types/jsonTypes.ts"],"names":[],"mappings":""}
28
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/parsedElementInfo.d.ts
generated
vendored
Executable file
@@ -0,0 +1,28 @@
import type { StackElement } from "./stackElement.js";
import type { JsonPrimitive, JsonKey, JsonObject, JsonArray, JsonStruct } from "./jsonTypes.js";
export interface ParsedElementInfo {
    value?: JsonPrimitive | JsonStruct;
    parent?: JsonStruct;
    key?: JsonKey;
    stack: StackElement[];
    partial?: boolean;
}
export interface ParsedArrayElement extends ParsedElementInfo {
    value: JsonPrimitive | JsonStruct;
    parent: JsonArray;
    key: number;
    stack: StackElement[];
}
export interface ParsedObjectProperty extends ParsedElementInfo {
    value: JsonPrimitive | JsonStruct;
    parent: JsonObject;
    key: string;
    stack: StackElement[];
}
export interface ParsedTopLevelElement extends ParsedElementInfo {
    value: JsonPrimitive | JsonStruct;
    parent: undefined;
    key: undefined;
    stack: [];
}
//# sourceMappingURL=parsedElementInfo.d.ts.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/parsedElementInfo.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"parsedElementInfo.d.ts","sourceRoot":"","sources":["../../../../src/utils/types/parsedElementInfo.ts"],"names":[],"mappings":"AAAA,OAAO,KAAK,EAAE,YAAY,EAAE,MAAM,mBAAmB,CAAC;AACtD,OAAO,KAAK,EACV,aAAa,EACb,OAAO,EACP,UAAU,EACV,SAAS,EACT,UAAU,EACX,MAAM,gBAAgB,CAAC;AAExB,MAAM,WAAW,iBAAiB;IAChC,KAAK,CAAC,EAAE,aAAa,GAAG,UAAU,CAAC;IACnC,MAAM,CAAC,EAAE,UAAU,CAAC;IACpB,GAAG,CAAC,EAAE,OAAO,CAAC;IACd,KAAK,EAAE,YAAY,EAAE,CAAC;IACtB,OAAO,CAAC,EAAE,OAAO,CAAC;CACnB;AAED,MAAM,WAAW,kBAAmB,SAAQ,iBAAiB;IAC3D,KAAK,EAAE,aAAa,GAAG,UAAU,CAAC;IAClC,MAAM,EAAE,SAAS,CAAC;IAClB,GAAG,EAAE,MAAM,CAAC;IACZ,KAAK,EAAE,YAAY,EAAE,CAAC;CACvB;AAED,MAAM,WAAW,oBAAqB,SAAQ,iBAAiB;IAC7D,KAAK,EAAE,aAAa,GAAG,UAAU,CAAC;IAClC,MAAM,EAAE,UAAU,CAAC;IACnB,GAAG,EAAE,MAAM,CAAC;IACZ,KAAK,EAAE,YAAY,EAAE,CAAC;CACvB;AAED,MAAM,WAAW,qBAAsB,SAAQ,iBAAiB;IAC9D,KAAK,EAAE,aAAa,GAAG,UAAU,CAAC;IAClC,MAAM,EAAE,SAAS,CAAC;IAClB,GAAG,EAAE,SAAS,CAAC;IACf,KAAK,EAAE,EAAE,CAAC;CACX"}
2
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/parsedElementInfo.js
generated
vendored
Executable file
@@ -0,0 +1,2 @@
export {};
//# sourceMappingURL=parsedElementInfo.js.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/parsedElementInfo.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"parsedElementInfo.js","sourceRoot":"","sources":["../../../../src/utils/types/parsedElementInfo.ts"],"names":[],"mappings":""}
57
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/parsedTokenInfo.d.ts
generated
vendored
Executable file
@@ -0,0 +1,57 @@
import TokenType from "./tokenType.js";
import type { JsonPrimitive } from "./jsonTypes.js";
export interface ParsedTokenInfo {
    token: TokenType;
    value: JsonPrimitive;
    offset: number;
    partial?: boolean;
}
export interface ParsedLeftBraceTokenInfo extends ParsedTokenInfo {
    token: TokenType.LEFT_BRACE;
    value: "{";
}
export interface ParsedRightBraceTokenInfo extends ParsedTokenInfo {
    token: TokenType.RIGHT_BRACE;
    value: "}";
}
export interface ParsedLeftBracketTokenInfo extends ParsedTokenInfo {
    token: TokenType.LEFT_BRACKET;
    value: "[";
}
export interface ParsedRighBracketTokenInfo extends ParsedTokenInfo {
    token: TokenType.RIGHT_BRACKET;
    value: "]";
}
export interface ParsedColonTokenInfo extends ParsedTokenInfo {
    token: TokenType.COLON;
    value: ":";
}
export interface ParsedCommaTokenInfo extends ParsedTokenInfo {
    token: TokenType.COMMA;
    value: ",";
}
export interface ParsedTrueTokenInfo extends ParsedTokenInfo {
    token: TokenType.TRUE;
    value: true;
}
export interface ParsedFalseTokenInfo extends ParsedTokenInfo {
    token: TokenType.FALSE;
    value: false;
}
export interface ParsedNullTokenInfo extends ParsedTokenInfo {
    token: TokenType.NULL;
    value: null;
}
export interface ParsedStringTokenInfo extends ParsedTokenInfo {
    token: TokenType.STRING;
    value: string;
}
export interface ParsedNumberTokenInfo extends ParsedTokenInfo {
    token: TokenType.NUMBER;
    value: number;
}
export interface ParsedSeparatorTokenInfo extends ParsedTokenInfo {
    token: TokenType.SEPARATOR;
    value: string;
}
//# sourceMappingURL=parsedTokenInfo.d.ts.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/parsedTokenInfo.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"parsedTokenInfo.d.ts","sourceRoot":"","sources":["../../../../src/utils/types/parsedTokenInfo.ts"],"names":[],"mappings":"AAAA,OAAO,SAAS,MAAM,gBAAgB,CAAC;AACvC,OAAO,KAAK,EAAE,aAAa,EAAE,MAAM,gBAAgB,CAAC;AAEpD,MAAM,WAAW,eAAe;IAC9B,KAAK,EAAE,SAAS,CAAC;IACjB,KAAK,EAAE,aAAa,CAAC;IACrB,MAAM,EAAE,MAAM,CAAC;IACf,OAAO,CAAC,EAAE,OAAO,CAAC;CACnB;AAED,MAAM,WAAW,wBAAyB,SAAQ,eAAe;IAC/D,KAAK,EAAE,SAAS,CAAC,UAAU,CAAC;IAC5B,KAAK,EAAE,GAAG,CAAC;CACZ;AACD,MAAM,WAAW,yBAA0B,SAAQ,eAAe;IAChE,KAAK,EAAE,SAAS,CAAC,WAAW,CAAC;IAC7B,KAAK,EAAE,GAAG,CAAC;CACZ;AACD,MAAM,WAAW,0BAA2B,SAAQ,eAAe;IACjE,KAAK,EAAE,SAAS,CAAC,YAAY,CAAC;IAC9B,KAAK,EAAE,GAAG,CAAC;CACZ;AACD,MAAM,WAAW,0BAA2B,SAAQ,eAAe;IACjE,KAAK,EAAE,SAAS,CAAC,aAAa,CAAC;IAC/B,KAAK,EAAE,GAAG,CAAC;CACZ;AACD,MAAM,WAAW,oBAAqB,SAAQ,eAAe;IAC3D,KAAK,EAAE,SAAS,CAAC,KAAK,CAAC;IACvB,KAAK,EAAE,GAAG,CAAC;CACZ;AACD,MAAM,WAAW,oBAAqB,SAAQ,eAAe;IAC3D,KAAK,EAAE,SAAS,CAAC,KAAK,CAAC;IACvB,KAAK,EAAE,GAAG,CAAC;CACZ;AACD,MAAM,WAAW,mBAAoB,SAAQ,eAAe;IAC1D,KAAK,EAAE,SAAS,CAAC,IAAI,CAAC;IACtB,KAAK,EAAE,IAAI,CAAC;CACb;AACD,MAAM,WAAW,oBAAqB,SAAQ,eAAe;IAC3D,KAAK,EAAE,SAAS,CAAC,KAAK,CAAC;IACvB,KAAK,EAAE,KAAK,CAAC;CACd;AACD,MAAM,WAAW,mBAAoB,SAAQ,eAAe;IAC1D,KAAK,EAAE,SAAS,CAAC,IAAI,CAAC;IACtB,KAAK,EAAE,IAAI,CAAC;CACb;AACD,MAAM,WAAW,qBAAsB,SAAQ,eAAe;IAC5D,KAAK,EAAE,SAAS,CAAC,MAAM,CAAC;IACxB,KAAK,EAAE,MAAM,CAAC;CACf;AACD,MAAM,WAAW,qBAAsB,SAAQ,eAAe;IAC5D,KAAK,EAAE,SAAS,CAAC,MAAM,CAAC;IACxB,KAAK,EAAE,MAAM,CAAC;CACf;AACD,MAAM,WAAW,wBAAyB,SAAQ,eAAe;IAC/D,KAAK,EAAE,SAAS,CAAC,SAAS,CAAC;IAC3B,KAAK,EAAE,MAAM,CAAC;CACf"}
2
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/parsedTokenInfo.js
generated
vendored
Executable file
@@ -0,0 +1,2 @@
import TokenType from "./tokenType.js";
//# sourceMappingURL=parsedTokenInfo.js.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/parsedTokenInfo.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"parsedTokenInfo.js","sourceRoot":"","sources":["../../../../src/utils/types/parsedTokenInfo.ts"],"names":[],"mappings":"AAAA,OAAO,SAAS,MAAM,gBAAgB,CAAC"}
12
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/stackElement.d.ts
generated
vendored
Executable file
@@ -0,0 +1,12 @@
import type { JsonKey, JsonStruct } from "./jsonTypes.js";
export declare const enum TokenParserMode {
    OBJECT = 0,
    ARRAY = 1
}
export interface StackElement {
    key: JsonKey;
    value: JsonStruct;
    mode?: TokenParserMode;
    emit: boolean;
}
//# sourceMappingURL=stackElement.d.ts.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/stackElement.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"stackElement.d.ts","sourceRoot":"","sources":["../../../../src/utils/types/stackElement.ts"],"names":[],"mappings":"AAAA,OAAO,KAAK,EAAE,OAAO,EAAE,UAAU,EAAE,MAAM,gBAAgB,CAAC;AAE1D,0BAAkB,eAAe;IAC/B,MAAM,IAAA;IACN,KAAK,IAAA;CACN;AAED,MAAM,WAAW,YAAY;IAC3B,GAAG,EAAE,OAAO,CAAC;IACb,KAAK,EAAE,UAAU,CAAC;IAClB,IAAI,CAAC,EAAE,eAAe,CAAC;IACvB,IAAI,EAAE,OAAO,CAAC;CACf"}
6
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/stackElement.js
generated
vendored
Executable file
@@ -0,0 +1,6 @@
export var TokenParserMode;
(function (TokenParserMode) {
    TokenParserMode[TokenParserMode["OBJECT"] = 0] = "OBJECT";
    TokenParserMode[TokenParserMode["ARRAY"] = 1] = "ARRAY";
})(TokenParserMode || (TokenParserMode = {}));
//# sourceMappingURL=stackElement.js.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/stackElement.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"stackElement.js","sourceRoot":"","sources":["../../../../src/utils/types/stackElement.ts"],"names":[],"mappings":"AAEA,MAAM,CAAN,IAAkB,eAGjB;AAHD,WAAkB,eAAe;IAC/B,yDAAM,CAAA;IACN,uDAAK,CAAA;AACP,CAAC,EAHiB,eAAe,KAAf,eAAe,QAGhC"}
16
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/tokenType.d.ts
generated
vendored
Executable file
@@ -0,0 +1,16 @@
declare enum TokenType {
    LEFT_BRACE = 0,
    RIGHT_BRACE = 1,
    LEFT_BRACKET = 2,
    RIGHT_BRACKET = 3,
    COLON = 4,
    COMMA = 5,
    TRUE = 6,
    FALSE = 7,
    NULL = 8,
    STRING = 9,
    NUMBER = 10,
    SEPARATOR = 11
}
export default TokenType;
//# sourceMappingURL=tokenType.d.ts.map
1
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/tokenType.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenType.d.ts","sourceRoot":"","sources":["../../../../src/utils/types/tokenType.ts"],"names":[],"mappings":"AAAA,aAAK,SAAS;IACZ,UAAU,IAAA;IACV,WAAW,IAAA;IACX,YAAY,IAAA;IACZ,aAAa,IAAA;IACb,KAAK,IAAA;IACL,KAAK,IAAA;IACL,IAAI,IAAA;IACJ,KAAK,IAAA;IACL,IAAI,IAAA;IACJ,MAAM,IAAA;IACN,MAAM,KAAA;IACN,SAAS,KAAA;CACV;AAED,eAAe,SAAS,CAAC"}
17
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/tokenType.js
generated
vendored
Executable file
@@ -0,0 +1,17 @@
var TokenType;
(function (TokenType) {
    TokenType[TokenType["LEFT_BRACE"] = 0] = "LEFT_BRACE";
    TokenType[TokenType["RIGHT_BRACE"] = 1] = "RIGHT_BRACE";
    TokenType[TokenType["LEFT_BRACKET"] = 2] = "LEFT_BRACKET";
    TokenType[TokenType["RIGHT_BRACKET"] = 3] = "RIGHT_BRACKET";
    TokenType[TokenType["COLON"] = 4] = "COLON";
    TokenType[TokenType["COMMA"] = 5] = "COMMA";
    TokenType[TokenType["TRUE"] = 6] = "TRUE";
    TokenType[TokenType["FALSE"] = 7] = "FALSE";
    TokenType[TokenType["NULL"] = 8] = "NULL";
    TokenType[TokenType["STRING"] = 9] = "STRING";
    TokenType[TokenType["NUMBER"] = 10] = "NUMBER";
    TokenType[TokenType["SEPARATOR"] = 11] = "SEPARATOR";
})(TokenType || (TokenType = {}));
export default TokenType;
//# sourceMappingURL=tokenType.js.map
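The IIFE above is TypeScript's compiled output for a non-const numeric enum: each assignment stores both the forward mapping (name to number) and the reverse mapping (number to name) on one object, which is what lets the parser print `TokenType[token]` in its error messages. A standalone illustration with a made-up `Color` enum (not from the library):

```javascript
// The same pattern tsc emits: Color["RED"] = 0 evaluates to 0, so the
// outer assignment also sets Color[0] = "RED" (the reverse mapping).
var Color;
(function (Color) {
    Color[Color["RED"] = 0] = "RED";
    Color[Color["GREEN"] = 1] = "GREEN";
})(Color || (Color = {}));

console.log(Color.GREEN); // 1
console.log(Color[1]);    // "GREEN"
```

Note that `declare const enum` declarations, like `TokenParserMode` and `charset` below, are instead inlined at use sites and normally emit no such object.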
1
dev/env/node_modules/@streamparser/json/dist/mjs/utils/types/tokenType.js.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"tokenType.js","sourceRoot":"","sources":["../../../../src/utils/types/tokenType.ts"],"names":[],"mappings":"AAAA,IAAK,SAaJ;AAbD,WAAK,SAAS;IACZ,qDAAU,CAAA;IACV,uDAAW,CAAA;IACX,yDAAY,CAAA;IACZ,2DAAa,CAAA;IACb,2CAAK,CAAA;IACL,2CAAK,CAAA;IACL,yCAAI,CAAA;IACJ,2CAAK,CAAA;IACL,yCAAI,CAAA;IACJ,6CAAM,CAAA;IACN,8CAAM,CAAA;IACN,oDAAS,CAAA;AACX,CAAC,EAbI,SAAS,KAAT,SAAS,QAab;AAED,eAAe,SAAS,CAAC"}
106
dev/env/node_modules/@streamparser/json/dist/mjs/utils/utf-8.d.ts
generated
vendored
Executable file
@@ -0,0 +1,106 @@
|
||||
export declare const enum charset {
    BACKSPACE = 8,
    FORM_FEED = 12,
    NEWLINE = 10,
    CARRIAGE_RETURN = 13,
    TAB = 9,
    SPACE = 32,
    EXCLAMATION_MARK = 33,
    QUOTATION_MARK = 34,
    NUMBER_SIGN = 35,
    DOLLAR_SIGN = 36,
    PERCENT_SIGN = 37,
    AMPERSAND = 38,
    APOSTROPHE = 39,
    LEFT_PARENTHESIS = 40,
    RIGHT_PARENTHESIS = 41,
    ASTERISK = 42,
    PLUS_SIGN = 43,
    COMMA = 44,
    HYPHEN_MINUS = 45,
    FULL_STOP = 46,
    SOLIDUS = 47,
    DIGIT_ZERO = 48,
    DIGIT_ONE = 49,
    DIGIT_TWO = 50,
    DIGIT_THREE = 51,
    DIGIT_FOUR = 52,
    DIGIT_FIVE = 53,
    DIGIT_SIX = 54,
    DIGIT_SEVEN = 55,
    DIGIT_EIGHT = 56,
    DIGIT_NINE = 57,
    COLON = 58,
    SEMICOLON = 59,
    LESS_THAN_SIGN = 60,
    EQUALS_SIGN = 61,
    GREATER_THAN_SIGN = 62,
    QUESTION_MARK = 63,
    COMMERCIAL_AT = 64,
    LATIN_CAPITAL_LETTER_A = 65,
    LATIN_CAPITAL_LETTER_B = 66,
    LATIN_CAPITAL_LETTER_C = 67,
    LATIN_CAPITAL_LETTER_D = 68,
    LATIN_CAPITAL_LETTER_E = 69,
    LATIN_CAPITAL_LETTER_F = 70,
    LATIN_CAPITAL_LETTER_G = 71,
    LATIN_CAPITAL_LETTER_H = 72,
    LATIN_CAPITAL_LETTER_I = 73,
    LATIN_CAPITAL_LETTER_J = 74,
    LATIN_CAPITAL_LETTER_K = 75,
    LATIN_CAPITAL_LETTER_L = 76,
    LATIN_CAPITAL_LETTER_M = 77,
    LATIN_CAPITAL_LETTER_N = 78,
    LATIN_CAPITAL_LETTER_O = 79,
    LATIN_CAPITAL_LETTER_P = 80,
    LATIN_CAPITAL_LETTER_Q = 81,
    LATIN_CAPITAL_LETTER_R = 82,
    LATIN_CAPITAL_LETTER_S = 83,
    LATIN_CAPITAL_LETTER_T = 84,
    LATIN_CAPITAL_LETTER_U = 85,
    LATIN_CAPITAL_LETTER_V = 86,
    LATIN_CAPITAL_LETTER_W = 87,
    LATIN_CAPITAL_LETTER_X = 88,
    LATIN_CAPITAL_LETTER_Y = 89,
    LATIN_CAPITAL_LETTER_Z = 90,
    LEFT_SQUARE_BRACKET = 91,
    REVERSE_SOLIDUS = 92,
    RIGHT_SQUARE_BRACKET = 93,
    CIRCUMFLEX_ACCENT = 94,
    LOW_LINE = 95,
    GRAVE_ACCENT = 96,
    LATIN_SMALL_LETTER_A = 97,
    LATIN_SMALL_LETTER_B = 98,
    LATIN_SMALL_LETTER_C = 99,
    LATIN_SMALL_LETTER_D = 100,
    LATIN_SMALL_LETTER_E = 101,
    LATIN_SMALL_LETTER_F = 102,
    LATIN_SMALL_LETTER_G = 103,
    LATIN_SMALL_LETTER_H = 104,
    LATIN_SMALL_LETTER_I = 105,
    LATIN_SMALL_LETTER_J = 106,
    LATIN_SMALL_LETTER_K = 107,
    LATIN_SMALL_LETTER_L = 108,
    LATIN_SMALL_LETTER_M = 109,
    LATIN_SMALL_LETTER_N = 110,
    LATIN_SMALL_LETTER_O = 111,
    LATIN_SMALL_LETTER_P = 112,
    LATIN_SMALL_LETTER_Q = 113,
    LATIN_SMALL_LETTER_R = 114,
    LATIN_SMALL_LETTER_S = 115,
    LATIN_SMALL_LETTER_T = 116,
    LATIN_SMALL_LETTER_U = 117,
    LATIN_SMALL_LETTER_V = 118,
    LATIN_SMALL_LETTER_W = 119,
    LATIN_SMALL_LETTER_X = 120,
    LATIN_SMALL_LETTER_Y = 121,
    LATIN_SMALL_LETTER_Z = 122,
    LEFT_CURLY_BRACKET = 123,
    VERTICAL_LINE = 124,
    RIGHT_CURLY_BRACKET = 125,
    TILDE = 126
}
export declare const escapedSequences: {
    [key: number]: number;
};
//# sourceMappingURL=utf-8.d.ts.map
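The `charset` const enum above simply names ASCII code points, so tokenizer code can compare raw byte values against readable identifiers instead of magic numbers. Since const enums are erased and inlined at compile time, a runtime sketch uses plain constants with values copied from the declaration above:

```javascript
// A few code points named by the charset enum above (values from the .d.ts).
const QUOTATION_MARK = 34;       // '"'
const LEFT_CURLY_BRACKET = 123;  // '{'
const RIGHT_CURLY_BRACKET = 125; // '}'

// A byte-level JSON tokenizer can branch on these without magic numbers:
const byte = "{".charCodeAt(0);
console.log(byte === LEFT_CURLY_BRACKET);          // true
console.log('"'.charCodeAt(0) === QUOTATION_MARK); // true
console.log("}".charCodeAt(0) === RIGHT_CURLY_BRACKET); // true
```

Using a const enum here keeps the readability of names in the TypeScript source while the emitted JavaScript compares plain integers.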
1
dev/env/node_modules/@streamparser/json/dist/mjs/utils/utf-8.d.ts.map
generated
vendored
Executable file
@@ -0,0 +1 @@
{"version":3,"file":"utf-8.d.ts","sourceRoot":"","sources":["../../../src/utils/utf-8.ts"],"names":[],"mappings":"AAAA,0BAAkB,OAAO;IACvB,SAAS,IAAM;IACf,SAAS,KAAM;IACf,OAAO,KAAM;IACb,eAAe,KAAM;IACrB,GAAG,IAAM;IACT,KAAK,KAAO;IACZ,gBAAgB,KAAO;IACvB,cAAc,KAAO;IACrB,WAAW,KAAO;IAClB,WAAW,KAAO;IAClB,YAAY,KAAO;IACnB,SAAS,KAAO;IAChB,UAAU,KAAO;IACjB,gBAAgB,KAAO;IACvB,iBAAiB,KAAO;IACxB,QAAQ,KAAO;IACf,SAAS,KAAO;IAChB,KAAK,KAAO;IACZ,YAAY,KAAO;IACnB,SAAS,KAAO;IAChB,OAAO,KAAO;IACd,UAAU,KAAO;IACjB,SAAS,KAAO;IAChB,SAAS,KAAO;IAChB,WAAW,KAAO;IAClB,UAAU,KAAO;IACjB,UAAU,KAAO;IACjB,SAAS,KAAO;IAChB,WAAW,KAAO;IAClB,WAAW,KAAO;IAClB,UAAU,KAAO;IACjB,KAAK,KAAO;IACZ,SAAS,KAAO;IAChB,cAAc,KAAO;IACrB,WAAW,KAAO;IAClB,iBAAiB,KAAO;IACxB,aAAa,KAAO;IACpB,aAAa,KAAO;IACpB,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,sBAAsB,KAAO;IAC7B,mBAAmB,KAAO;IAC1B,eAAe,KAAO;IACtB,oBAAoB,KAAO;IAC3B,iBAAiB,KAAO;IACxB,QAAQ,KAAO;IACf,YAAY,KAAO;IACnB,oBAAoB,KAAO;IAC3B,oBAAoB,KAAO;IAC3B,oBAAoB,KAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,oBAAoB,MAAO;IAC3B,kBAAkB,MAAO;IACzB,aAAa,MAAO;IACpB,mBAAmB,MAAO;IAC1B,KAAK,MAAO;CACb;AAED,eAAO,MAAM,gBAAgB,EAAE;IAAE,CAAC,GAAG,EAAE,MAAM,GAAG,MAAM,CAAA;CASrD,CAAC"}
Some files were not shown because too many files have changed in this diff.