diff --git a/.env.template b/.env.template new file mode 100644 index 0000000..1ddc78d --- /dev/null +++ b/.env.template @@ -0,0 +1,32 @@ +BLOCKCHAIN_ENDPOINT='http://....' +BLOCKCHAIN_CHAIN_ID='' +HYPERION_URL='' +SCANNER_NODES_MAX_CHUNK_SIZE=100 +SCANNER_SCAN_KEY='test' +ABIS_SERVICE_LIMIT=100 +ABIS_SERVICE_FILTER='eosio:setabi' +BLOCK_READER_ENDPOINTS='ws://...' +BLOCK_READER_FETCH_BLOCK=1 +BLOCK_READER_FETCH_DELTAS=1 +BLOCK_READER_FETCH_TRACES=1 +READER_MAX_THREADS=1 +READER_INVIOLABLE_THREADS_COUNT=0 +PROCESSOR_MAX_THREADS=1 +PROCESSOR_INVIOLABLE_THREADS_COUNT=0 +FILTER_MAX_THREADS=1 +FILTER_INVIOLABLE_THREADS_COUNT=0 +PROCESSOR_TASK_QUEUE_CHECK_INTERVAL=5000 +API_PORT=8080 +START_BLOCK=238580000 +END_BLOCK=238581000 +START_FROM_HEAD=0 +MODE='default' +MAX_BLOCK_NUMBER=0xffffffff +UNPROCESSED_BLOCK_QUEUE_MAX_BYTES_SIZE=256000000 +UNPROCESSED_BLOCK_QUEUE_SIZE_CHECK_INTERVAL=2000 +UNPROCESSED_BLOCK_QUEUE_BATCH_SIZE=100 +BROADCAST_PORT=9000 +BROADCAST_HOST='localhost' +MONGO_HOSTS='localhost' +MONGO_PORTS='27017' +MONGO_DB_NAME='history_tools' \ No newline at end of file diff --git a/README.md b/README.md index 279e997..5fe0afd 100644 --- a/README.md +++ b/README.md @@ -1,9 +1,211 @@ -# api-history-tools +# History Tools + +This project is part of the Alien Worlds open source initiative, offering a set of tools and processes for downloading and processing blockchain data (transactions and deltas). It is designed to operate in two modes: default (live) and replay. The default mode continuously downloads current block data, while replay mode retrieves data from a specific range of blocks. + +This package encapsulates the core mechanism; however, complete functionality requires packages that contain implementations of common components and third-party elements. 
+ +## Dependencies + +- [@alien-worlds/aw-core](https://github.com/Alien-Worlds/api-core) +- [@alien-worlds/aw-broadcast](https://github.com/Alien-Worlds/broadcast) +- [@alien-worlds/aw-workers](https://github.com/Alien-Worlds/workers) +- [async](https://github.com/caolan/async) +- [commander](https://github.com/tj/commander.js) + + +## Table of Contents + +- [Installation](#installation) +- [Processes](#processes) + - [API](#api) + - [Broadcasting](#broadcasting) + - [Bootstrap](#bootstrap) + - [Reader](#reader) + - [Filter](#filter) + - [Processor](#processor) +- [Common components](#common-components) + - [Abis](#abis) + - [BlockRangeScanner](#blockrangescanner) + - [BlockState](#blockstate) + - [Featured](#featured) + - [ProcessorTaskQueue](#processortaskqueue) + - [UnprocessedBlockQueue](#unprocessedblockqueue) +- [Additional Tools](#additional-tools) + - [Config](#config) +- [Tutorials](#tutorials) +- [Contributing](#contributing) +- [License](#license) + +## Installation + +To add History Tools to your project, use the following command with your favorite package manager: + +```bash +yarn add @alien-worlds/aw-history + +``` + +## Processes + +All processes use Commander, enabling specific values to be assigned to individual options or supplied via environment variables. + +### API + +The API process, currently under development, is intended to provide easy access to downloaded data. This API allows viewing of blockchain data, offering endpoints to retrieve block-specific data, transactions, tables, or data from a specific range according to selected criteria. The API is read-only; it does not expose methods that modify the content. + +### Broadcasting + +The broadcasting process creates a server for communication between processes, leveraging the @alien-worlds/aw-broadcast module. Processes inform each other about their state, enabling efficient work coordination. 
The server facilitates the dispatch of messages defining tasks and states, improving performance when processing a high volume of blockchain data. + +### Bootstrap + +The bootstrap process prepares input data according to provided guidelines and dispatches tasks to other processes using broadcasting, thereby initiating the blockchain data download. + +### Reader + +The Reader process downloads blocks in their raw (serialized) form based on the given operating mode (live/replay) and block interval. Its primary function is to fetch blocks and store them in the database. Multiple Reader instances can be created, or workers can be used for scaling when not running under Docker. + +### Filter + +The Filter process retrieves blocks from the database, verifying their content. If a block contains specific contracts, actions, or tables listed in the configuration file, the matching data is decoded using a dedicated blockchain serializer and saved as tasks for the Processor in the database. ### Processor -Lorem +The Processor process retrieves tasks from the database, generating appropriate action or delta data based on their content. It coordinates the work of workers who perform the processing. There are two categories of processors: DeltaProcessor and ActionTraceProcessor. The processor creates separate collections in the database for each contract's actions and deltas, e.g. `dao.worlds_actions` and `dao.worlds_deltas`. Any processing failures are stored in a separate collection for subsequent review and analysis. + +## Common components + +In addition to the processes that make up the base of the History Tools package, there are also common components shared among these processes. These components provide essential functionality used across the processes. To a large extent, this repository contains high-level (domain) implementations of these components, and any code that directly depends on external libraries is placed in a separate repository such as the **Starter Kit**. 
The common components consist of: + +### Abis + +The Abis component serves as a repository for storing contract ABIs (Application Binary Interfaces). It includes a service for downloading new ABIs and retrieving existing ABIs. The Abis component is primarily used in the Bootstrap process to download ABIs for listed contracts. The downloaded ABIs are saved in the database and can be fetched when needed. + +```javascript +{ + block_number, // number of the block in which the contract was initialized + contract, // contract name + hex // ABI in HEX form +} +``` + +We also use Abis in the **Filter** process when creating tasks for the processor. At that point we get the ABI (for the specific contract) from the database and, if it does not exist, try to fetch it. + +#### Methods +- `getAbis(options?)`: Retrieves the ABIs (Application Binary Interfaces) for the specified options. +- `getAbi(blockNumber, contract, fetch?)`: Retrieves a single ABI for the specified block number and contract address. +- `storeAbi(blockNumber, contract, hex)`: Stores the ABI with the given data. +- `fetchAbis(contracts?)`: Fetches ABIs via the service. Optionally, you can specify which contracts to fetch. +- `cacheAbis(contracts?)`: Caches the ABIs for the specified contracts. + + +### BlockRangeScanner +The BlockRangeScanner is used in the Reader process during replay mode. It creates sets of blocks to be scanned within a specific range, allowing for organized data download. The BlockRangeScanner divides the range into subgroups containing an equal number of blocks to be scanned. This helps distribute the workload among multiple instances of the Reader process or workers. + +#### Methods + +- `createScanNodes(key, startBlock, endBlock)`: Creates range scan sets under the specified label. +- `getNextScanNode(key)`: Returns the next (not currently scanned) set of the range. +- `hasUnscannedBlocks(key, startBlock, endBlock)`: Returns `true` if the given range still contains unscanned blocks. 
+- `updateScanProgress(scanKey, blockNumber)`: Updates the number of the currently scanned block in the corresponding set. + + + +### BlockState +The BlockState component is a service for updating the current status of the History Tools. It stores various statistics in the database, including the number of the last read block. After reading a block, the BlockState is updated so that, if the History Tools are restarted, they can resume from the last processed block. + +#### Methods + +- `getState()`: Returns the current statistics. +- `getBlockNumber()`: Returns the number of the last read block. +- `updateBlockNumber(blockNumber)`: Updates the block number value in the statistics. + +### Featured +The Featured component includes a repository for storing information about "featured" contracts. These contracts are included in a list and have certain criteria for choosing the right processor for their actions and deltas. The Featured component is used in the Bootstrap process to download data for the featured contracts, and it is also used in the Filter process to determine the appropriate ABI based on the block number when reading block data. + +#### FeaturedContracts Methods + +- `readContracts(data)`: Reads a JSON object or a list of strings and retrieves the contract data mentioned above. Once downloaded from the web, the contracts are stored in the database and cache. +- `isFeatured(contract)`: Checks if the given name is on the list of contracts of interest. + +We build `Featured` class instances based on the data contained in the list of "featured" contracts (e.g. in the form of a JSON object). Each object in the list contains not only the names of the contracts, but also criteria for choosing the right processor for individual actions and contract deltas. For more information, see the [Tutorials](#tutorials) section. + + +#### Featured Methods + +- `getCriteria(criteria)`: Gets all match criteria that satisfy the given criteria. 
+- `getProcessor(label)`: Gets the processor for the given label and criteria. +- `getContracts()`: Lists all contracts included in the criteria. + + +### ProcessorTaskQueue + +The ProcessorTaskQueue is a queue/repository of processor tasks. These tasks are generated by the Filter process and saved in the database by the ProcessorTaskQueue. The Processor process retrieves the tasks from the queue, removes them from the list, and processes them accordingly. Each task contains encoded data that needs to be decoded using native blockchain deserializers. If a task fails, it is sent to a separate list `unsuccessful_processor_tasks` for subsequent attempts or analysis. + +The task document schema is as follows: + +```typescript +{ + _id : "6494e07f7fcfdd1bb21c8da6", + abi : "...", + short_id : "uspts.worlds:addpoints", + label : "transaction_trace_v0:action_trace_v1:uspts.worlds:addpoints", + type : "action", + mode : "default", + content: "... binary ...", + hash: "db1a9ef8ddf670313b6868760283e42b0cc21176", + block_number : 252016298, + block_timestamp : "2023-06-22T23:59:12.000Z", + is_fork : false, + error? +}; +``` + +#### Methods + +- `nextTask(mode)`: Returns the next unassigned task from the list. +- `addTasks(tasks)`: Adds tasks to the list. +- `stashUnsuccessfulTask(task, error)`: Puts a failed task into the failed-task list. + +### UnprocessedBlockQueue + +UnprocessedBlockQueue is a queue/repository of blocks to be read. We create these collections to speed up the process of reading and filtering blocks. Filtering takes more time than reading, so we separated the two processes, allowing you to allocate resources appropriately for the fastest result. UnprocessedBlockQueue is used in the Reader process to add blocks to the list, and in the Filter process to consume them and extract the data of interest. The UnprocessedBlockQueue allows setting limits on the number of unread blocks or their total size to prevent overloading the system. 
+ +#### Methods + +- `getBytesSize()`: Gets the size in bytes of the queue of unread blocks. +- `add(block, isLast)`: Adds a block to the list. +- `next()`: Returns the next block in the list. +- `beforeSendBatch(handler)`: Sets the handler to run before sending blocks to the database. +- `afterSendBatch(handler)`: Sets the handler to run after sending blocks to the database. +- `onOverload(handler)`: Sets the handler to run when the block count or total size limit of the list is exceeded. + + + + +## Additional Tools + +The History Tools package also includes additional tools that can be helpful in various scenarios: + +### Config + +The Config tools are used for generating configuration objects based on values stored in the .env file or environment variables. The list of required options can be found in the file [.env.template](./.env.template). + +## Tutorials + +For tutorials on creating and using the history tools for your specific needs, see the tutorials in the [History Tools Starter Kit](https://github.com/Alien-Worlds/aw-history-starter-kit) repository. If you want to build history tools using `mongodb` and `eosjs`, start with that repository. + + +If you want to extend the capabilities of the history tools or take advantage of other third-party resources, please refer to the following tutorials. + +- [Extending history tools](./tutorials/extending-history-tools.md) +- [What is "featured" content?](./tutorials/what-is-featured-content.md) +- [Description of configuration variables](./tutorials/config-vars.md) + +## Contributing + +We welcome contributions from the community. Before contributing, please read through the existing issues on this repository to prevent duplicate submissions. New feature requests and bug reports can be submitted as an issue. If you would like to contribute code, please open a pull request. 
-### Block Range +## License -Lorem \ No newline at end of file +This project is licensed under the terms of the MIT license. For more information, refer to the [LICENSE](./LICENSE) file. diff --git a/package.json b/package.json index 358a39b..a48a9f3 100644 --- a/package.json +++ b/package.json @@ -1,10 +1,14 @@ { - "name": "@alien-worlds/api-history-tools", - "version": "0.0.114", + "name": "@alien-worlds/aw-history", + "version": "0.0.8", "description": "", "packageManager": "yarn@3.2.3", "main": "build/index.js", "types": "build/index.d.ts", + "repository": { + "type": "git", + "url": "https://github.com/Alien-Worlds/aw-history" + }, "files": [ "build" ], @@ -20,7 +24,6 @@ }, "license": "ISC", "devDependencies": { - "@types/express": "^4.17.17", "@types/jest": "^27.0.3", "@types/node": "^18.7.14", "@types/node-fetch": "2.x", @@ -35,18 +38,12 @@ "typescript": "^4.8.2" }, "dependencies": { - "@alien-worlds/api-core": "^0.0.101", - "@eosrio/node-abieos": "^1", - "amqplib": "^0.10.3", + "@alien-worlds/aw-broadcast": "^0.0.6", + "@alien-worlds/aw-core": "^0.0.13", + "@alien-worlds/aw-workers": "^0.0.2", "async": "^3.2.4", + "commander": "^10.0.1", "crypto": "^1.0.1", - "eosjs": "^22.1.0", - "express": "^4.18.2", - "nanoid": "^3.0.0", - "node-fetch": "2.6.6", - "reflect-metadata": "^0.1.13", - "text-encoding": "^0.7.0", - "ts-node": "^10.9.1", - "ws": "^8.12.0" + "ts-node": "^10.9.1" } } diff --git a/src/api/api.command.ts b/src/api/api.command.ts new file mode 100644 index 0000000..f4dc7bf --- /dev/null +++ b/src/api/api.command.ts @@ -0,0 +1,9 @@ +import { Command } from 'commander'; + +export const apiCommand = new Command(); + +apiCommand + .version('1.0', '-v, --version') + .option('-h, --host ') + .option('-p, --port ') + .parse(process.argv); diff --git a/src/api/api.dependencies.ts b/src/api/api.dependencies.ts new file mode 100644 index 0000000..0f646f7 --- /dev/null +++ b/src/api/api.dependencies.ts @@ -0,0 +1,21 @@ +import { Container, Result, Route, 
UnknownObject } from '@alien-worlds/aw-core'; +import { DatabaseConfigBuilder, Dependencies } from '../common'; +import { Api } from './api'; +import { ApiConfig } from './api.types'; + +/** + * An abstract class representing API process dependencies. + * @class ApiDependencies + */ +export abstract class ApiDependencies extends Dependencies { + public api: Api; + public ioc: Container; + public setupIoc: (config: ApiConfig, container: Container) => Promise; + public routesProvider: (container: Container) => Route[]; + public databaseConfigBuilder: DatabaseConfigBuilder; + + public abstract initialize( + setupIoc: (config: UnknownObject, container: Container) => Promise, + routesProvider: (container: Container) => Route[] + ): Promise; +} diff --git a/src/api/api.ts b/src/api/api.ts index 2a9f49c..cf2db65 100644 --- a/src/api/api.ts +++ b/src/api/api.ts @@ -1,21 +1,19 @@ /* eslint-disable @typescript-eslint/no-unused-vars */ -import { log, Route } from '@alien-worlds/api-core'; -import express, { Express } from 'express'; import { ApiConfig } from './api.types'; -export class Api { - private app: Express; +export class Api { + protected app: WebFramework; + protected config: ApiConfig; - constructor(private config: ApiConfig) { - this.app = express(); + public setup(config: ApiConfig) { + this.config = config; } public async start() { - const { - config: { port }, - } = this; - this.app.listen(port, () => { - log(`Server is running at http://localhost:${port}`); - }); + throw new Error('Method "start" not implemented'); + } + + public get framework(): WebFramework { + return this.app; } } diff --git a/src/api/api.types.ts b/src/api/api.types.ts index 4a02dfb..50ff5a9 100644 --- a/src/api/api.types.ts +++ b/src/api/api.types.ts @@ -1,6 +1,12 @@ -import { MongoConfig } from '@alien-worlds/api-core'; +import { UnknownObject } from '@alien-worlds/aw-core'; -export type ApiConfig = { +export type ApiCommandOptions = { + host: string; port: number; - mongo: MongoConfig; +}; + 
+export type ApiConfig = { + host: string; + port: number; + database: DatabaseConfig; }; diff --git a/src/api/endpoints/actions/data/data-sources/contract-action.mongo.source.ts b/src/api/endpoints/actions/data/data-sources/contract-action.mongo.source.ts deleted file mode 100644 index 8a9a367..0000000 --- a/src/api/endpoints/actions/data/data-sources/contract-action.mongo.source.ts +++ /dev/null @@ -1,18 +0,0 @@ -import { - CollectionMongoSource, - ContractAction, - MongoSource, -} from '@alien-worlds/api-core'; - -/** - * @class - */ -export class ContractActionMongoSource extends CollectionMongoSource { - /** - * @constructor - * @param {MongoSource} mongoSource - */ - constructor(mongoSource: MongoSource) { - super(mongoSource, 'actions'); - } -} diff --git a/src/api/endpoints/actions/data/mappers/contract-action.mapper.ts b/src/api/endpoints/actions/data/mappers/contract-action.mapper.ts deleted file mode 100644 index 5c891ee..0000000 --- a/src/api/endpoints/actions/data/mappers/contract-action.mapper.ts +++ /dev/null @@ -1,30 +0,0 @@ -import { Entity, Mapper } from '@alien-worlds/api-core'; -import { ContractAction, ContractActionDocument } from '@alien-worlds/api-core'; - -/*imports*/ - -type MapperType = (data: object) => Entity; - -export class ContractActionMapper - implements Mapper -{ - private mapper: MapperType; - - constructor(mapper?: MapperType) { - if (mapper) { - this.mapper = mapper; - } else { - this.mapper = (data: object) => - ({ - toDocument: () => data, - } as Entity); - } - } - - public toEntity(document: ContractActionDocument): ContractAction { - return ContractAction.fromDocument(document, this.mapper); - } - public toDataObject(entity: ContractAction): ContractActionDocument { - return entity.toDocument(); - } -} diff --git a/src/api/endpoints/actions/data/dtos/actions.dto.ts b/src/api/endpoints/actions/domain/actions.types.ts similarity index 100% rename from src/api/endpoints/actions/data/dtos/actions.dto.ts rename to 
src/api/endpoints/actions/domain/actions.types.ts diff --git a/src/api/endpoints/actions/domain/list-actions.controller.ts b/src/api/endpoints/actions/domain/list-actions.controller.ts index 77517ad..310af89 100644 --- a/src/api/endpoints/actions/domain/list-actions.controller.ts +++ b/src/api/endpoints/actions/domain/list-actions.controller.ts @@ -1,4 +1,4 @@ -import { inject, injectable, Result } from '@alien-worlds/api-core'; +import { inject, injectable, Result } from '@alien-worlds/aw-core'; import { ListActionsInput } from './models/list-actions.input'; import { ListActionsOutput } from './models/list-actions.output'; import { ListActionsUseCase } from './use-cases/list-actions.use-case'; diff --git a/src/api/endpoints/actions/domain/models/list-actions.input.ts b/src/api/endpoints/actions/domain/models/list-actions.input.ts index fe3475f..c301031 100644 --- a/src/api/endpoints/actions/domain/models/list-actions.input.ts +++ b/src/api/endpoints/actions/domain/models/list-actions.input.ts @@ -1,71 +1,14 @@ -import { ListActionsQueryParams } from '../../data/dtos/actions.dto'; -import { - Request, - parseToBigInt, - QueryModel, - MongoAggregateParams, -} from '@alien-worlds/api-core'; +import { IO, UnknownObject } from '@alien-worlds/aw-core'; /** * @class */ -export class ListActionsInput implements QueryModel { - /** - * - * @param {ListActionsRequestDto} dto - * @returns {ListActionsInput} - */ - public static fromRequest( - request: Request - ): ListActionsInput { - const { - query: { contracts, names, accounts, from, to, limit, offset, block_numbers }, - } = request; - - let fromBlock: bigint; - let toBlock: bigint; - let fromDate: Date; - let toDate: Date; - let blockNumbers = []; - - if (from) { - if (/^[0-9]+$/.test(from)) { - fromBlock = parseToBigInt(from); - } else { - fromDate = new Date(from); - } - } - - if (to) { - if (/^[0-9]+$/.test(to)) { - toBlock = parseToBigInt(to); - } else { - toDate = new Date(to); - } - } - - if (block_numbers) { - 
blockNumbers = block_numbers.split(',').map(parseToBigInt); - } - - return new ListActionsInput( - contracts ? contracts.split(',') : [], - names ? names.split(',') : [], - accounts ? accounts.split(',') : [], - fromBlock, - toBlock, - fromDate, - toDate, - blockNumbers, - offset || 0, - limit || 10 - ); - } +export class ListActionsInput implements IO { /** * * @constructor * @private */ - private constructor( + constructor( public readonly contracts: string[], public readonly names: string[], public readonly accounts: string[], @@ -78,42 +21,27 @@ export class ListActionsInput implements QueryModel { public readonly limit: number ) {} - public toQueryParams(): MongoAggregateParams { + public toJSON(): UnknownObject { const { contracts, names, accounts, + blockNumbers, startBlock, endBlock, - startTimestamp, - endTimestamp, offset, limit, } = this; - // TODO: use unions and represent it in special collection called ActionRepository - // it should contain all structs - const pipeline = [ - { $match: { field: 'value' } }, - { $project: { field: 1 } }, - { $skip: 1 }, - { $limit: 5 }, - { - $unionWith: { - coll: 'collection2', - pipeline: [ - { $match: { otherField: 'otherValue' } }, - { $project: { otherField: 1 } }, - { $skip: 1 }, - { $limit: 5 }, - ], - }, - }, - ]; - const options = {}; return { - pipeline, - options, + contracts, + names, + accounts, + block_numbers: blockNumbers.map(blockNumber => blockNumber.toString()), + from: startBlock.toString(), + to: endBlock.toString(), + offset, + limit, }; } } diff --git a/src/api/endpoints/actions/domain/models/list-actions.output.ts b/src/api/endpoints/actions/domain/models/list-actions.output.ts index c8dbc67..f70aeeb 100644 --- a/src/api/endpoints/actions/domain/models/list-actions.output.ts +++ b/src/api/endpoints/actions/domain/models/list-actions.output.ts @@ -1,29 +1,21 @@ -import { ContractAction, Result } from '@alien-worlds/api-core'; +import { ContractAction, IO, Result, UnknownObject } from 
'@alien-worlds/aw-core'; -export class ListActionsOutput { +export class ListActionsOutput implements IO { public static create(result: Result): ListActionsOutput { return new ListActionsOutput(result); } - private constructor(public readonly result: Result) {} + constructor(public readonly result: Result) {} - public toResponse() { + toJSON(): UnknownObject { const { result } = this; + if (result.isFailure) { - const { - failure: { error }, - } = result; - if (error) { - return { - status: 500, - body: [], - }; - } + return {}; } - + return { - status: 200, - body: result.content + result: result.content.map(r => r.toJSON()), }; } } diff --git a/src/api/endpoints/actions/domain/repositories/contract-action.repository.ts b/src/api/endpoints/actions/domain/repositories/contract-action.repository.ts index ae804a8..f6b52c5 100644 --- a/src/api/endpoints/actions/domain/repositories/contract-action.repository.ts +++ b/src/api/endpoints/actions/domain/repositories/contract-action.repository.ts @@ -1,22 +1,10 @@ -import { - injectable, - Repository, - ContractAction, - ContractActionDocument, - Entity, -} from '@alien-worlds/api-core'; +import { injectable, Repository, ContractAction } from '@alien-worlds/aw-core'; /** * @abstract * @class */ @injectable() -export abstract class ContractActionRepository< - DataEntityType extends Entity = Entity, - DataDocumentType = object -> extends Repository< - ContractAction, - ContractActionDocument -> { +export abstract class ContractActionRepository extends Repository { public static Token = 'CONTRACT_ACTION_REPOSITORY'; } diff --git a/src/api/endpoints/actions/domain/use-cases/list-actions.use-case.ts b/src/api/endpoints/actions/domain/use-cases/list-actions.use-case.ts index 3bae1af..d60c6c5 100644 --- a/src/api/endpoints/actions/domain/use-cases/list-actions.use-case.ts +++ b/src/api/endpoints/actions/domain/use-cases/list-actions.use-case.ts @@ -1,11 +1,13 @@ +/* eslint-disable @typescript-eslint/no-unused-vars */ import { 
ListActionsInput } from './../models/list-actions.input'; import { ContractAction, + FindParams, inject, injectable, Result, UseCase, -} from '@alien-worlds/api-core'; +} from '@alien-worlds/aw-core'; import { ContractActionRepository } from '../repositories/contract-action.repository'; /*imports*/ @@ -26,7 +28,11 @@ export class ListActionsUseCase implements UseCase { * @returns {Promise>} */ public async execute(input: ListActionsInput): Promise> { - return this.contractActionRepository.find(input); + const result = await this.contractActionRepository.find( + FindParams.create({ limit: 1 }) + ); + + return result; } /*methods*/ diff --git a/src/api/endpoints/actions/index.ts b/src/api/endpoints/actions/index.ts index a96a67d..a841e6a 100644 --- a/src/api/endpoints/actions/index.ts +++ b/src/api/endpoints/actions/index.ts @@ -1,10 +1,7 @@ -export * from './data/data-sources/contract-action.mongo.source'; -export * from './data/dtos/actions.dto'; -export * from './data/mappers/contract-action.mapper'; +export * from './domain/actions.types'; export * from './domain/list-actions.controller'; export * from './domain/models/list-actions.input'; export * from './domain/models/list-actions.output'; export * from './domain/repositories/contract-action.repository'; export * from './domain/use-cases/list-actions.use-case'; -export * from './ioc.config'; export * from './routes/list-actions.route'; diff --git a/src/api/endpoints/actions/ioc.config.ts b/src/api/endpoints/actions/ioc.config.ts deleted file mode 100644 index 4957072..0000000 --- a/src/api/endpoints/actions/ioc.config.ts +++ /dev/null @@ -1,35 +0,0 @@ -import { - MongoConfig, - MongoSource, - RepositoryImpl, - Container, -} from '@alien-worlds/api-core'; - -import { ContractActionMongoSource } from './data/data-sources/contract-action.mongo.source'; -import { ContractActionMapper } from './data/mappers/contract-action.mapper'; -import { ContractActionRepository } from 
'./domain/repositories/contract-action.repository'; - -export const setupContractActionRepository = async ( - mongo: MongoSource | MongoConfig, - container?: Container -): Promise => { - let mongoSource: MongoSource; - if (mongo instanceof MongoSource) { - mongoSource = mongo; - } else { - mongoSource = await MongoSource.create(mongo); - } - - const repo: ContractActionRepository = new RepositoryImpl( - new ContractActionMongoSource(mongoSource), - new ContractActionMapper() - ); - - if (container) { - container - .bind(ContractActionRepository.Token) - .toConstantValue(repo); - } - - return repo; -}; diff --git a/src/api/endpoints/actions/routes/list-actions.route-io.ts b/src/api/endpoints/actions/routes/list-actions.route-io.ts new file mode 100644 index 0000000..7df8f46 --- /dev/null +++ b/src/api/endpoints/actions/routes/list-actions.route-io.ts @@ -0,0 +1,81 @@ +import { + IO, + Response, + RouteIO, + UnknownObject, + Request, + parseToBigInt, +} from '@alien-worlds/aw-core'; +import { ListActionsInput } from '../domain/models/list-actions.input'; +import { ListActionsQueryParams } from '../domain/actions.types'; +import { ListActionsOutput } from '../domain/models/list-actions.output'; + +export class ListActionsRouteIO implements RouteIO { + public toResponse(output: ListActionsOutput): Response { + const { result } = output; + + if (result.isFailure) { + const { + failure: { error }, + } = result; + if (error) { + return { + status: 500, + body: [], + }; + } + } + + return { + status: 200, + body: output.toJSON(), + }; + } + + public fromRequest( + request: Request + ): ListActionsInput { + const { + query: { contracts, names, accounts, from, to, limit, offset, block_numbers }, + } = request; + + let fromBlock: bigint; + let toBlock: bigint; + let fromDate: Date; + let toDate: Date; + let blockNumbers = []; + + if (from) { + if (/^[0-9]+$/.test(from)) { + fromBlock = parseToBigInt(from); + } else { + fromDate = new Date(from); + } + } + + if (to) { + if 
(/^[0-9]+$/.test(to)) { + toBlock = parseToBigInt(to); + } else { + toDate = new Date(to); + } + } + + if (block_numbers) { + blockNumbers = block_numbers.split(',').map(parseToBigInt); + } + + return new ListActionsInput( + contracts ? contracts.split(',') : [], + names ? names.split(',') : [], + accounts ? accounts.split(',') : [], + fromBlock, + toBlock, + fromDate, + toDate, + blockNumbers, + offset || 0, + limit || 10 + ); + } +} diff --git a/src/api/endpoints/actions/routes/list-actions.route.ts b/src/api/endpoints/actions/routes/list-actions.route.ts index 896fd0a..3e10557 100644 --- a/src/api/endpoints/actions/routes/list-actions.route.ts +++ b/src/api/endpoints/actions/routes/list-actions.route.ts @@ -1,14 +1,6 @@ -import { GetRoute, RouteHandler } from '@alien-worlds/api-core'; -import { ListActionsInput } from '../domain/models/list-actions.input'; -import { ListActionsOutput } from '../domain/models/list-actions.output'; +import { GetRoute, RouteHandler } from '@alien-worlds/aw-core'; +import { ListActionsRouteIO } from './list-actions.route-io'; -/*imports*/ - -/** - * @class - * - * - */ export class ListActionsRoute extends GetRoute { public static create(handler: RouteHandler) { return new ListActionsRoute(handler); @@ -16,10 +8,7 @@ export class ListActionsRoute extends GetRoute { private constructor(handler: RouteHandler) { super('actions', handler, { - hooks: { - pre: ListActionsInput.fromRequest, - post: (output: ListActionsOutput) => output.toResponse(), - }, + io: new ListActionsRouteIO(), }); } } diff --git a/src/api/index.ts b/src/api/index.ts index ce894e8..70bae94 100644 --- a/src/api/index.ts +++ b/src/api/index.ts @@ -1,4 +1,6 @@ -export * from './api'; +export * from './api.command'; +export * from './api.dependencies'; export * from './api.types'; +export * from './api'; export * from './start-api'; export * as ApiEndpoints from './endpoints'; diff --git a/src/api/start-api.ts b/src/api/start-api.ts index 6d8e53c..504f052 100644 --- 
a/src/api/start-api.ts +++ b/src/api/start-api.ts @@ -1,12 +1,23 @@ -import { Route } from '@alien-worlds/api-core'; -import { Api } from './api'; -import { ApiConfig } from './api.types'; +import 'reflect-metadata'; -export const startApi = async (config: ApiConfig, routes: Route[] = []) => { - const api = new Api(config); +import { ConfigVars, Route } from '@alien-worlds/aw-core'; +import { apiCommand } from './api.command'; +import { ApiCommandOptions } from './api.types'; +import { buildApiConfig } from '../config'; +import { ApiDependencies } from './api.dependencies'; - routes.forEach(route => { - Route.mount(api, route); +export const startApi = async (dependencies: ApiDependencies, ...args: string[]) => { + const { api, ioc, databaseConfigBuilder, routesProvider, setupIoc } = dependencies; + const vars = new ConfigVars(); + const options = apiCommand.parse(args).opts(); + const config = buildApiConfig(vars, databaseConfigBuilder, options); + + await setupIoc(config, ioc); + + api.setup(config); + + routesProvider(ioc).forEach(route => { + Route.mount(api.framework, route); }); return api.start(); diff --git a/src/bootstrap/__tests__/bootstrap.utils.unit.test.ts b/src/bootstrap/__tests__/bootstrap.utils.unit.test.ts new file mode 100644 index 0000000..ed94e33 --- /dev/null +++ b/src/bootstrap/__tests__/bootstrap.utils.unit.test.ts @@ -0,0 +1,194 @@ +import { Result } from '@alien-worlds/aw-core'; +import { + createDefaultModeBlockRange, + createReplayModeBlockRange, + createTestModeBlockRange, +} from '../bootstrap.utils'; +import { Mode } from '../../common'; +import { + EndBlockOutOfRangeError, + StartBlockHigherThanEndBlockError, + UndefinedStartBlockError, +} from '../bootstrap.errors'; + +describe('createDefaultModeBlockRange', () => { + const originalLog = console.log; + beforeEach(() => { + jest.clearAllMocks(); + console.log = jest.fn(); // mock console.log to prevent log outputs during tests + }); + + afterEach(() => { + console.log = originalLog; 
// restore original console.log after tests + }); + + it('should create a block range in default mode with positive startBlock', async () => { + const blockState = { + getBlockNumber: jest.fn().mockResolvedValue({ content: 0n }), + } as any; + const blockchain = { + getLastIrreversibleBlockNumber: jest + .fn() + .mockResolvedValue(Result.withContent(100n)), + getHeadBlockNumber: jest.fn().mockResolvedValue(Result.withContent(100n)), + } as any; + const config = { + blockchain: {}, + startBlock: 5n, + endBlock: 5n, + mode: Mode.Default, + scanner: { scanKey: 'scanKey' }, + startFromHead: true, + maxBlockNumber: 10n, + } as any; + + const result = await createDefaultModeBlockRange(blockState, blockchain, config); + + expect(result).toEqual({ + startBlock: 5n, + endBlock: 5n, + mode: Mode.Default, + scanKey: 'scanKey', + }); + }); + + it('should throw error when highEdge < lowEdge', async () => { + const blockState = { + getBlockNumber: jest.fn().mockResolvedValue({ content: 0n }), + } as any; + const blockchain = { + getLastIrreversibleBlockNumber: jest + .fn() + .mockResolvedValue(Result.withContent(100n)), + getHeadBlockNumber: jest.fn().mockResolvedValue(Result.withContent(100n)), + } as any; + const config = { + blockchain: {}, + startBlock: 5n, + endBlock: 3n, + mode: Mode.Default, + scanner: { scanKey: 'scanKey' }, + startFromHead: true, + maxBlockNumber: 10n, + } as any; + + await expect( + createDefaultModeBlockRange(blockState, blockchain, config) + ).rejects.toThrow(StartBlockHigherThanEndBlockError); + }); +}); + +describe('createTestModeBlockRange', () => { + const originalLog = console.log; + const blockchain = { + getLastIrreversibleBlockNumber: jest.fn().mockResolvedValue(Result.withContent(100n)), + getHeadBlockNumber: jest.fn().mockResolvedValue(Result.withContent(100n)), + } as any; + beforeEach(() => { + jest.clearAllMocks(); + console.log = jest.fn(); + }); + + afterEach(() => { + console.log = originalLog; + }); + + it('should create a block 
range in test mode when startBlock is not a bigint', async () => { + const config = { + blockchain: {}, + startBlock: null, + mode: Mode.Test, + scanner: { scanKey: 'scanKey' }, + startFromHead: true, + } as any; + + const result = await createTestModeBlockRange(blockchain, config); + + expect(result).toEqual({ + startBlock: 99n, + endBlock: 100n, + mode: Mode.Test, + scanKey: 'scanKey', + }); + }); + + it('should create a block range in test mode when startBlock is a bigint', async () => { + const config = { + blockchain: {}, + startBlock: 50n, + mode: Mode.Test, + scanner: { scanKey: 'scanKey' }, + startFromHead: true, + } as any; + + const result = await createTestModeBlockRange(blockchain, config); + + expect(result).toEqual({ + startBlock: 50n, + endBlock: 51n, + mode: Mode.Test, + scanKey: 'scanKey', + }); + }); +}); + +describe('createReplayModeBlockRange', () => { + const originalLog = console.log; + const blockchain = { + getLastIrreversibleBlockNumber: jest.fn().mockResolvedValue(Result.withContent(100n)), + getHeadBlockNumber: jest.fn().mockResolvedValue(Result.withContent(100n)), + } as any; + beforeEach(() => { + jest.clearAllMocks(); + console.log = jest.fn(); // mock console.log to prevent log outputs during tests + }); + + afterEach(() => { + console.log = originalLog; // restore original console.log after tests + }); + + it('should throw an error when startBlock is not defined', async () => { + const scanner = { hasUnscannedBlocks: jest.fn().mockResolvedValue(false) } as any; + const config = { + blockchain: {}, + startBlock: null, + endBlock: null, + mode: Mode.Replay, + scanner: { scanKey: 'scanKey' }, + } as any; + + await expect(createReplayModeBlockRange(scanner, blockchain, config)).rejects.toThrow( + UndefinedStartBlockError + ); + }); + + it('should throw an error when endBlock > lastIrreversibleBlock', async () => { + const scanner = { hasUnscannedBlocks: jest.fn().mockResolvedValue(false) } as any; + const config = { + blockchain: {}, + 
startBlock: 50n, + endBlock: 101n, + mode: Mode.Replay, + scanner: { scanKey: 'scanKey' }, + } as any; + + await expect(createReplayModeBlockRange(scanner, blockchain, config)).rejects.toThrow( + EndBlockOutOfRangeError + ); + }); + + it('should throw an error when startBlock > endBlock', async () => { + const scanner = { hasUnscannedBlocks: jest.fn().mockResolvedValue(false) } as any; + const config = { + blockchain: {}, + startBlock: 50n, + endBlock: 49n, + mode: Mode.Replay, + scanner: { scanKey: 'scanKey' }, + } as any; + + await expect(createReplayModeBlockRange(scanner, blockchain, config)).rejects.toThrow( + StartBlockHigherThanEndBlockError + ); + }); +}); diff --git a/src/bootstrap/__tests__/start-bootstrap.unit.test.ts b/src/bootstrap/__tests__/start-bootstrap.unit.test.ts new file mode 100644 index 0000000..74d7634 --- /dev/null +++ b/src/bootstrap/__tests__/start-bootstrap.unit.test.ts @@ -0,0 +1,106 @@ +import { bootstrap } from '../start-bootstrap'; +import { NoAbisError } from '../bootstrap.errors'; +import { InternalBroadcastMessageName } from '../../broadcast/internal-broadcast.enums'; +import { ReaderBroadcastMessage } from '../../broadcast/messages'; +import { Mode } from '../../common'; +import { BroadcastTcpClient } from '@alien-worlds/aw-broadcast'; + +jest.mock('@alien-worlds/history-tools-common', () => ({ + Abis: { + create: jest + .fn() + .mockResolvedValue({ fetchAbis: jest.fn().mockResolvedValue([1, 2, 3]) }), + }, + BlockState: { create: jest.fn().mockResolvedValue({}) }, + ContractReader: { create: jest.fn().mockResolvedValue({ readContracts: jest.fn() }) }, + BlockRangeScanner: { create: jest.fn().mockResolvedValue({}) }, + BroadcastTcpClient: jest.fn().mockImplementation(() => { + return { + onMessage: jest.fn(), + sendMessage: jest.fn(), + connect: jest.fn(), + }; + }), + MongoSource: { create: jest.fn().mockResolvedValue({}) }, +})); + +jest.mock('../bootstrap.utils', () => ({ + createDefaultModeBlockRange: 
jest.fn().mockResolvedValue({}), + createReplayModeBlockRange: jest.fn().mockResolvedValue({}), + createTestModeBlockRange: jest.fn().mockResolvedValue({}), +})); + +const featuredCriteria = { + traces: [ + { + shipTraceMessageName: ['transaction_trace_v0'], + shipActionTraceMessageName: ['action_trace_v0', 'action_trace_v1'], + contract: ['uspts.worlds'], + action: ['addpoints'], + processor: 'USPTS_WORLDS_ACTION_PROCESSOR', + }, + { + shipTraceMessageName: ['transaction_trace_v0'], + shipActionTraceMessageName: ['action_trace_v0', 'action_trace_v1'], + contract: ['notify.world'], + action: ['logmine'], + processor: 'NOTIFY_WORLD_ACTION_PROCESSOR', + }, + ], + deltas: [ + { + shipDeltaMessageName: ['table_delta_v0'], + name: ['contract_row'], + code: ['msig.worlds'], + scope: ['*'], + table: ['*'], + processor: 'MSIG_WORLDS_DELTA_PROCESSOR', + }, + ], +} as any; + +const config = { + mode: Mode.Default, + broadcast: {}, + mongo: {}, + contractReader: {}, + abis: {}, + scanner: {}, + featured: featuredCriteria, +} as any; + +const dependencies = {} as any; + +describe('bootstrap', () => { + let mockBroadcast; + + beforeEach(() => { + mockBroadcast = new BroadcastTcpClient({}); + }); + + it.skip('handles DefaultModeReaderReady message correctly', async () => { + await bootstrap(config, dependencies, featuredCriteria); + const message = { name: InternalBroadcastMessageName.DefaultModeReaderReady }; + const messageHandler = mockBroadcast.onMessage.mock.calls[0][1]; + await messageHandler(message); + + // Replace the following expect lines with your actual testing assertions + expect(mockBroadcast.sendMessage).toHaveBeenCalledWith( + ReaderBroadcastMessage.newDefaultModeTask(expect.anything()) + ); + }); + + it.skip('handles TestModeReaderReady message correctly', async () => { + await bootstrap(config, dependencies, featuredCriteria); + + const message = { name: InternalBroadcastMessageName.DefaultModeReaderReady }; + + const messageHandler = 
mockBroadcast.onMessage.mock.calls[0][1]; + await messageHandler(message); + + // Replace the following expect lines with your actual testing assertions + expect(mockBroadcast.sendMessage).toHaveBeenCalledWith( + ReaderBroadcastMessage.newDefaultModeTask(expect.anything()) + ); + }); +}); diff --git a/src/bootstrap/bootstrap.command.ts b/src/bootstrap/bootstrap.command.ts new file mode 100644 index 0000000..13c2211 --- /dev/null +++ b/src/bootstrap/bootstrap.command.ts @@ -0,0 +1,10 @@ +import { Command } from 'commander'; + +export const bootstrapCommand = new Command(); + +bootstrapCommand + .version('1.0', '-v, --version') + .option('-k, --scan-key <scan-key>', 'Scan key') + .option('-s, --start-block <start-block>', 'Start at this block') + .option('-m, --mode <mode>', 'Mode (default/replay/test)') + .option('-e, --end-block <end-block>', 'End block (exclusive)'); diff --git a/src/bootstrap/bootstrap.config.ts b/src/bootstrap/bootstrap.config.ts new file mode 100644 index 0000000..58e60bb --- /dev/null +++ b/src/bootstrap/bootstrap.config.ts @@ -0,0 +1,24 @@ +import { UnknownObject } from '@alien-worlds/aw-core'; +import { AbisServiceConfig } from '../common'; +import { BlockRangeScanConfig } from '../common/block-range-scanner'; +import { FeaturedConfig } from '../common/featured'; +import { BroadcastConfig } from '@alien-worlds/aw-broadcast'; + +export type BlockchainConfig = { + endpoint: string; + chainId: string; +}; + +export type BootstrapConfig<DatabaseConfig = UnknownObject> = { + database: DatabaseConfig; + broadcast: BroadcastConfig; + scanner: BlockRangeScanConfig; + startBlock?: bigint; + endBlock?: bigint; + startFromHead?: boolean; + mode: string; + featured: FeaturedConfig; + abis: AbisServiceConfig; + blockchain: BlockchainConfig; + maxBlockNumber?: number; +}; diff --git a/src/bootstrap/bootstrap.dependencies.ts b/src/bootstrap/bootstrap.dependencies.ts new file mode 100644 index 0000000..9eba3d1 --- /dev/null +++ b/src/bootstrap/bootstrap.dependencies.ts @@ -0,0 +1,56 @@ +import { BlockchainService, Result } from 
'@alien-worlds/aw-core'; +import { Dependencies } from '../common/dependencies'; +import { FeaturedContracts, FeaturedContractDataCriteria } from '../common/featured'; +import { BroadcastClient } from '@alien-worlds/aw-broadcast'; +import { Abis, BlockRangeScanner, BlockState, DatabaseConfigBuilder } from '../common'; +import { BootstrapConfig } from './bootstrap.config'; + +/** + * An abstract class representing the Bootstrap dependencies. + * @export + * @abstract + * @class BootstrapDependencies + */ +export abstract class BootstrapDependencies extends Dependencies { + /** + * The broadcast client used for communication. + * @type {BroadcastClient} + */ + public broadcastClient: BroadcastClient; + /** + * The ABIs (Application Binary Interfaces) for contracts. + * @type {Abis} + */ + public abis: Abis; + /** + * The block range scanner for scanning blocks. + * @type {BlockRangeScanner} + */ + public scanner: BlockRangeScanner; + /** + * The featured contract service. + * @type {FeaturedContracts} + */ + public featuredContracts: FeaturedContracts; + /** + * The block state for maintaining blockchain state. + * @type {BlockState} + */ + public blockState: BlockState; + + /** + * The blockchain service for interacting with the blockchain. + * @type {BlockchainService} + */ + public blockchain: BlockchainService; + + /** + * @type {DatabaseConfigBuilder} + */ + public databaseConfigBuilder: DatabaseConfigBuilder; + + public abstract initialize( + config: BootstrapConfig, + featuredCriteria: FeaturedContractDataCriteria + ): Promise<Result>; +} diff --git a/src/bootstrap/bootstrap.errors.ts b/src/bootstrap/bootstrap.errors.ts index 61e7163..44bf200 100644 --- a/src/bootstrap/bootstrap.errors.ts +++ b/src/bootstrap/bootstrap.errors.ts @@ -1,10 +1,28 @@ +/** + * Represents an error when the start block is undefined. + * + * @class + * @extends {Error} + */ export class UndefinedStartBlockError extends Error { constructor() { super(`Undefined start block. 
`); } } +/** + * Represents an error when the end block is out of range. + * + * @class + * @extends {Error} + */ export class EndBlockOutOfRangeError extends Error { + /** + * Constructs a new EndBlockOutOfRangeError. + * + * @param {bigint} endBlock - The end block that is out of range + * @param {bigint} lib - The last irreversible block number + */ constructor(endBlock: bigint, lib: bigint) { super( `End block (${endBlock.toString()}) out of range. A value greater than last irreversible block number (${lib.toString()})` @@ -12,7 +30,18 @@ export class EndBlockOutOfRangeError extends Error { } } +/** + * Represents an error when the start block is higher than the end block. + * + * @class + * @extends {Error} + */ export class StartBlockHigherThanEndBlockError extends Error { + /** + * Constructs a new StartBlockHigherThanEndBlockError + * @param {bigint} startBlock - The start block that is higher than the end block + * @param {bigint} endBlock - The end block + */ constructor(startBlock: bigint, endBlock: bigint) { super( `Error in the given range (${startBlock.toString()}-${endBlock.toString()}), the startBlock cannot be greater than the endBlock` @@ -20,6 +49,12 @@ export class StartBlockHigherThanEndBlockError extends Error { } } +/** + * Represents an error when there are no ABIs stored in the database. 
+ * + * @class + * @extends {Error} + */ export class NoAbisError extends Error { constructor() { super(`There are no ABIs stored in the database`); diff --git a/src/bootstrap/bootstrap.types.ts b/src/bootstrap/bootstrap.types.ts index b930c58..acbcf54 100644 --- a/src/bootstrap/bootstrap.types.ts +++ b/src/bootstrap/bootstrap.types.ts @@ -1,24 +1,4 @@ -import { MongoConfig, BroadcastConfig } from '@alien-worlds/api-core'; -import { AbisServiceConfig } from '../common/abis'; -import { BlockRangeScanConfig } from '../reader/block-range-scanner'; -import { BlockchainConfig, ContractReaderConfig } from '../common/blockchain'; -import { FeaturedConfig } from '../common/featured'; -import { Mode } from '../common'; - -export type BootstrapConfig = { - broadcast: BroadcastConfig; - blockchain: BlockchainConfig; - contractReader: ContractReaderConfig; - scanner: BlockRangeScanConfig; - mongo: MongoConfig; - startBlock?: bigint; - endBlock?: bigint; - startFromHead?: boolean; - mode: string; - featured: FeaturedConfig; - abis: AbisServiceConfig; - maxBlockNumber?: number; -}; +import { Mode } from "../common"; export type BootstrapCommandOptions = { scanKey: string; diff --git a/src/bootstrap/bootstrap.utils.ts b/src/bootstrap/bootstrap.utils.ts index 3af9f4a..a590ddc 100644 --- a/src/bootstrap/bootstrap.utils.ts +++ b/src/bootstrap/bootstrap.utils.ts @@ -1,30 +1,38 @@ -import { log, parseToBigInt } from '@alien-worlds/api-core'; -import { BlockRangeScanner } from '../reader/block-range-scanner'; -import { BlockState } from '../common/block-state'; -import { Mode } from '../common/common.enums'; -import { UnknownModeError } from '../common/common.errors'; -import { BlockRangeData, BootstrapConfig } from './bootstrap.types'; +import { BlockchainService, log, parseToBigInt } from '@alien-worlds/aw-core'; +import { BlockRangeData } from './bootstrap.types'; import { StartBlockHigherThanEndBlockError, UndefinedStartBlockError, EndBlockOutOfRangeError, } from 
'./bootstrap.errors'; -import { Blockchain } from '../common'; +import { BlockRangeScanner, BlockState, Mode, UnknownModeError } from '../common'; +import { BootstrapConfig } from './bootstrap.config'; -export const createBlockRangeTaskInput = ( +/** + * Creates a block range task input based on the provided configuration and mode. + * + * @async + * @param {BlockState} blockState - The current block state. + * @param {BlockRangeScanner} scanner - The block range scanner. + * @param {BlockchainService} blockchain - The blockchain service. + * @param {BootstrapConfig} config - The bootstrap configuration. + * @returns {Promise} The block range task input. + */ +export const createBlockRangeTaskInput = async ( blockState: BlockState, scanner: BlockRangeScanner, + blockchain: BlockchainService, config: BootstrapConfig ) => { const { mode } = config; if (mode === Mode.Default) { - return createDefaultModeBlockRange(blockState, config); + return createDefaultModeBlockRange(blockState, blockchain, config); } else if (mode === Mode.Replay) { // - return createReplayModeBlockRange(scanner, config); + return createReplayModeBlockRange(scanner, blockchain, config); } else if (mode === Mode.Test) { // - return createTestModeBlockRange(config); + return createTestModeBlockRange(blockchain, config); } else { // throw new UnknownModeError(mode); @@ -32,12 +40,17 @@ export const createBlockRangeTaskInput = ( }; /** + * Creates a block range in default mode. * - * @param {Broadcast} broadcast - * @param {BootstrapConfig} config + * @async + * @param {BlockState} blockState - The current block state. + * @param {BlockchainService} blockchain - The blockchain service. + * @param {BootstrapConfig} config - The bootstrap configuration. + * @returns {Promise} The block range data. 
*/ export const createDefaultModeBlockRange = async ( blockState: BlockState, + blockchain: BlockchainService, config: BootstrapConfig ): Promise<BlockRangeData> => { const { @@ -48,9 +61,9 @@ export const createDefaultModeBlockRange = async ( startFromHead, maxBlockNumber, } = config; - const blockchain = await Blockchain.create(config.blockchain); - const lastIrreversibleBlock = await blockchain.getLastIrreversibleBlockNumber(); - const headBlock = await blockchain.getHeadBlockNumber(); + const { content: lastIrreversibleBlock } = + await blockchain.getLastIrreversibleBlockNumber(); + const { content: headBlock } = await blockchain.getHeadBlockNumber(); const { content: currentBlockNumber } = await blockState.getBlockNumber(); log(` Current head block number: ${headBlock.toString()}`); @@ -62,7 +75,9 @@ export const createDefaultModeBlockRange = async ( if (currentBlockNumber > 0n) { lowEdge = currentBlockNumber + 1n; - log(` Using the current state block number (+1) ${lowEdge.toString()} as a start block`); + log( + ` Using the current state block number (+1) ${lowEdge.toString()} as a start block` + ); } else { if (startBlock < 0n) { if (startFromHead) { @@ -106,11 +121,15 @@ }; /** + * Creates a block range in test mode. * - * @param {Broadcast} broadcast - * @param {BootstrapConfig} config + * @async + * @param {BlockchainService} blockchain - The blockchain service. + * @param {BootstrapConfig} config - The bootstrap configuration. + * @returns {Promise<BlockRangeData>} The block range data. 
*/ export const createTestModeBlockRange = async ( + blockchain: BlockchainService, config: BootstrapConfig ): Promise<BlockRangeData> => { const { @@ -120,9 +139,9 @@ export const createTestModeBlockRange = async ( startFromHead, } = config; - const blockchain = await Blockchain.create(config.blockchain); - const lastIrreversibleBlock = await blockchain.getLastIrreversibleBlockNumber(); - const headBlock = await blockchain.getHeadBlockNumber(); + const { content: lastIrreversibleBlock } = + await blockchain.getLastIrreversibleBlockNumber(); + const { content: headBlock } = await blockchain.getHeadBlockNumber(); let highEdge: bigint; let lowEdge: bigint; @@ -139,12 +158,17 @@ }; /** + * Creates a block range in replay mode. * - * @param {Broadcast} broadcast - * @param {BootstrapConfig} config + * @async + * @param {BlockRangeScanner} scanner - The block range scanner. + * @param {BlockchainService} blockchain - The blockchain service. + * @param {BootstrapConfig} config - The bootstrap configuration. + * @returns {Promise<BlockRangeData>} The block range data. 
*/ export const createReplayModeBlockRange = async ( scanner: BlockRangeScanner, + blockchain: BlockchainService, config: BootstrapConfig ): Promise<BlockRangeData> => { const { @@ -157,8 +181,8 @@ const lowEdge = startBlock; let highEdge = endBlock; - const blockchain = await Blockchain.create(config.blockchain); - const lastIrreversibleBlock = await blockchain.getLastIrreversibleBlockNumber(); + const { content: lastIrreversibleBlock } = + await blockchain.getLastIrreversibleBlockNumber(); if (typeof lowEdge !== 'bigint') { throw new UndefinedStartBlockError(); diff --git a/src/bootstrap/index.ts b/src/bootstrap/index.ts index 96c62e6..0b57388 100644 --- a/src/bootstrap/index.ts +++ b/src/bootstrap/index.ts @@ -1,4 +1,7 @@ -export * from './bootstrap.types'; +export * from './bootstrap.config'; +export * from './bootstrap.command'; +export * from './bootstrap.dependencies'; export * from './bootstrap.errors'; +export * from './bootstrap.types'; export * from './bootstrap.utils'; export * from './start-bootstrap'; diff --git a/src/bootstrap/start-bootstrap.ts b/src/bootstrap/start-bootstrap.ts index 24216f5..7292639 100644 --- a/src/bootstrap/start-bootstrap.ts +++ b/src/bootstrap/start-bootstrap.ts @@ -7,49 +7,73 @@ import { createReplayModeBlockRange, createTestModeBlockRange, } from './bootstrap.utils'; -import { Broadcast, log, MongoSource } from '@alien-worlds/api-core'; -import { BootstrapConfig } from './bootstrap.types'; +import { BootstrapCommandOptions } from './bootstrap.types'; import { NoAbisError } from './bootstrap.errors'; import { InternalBroadcastChannel, - InternalBroadcastClientName, InternalBroadcastMessageName, } from '../broadcast/internal-broadcast.enums'; -import { FeaturedContractContent } from '../common/featured'; -import { Mode } from '../common/common.enums'; -import { InternalBroadcastMessage } from '../broadcast'; -import { Abis, BlockRangeScanner, BlockState, ContractReader } from '../common'; 
+import { ConfigVars, log } from '@alien-worlds/aw-core'; +import { BroadcastMessage } from '@alien-worlds/aw-broadcast'; +import { buildBootstrapConfig } from '../config'; +import { bootstrapCommand } from './bootstrap.command'; +import { BootstrapConfig } from './bootstrap.config'; +import { BootstrapDependencies } from './bootstrap.dependencies'; +import { FeaturedUtils, MissingCriteriaError, Mode } from '../common'; /** + * The bootstrap function initiates the bootstrap process based on the configuration provided. + * Depending on the mode of operation (default, replay, or test), it prepares the necessary + * resources such as a broadcast client, ABIs, a block range scanner, featured contracts, + * block state, and a blockchain service. + * It also sets up a message handler for the bootstrap broadcast channel to handle various + * messages related to the reader readiness in different modes. * - * @param broadcastMessageMapper - * @param config - * @returns + * @param {BootstrapConfig} config The bootstrap configuration object. + * @param {BootstrapDependencies} dependencies The bootstrap process dependencies. + * @param {string} featuredCriteriaPath The path to the featured contract criteria. + * @throws {NoAbisError} When no ABIs are fetched. + * + * @returns {Promise<void>} The Promise that resolves when the bootstrap process has been initiated successfully. */ -export const startBootstrap = async (config: BootstrapConfig) => { +export const bootstrap = async ( + config: BootstrapConfig, + dependencies: BootstrapDependencies, + featuredCriteriaPath: string +): Promise<void> => { const { mode } = config; log(`Bootstrap "${mode}" mode ... 
[starting]`); - const broadcast = await Broadcast.createClient({ - ...config.broadcast, - clientName: InternalBroadcastClientName.Bootstrap, - }); - const mongo = await MongoSource.create(config.mongo); - const contractReader = await ContractReader.create(config.contractReader, mongo); - const abis = await Abis.create(mongo, config.abis, config.featured); - const scanner = await BlockRangeScanner.create(mongo, config.scanner); - const featured = new FeaturedContractContent(config.featured); - const blockState = await BlockState.create(mongo); + const featuredCriteria = await FeaturedUtils.fetchCriteria(featuredCriteriaPath); + + if (!featuredCriteria) { + throw new MissingCriteriaError(featuredCriteriaPath); + } + + const contractNames = FeaturedUtils.readFeaturedContracts(featuredCriteria); + + const initResult = await dependencies.initialize(config, featuredCriteria); + + if (initResult.isFailure) { + throw initResult.failure.error; + } + + const { abis, broadcastClient, blockState, blockchain, featuredContracts, scanner } = + dependencies; let blockRange: ReaderBroadcastMessageData; // fetch latest abis to make sure that the blockchain data will be correctly deserialized log(` * Fetch featured contracts details ... [starting]`); - await contractReader.readContracts(featured.listContracts()); + await featuredContracts.readContracts(contractNames); log(` * Fetch featured contracts details ... [done]`); // fetch latest abis to make sure that the blockchain data will be correctly deserialized log(` * Fetch abis ... [starting]`); - const abisCount = (await abis.fetchAbis()).length; + const { content: fetchedAbis, failure: fetchAbisFailure } = await abis.fetchAbis(); + + if (fetchAbisFailure) { + throw fetchAbisFailure.error; + } + + const abisCount = fetchedAbis.length; log(` * Fetch abis ... 
[done]`); if (abisCount === 0) { @@ -57,33 +81,54 @@ export const startBootstrap = async (config: BootstrapConfig) => { } if (config.mode === Mode.Replay) { - blockRange = await createReplayModeBlockRange(scanner, config); + blockRange = await createReplayModeBlockRange(scanner, blockchain, config); } - broadcast.onMessage( + broadcastClient.onMessage( InternalBroadcastChannel.Bootstrap, - async (message: InternalBroadcastMessage) => { - if (message.content.name === InternalBroadcastMessageName.DefaultModeReaderReady) { + async (message: BroadcastMessage) => { + if (message.name === InternalBroadcastMessageName.DefaultModeReaderReady) { if (config.mode === Mode.Default) { - blockRange = await createDefaultModeBlockRange(blockState, config); - broadcast.sendMessage(ReaderBroadcastMessage.newDefaultModeTask(blockRange)); + blockRange = await createDefaultModeBlockRange(blockState, blockchain, config); + broadcastClient.sendMessage( + ReaderBroadcastMessage.newDefaultModeTask(blockRange) + ); } if (config.mode === Mode.Test) { - blockRange = await createTestModeBlockRange(config); - broadcast.sendMessage(ReaderBroadcastMessage.newDefaultModeTask(blockRange)); + blockRange = await createTestModeBlockRange(blockchain, config); + broadcastClient.sendMessage( + ReaderBroadcastMessage.newDefaultModeTask(blockRange) + ); } - } else if ( - message.content.name === InternalBroadcastMessageName.ReplayModeReaderReady - ) { - broadcast.sendMessage(ReaderBroadcastMessage.newReplayModeTask(blockRange)); + } else if (message.name === InternalBroadcastMessageName.ReplayModeReaderReady) { + broadcastClient.sendMessage(ReaderBroadcastMessage.newReplayModeTask(blockRange)); } else { // } } ); - broadcast.connect(); + broadcastClient.connect(); log(`Bootstrap ${mode} mode ... [ready]`); }; + +/** + * startBootstrap is a function that takes command line options to build a bootstrap configuration + * and then initiate the bootstrap process. 
It throws an error if the bootstrap process fails. + * + * @param {string[]} args The command line args for bootstrap. + * @param {BootstrapDependencies} dependencies The bootstrap process dependencies. + * @param {string} featuredCriteriaPath + */ +export const startBootstrap = ( + args: string[], + dependencies: BootstrapDependencies, + featuredCriteriaPath: string +) => { + const vars = new ConfigVars(); + const options = bootstrapCommand.parse(args).opts(); + const config = buildBootstrapConfig(vars, dependencies.databaseConfigBuilder, options); + bootstrap(config, dependencies, featuredCriteriaPath).catch(log); +}; diff --git a/src/broadcast/__tests__/internal-broadcast.message.unit.test.ts b/src/broadcast/__tests__/internal-broadcast.message.unit.test.ts new file mode 100644 index 0000000..66b6056 --- /dev/null +++ b/src/broadcast/__tests__/internal-broadcast.message.unit.test.ts @@ -0,0 +1,23 @@ +import { BroadcastTcpMessageType } from '@alien-worlds/aw-broadcast'; +import { InternalBroadcastMessage } from '../internal-broadcast.message'; + +describe('InternalBroadcastMessage', () => { + describe('create', () => { + it('should create an internal broadcast message', () => { + const content = { + sender: null, + channel: null, + name: null, + data: null, + } as any; + const message = InternalBroadcastMessage.create(content); + + expect(message).toBeInstanceOf(InternalBroadcastMessage); + expect(message.sender).toBeNull(); + expect(message.channel).toBeNull(); + expect(message.type).toBe(BroadcastTcpMessageType.Data); + expect(message.name).toBeNull(); + expect(message.data).toBeNull(); + }); + }); +}); diff --git a/src/broadcast/index.ts b/src/broadcast/index.ts index 5bf2f7c..003c2d9 100644 --- a/src/broadcast/index.ts +++ b/src/broadcast/index.ts @@ -1,4 +1,4 @@ export * from './internal-broadcast.enums'; export * from './internal-broadcast.message'; export * from './messages'; - +export * from './start-broadcast'; diff --git 
a/src/broadcast/internal-broadcast.message.ts b/src/broadcast/internal-broadcast.message.ts index f40688b..06e813f 100644 --- a/src/broadcast/internal-broadcast.message.ts +++ b/src/broadcast/internal-broadcast.message.ts @@ -1,21 +1,35 @@ -import { BroadcastTcpMessage, BroadcastTcpMessageType } from '@alien-worlds/api-core'; -import { InternalBroadcastMessageName } from './internal-broadcast.enums'; +import { + BroadcastTcpMessage, + BroadcastTcpMessageContent, + BroadcastTcpMessageType, +} from '@alien-worlds/aw-broadcast'; +/** + * Represents an internal broadcast message. + * + * @template DataType - The type of data for the message. + */ export class InternalBroadcastMessage< DataType = unknown > extends BroadcastTcpMessage { + /** + * Creates an internal broadcast message. + * + * @param {BroadcastTcpMessageContent} content - The content for the message. + * @returns {InternalBroadcastMessage} The created internal broadcast message. + */ public static create( - sender: string, - channel: string, - name: InternalBroadcastMessageName, - data: DataType + content: BroadcastTcpMessageContent ) { - return new InternalBroadcastMessage({ + const { sender, channel, name, data } = content; + return new InternalBroadcastMessage( + null, sender, channel, + BroadcastTcpMessageType.Data, name, - type: BroadcastTcpMessageType.Data, - data, - }); + null, + data + ); } } diff --git a/src/broadcast/messages/__tests__/filter-broadcast.message.unit.test.ts b/src/broadcast/messages/__tests__/filter-broadcast.message.unit.test.ts new file mode 100644 index 0000000..2380135 --- /dev/null +++ b/src/broadcast/messages/__tests__/filter-broadcast.message.unit.test.ts @@ -0,0 +1,19 @@ +import { FilterBroadcastMessage } from '../filter-broadcast.message'; + +describe('FilterBroadcastMessage', () => { + describe('ready', () => { + it('should create a ready broadcast message', () => { + const message = FilterBroadcastMessage.ready(); + + expect(message).toBeDefined(); + }); + }); + + 
describe('refresh', () => { + it('should create a refresh broadcast message', () => { + const message = FilterBroadcastMessage.refresh(); + + expect(message).toBeDefined(); + }); + }); +}); diff --git a/src/broadcast/messages/__tests__/processor-broadcast.message.unit.test.ts b/src/broadcast/messages/__tests__/processor-broadcast.message.unit.test.ts new file mode 100644 index 0000000..f4128ea --- /dev/null +++ b/src/broadcast/messages/__tests__/processor-broadcast.message.unit.test.ts @@ -0,0 +1,28 @@ +import { ProcessorBroadcastMessage } from '../processor-broadcast.message'; +import { + InternalBroadcastChannel, + InternalBroadcastMessageName, +} from '../../internal-broadcast.enums'; +import { BroadcastMessage } from '@alien-worlds/aw-broadcast'; + +describe('ProcessorBroadcastMessage', () => { + describe('ready', () => { + it('should create a ready broadcast message', () => { + const message = ProcessorBroadcastMessage.ready(); + + expect(message).toBeInstanceOf(BroadcastMessage); + expect(message.channel).toBe(InternalBroadcastChannel.Processor); + expect(message.name).toBe(InternalBroadcastMessageName.ProcessorReady); + }); + }); + + describe('refresh', () => { + it('should create a refresh broadcast message', () => { + const message = ProcessorBroadcastMessage.refresh(); + + expect(message).toBeInstanceOf(BroadcastMessage); + expect(message.channel).toBe(InternalBroadcastChannel.Processor); + expect(message.name).toBe(InternalBroadcastMessageName.ProcessorRefresh); + }); + }); +}); diff --git a/src/broadcast/messages/__tests__/reader-broadcast.message.unit.test.ts b/src/broadcast/messages/__tests__/reader-broadcast.message.unit.test.ts new file mode 100644 index 0000000..91ad230 --- /dev/null +++ b/src/broadcast/messages/__tests__/reader-broadcast.message.unit.test.ts @@ -0,0 +1,60 @@ +import { + ReaderBroadcastMessage, + ReaderBroadcastMessageData, +} from '../reader-broadcast.message'; +import { BroadcastMessage } from '@alien-worlds/aw-broadcast'; 
+import { + InternalBroadcastChannel, + InternalBroadcastMessageName, +} from '../../internal-broadcast.enums'; +import { Mode } from '../../../common'; + +describe('ReaderBroadcastMessage', () => { + describe('newReplayModeTask', () => { + it('should create a new replay mode task broadcast message', () => { + const data: ReaderBroadcastMessageData = { + mode: '', + }; + const message = ReaderBroadcastMessage.newReplayModeTask(data); + + expect(message).toBeInstanceOf(BroadcastMessage); + expect(message.channel).toBe(InternalBroadcastChannel.ReplayModeReader); + expect(message.name).toBe(InternalBroadcastMessageName.ReaderTask); + expect(data.mode).toBe(Mode.Replay); + }); + }); + + describe('newDefaultModeTask', () => { + it('should create a new default mode task broadcast message', () => { + const data: ReaderBroadcastMessageData = { + mode: '', + }; + const message = ReaderBroadcastMessage.newDefaultModeTask(data); + + expect(message).toBeInstanceOf(BroadcastMessage); + expect(message.channel).toBe(InternalBroadcastChannel.DefaultModeReader); + expect(message.name).toBe(InternalBroadcastMessageName.ReaderTask); + expect(data.mode).toBe(Mode.Default); + }); + }); + + describe('defaultModeReady', () => { + it('should create a default mode ready broadcast message', () => { + const message = ReaderBroadcastMessage.defaultModeReady(); + + expect(message).toBeInstanceOf(BroadcastMessage); + expect(message.channel).toBe(InternalBroadcastChannel.Bootstrap); + expect(message.name).toBe(InternalBroadcastMessageName.DefaultModeReaderReady); + }); + }); + + describe('replayModeReady', () => { + it('should create a replay mode ready broadcast message', () => { + const message = ReaderBroadcastMessage.replayModeReady(); + + expect(message).toBeInstanceOf(BroadcastMessage); + expect(message.channel).toBe(InternalBroadcastChannel.Bootstrap); + expect(message.name).toBe(InternalBroadcastMessageName.ReplayModeReaderReady); + }); + }); +}); diff --git 
a/src/broadcast/messages/filter-broadcast.message.ts b/src/broadcast/messages/filter-broadcast.message.ts index 73e689d..f465238 100644 --- a/src/broadcast/messages/filter-broadcast.message.ts +++ b/src/broadcast/messages/filter-broadcast.message.ts @@ -1,26 +1,38 @@ -import { BroadcastTcpMessageType } from '@alien-worlds/api-core'; +import { BroadcastMessage } from '@alien-worlds/aw-broadcast'; import { InternalBroadcastChannel, InternalBroadcastMessageName, } from '../internal-broadcast.enums'; /** - * Message content + * Represents a class for filter broadcast messages. */ export class FilterBroadcastMessage { + /** + * Creates a ready broadcast message. + * + * @returns {BroadcastMessage} The ready broadcast message. + */ public static ready() { - return { - channel: InternalBroadcastChannel.Bootstrap, - name: InternalBroadcastMessageName.FilterReady, - type: BroadcastTcpMessageType.Data, - }; + return BroadcastMessage.create( + null, + InternalBroadcastChannel.Bootstrap, + null, + InternalBroadcastMessageName.FilterReady + ); } + /** + * Creates a refresh broadcast message. + * + * @returns {BroadcastMessage} The refresh broadcast message. 
+ */ public static refresh() { - return { - channel: InternalBroadcastChannel.Filter, - name: InternalBroadcastMessageName.FilterRefresh, - type: BroadcastTcpMessageType.Data, - }; + return BroadcastMessage.create( + null, + InternalBroadcastChannel.Filter, + null, + InternalBroadcastMessageName.FilterRefresh + ); } } diff --git a/src/broadcast/messages/processor-broadcast.message.ts b/src/broadcast/messages/processor-broadcast.message.ts index 5c25a59..6b83765 100644 --- a/src/broadcast/messages/processor-broadcast.message.ts +++ b/src/broadcast/messages/processor-broadcast.message.ts @@ -1,26 +1,37 @@ -import { BroadcastTcpMessageType } from '@alien-worlds/api-core'; +import { BroadcastMessage } from '@alien-worlds/aw-broadcast'; import { InternalBroadcastChannel, InternalBroadcastMessageName, } from '../internal-broadcast.enums'; /** - * Message content + * Represents a class for processor broadcast messages. */ export class ProcessorBroadcastMessage { - public static ready() { - return { - channel: InternalBroadcastChannel.Processor, - name: InternalBroadcastMessageName.ProcessorReady, - type: BroadcastTcpMessageType.Data, - }; + /** + * Creates a ready broadcast message. + * + * @returns {BroadcastMessage} The ready broadcast message. + */ + public static ready(): BroadcastMessage { + return BroadcastMessage.create( + null, + InternalBroadcastChannel.Processor, + null, + InternalBroadcastMessageName.ProcessorReady + ); } - - public static refresh() { - return { - channel: InternalBroadcastChannel.Processor, - name: InternalBroadcastMessageName.ProcessorRefresh, - type: BroadcastTcpMessageType.Data, - }; + /** + * Creates a refresh broadcast message. + * + * @returns {BroadcastMessage} The refresh broadcast message. 
+ */ + public static refresh(): BroadcastMessage { + return BroadcastMessage.create( + null, + InternalBroadcastChannel.Processor, + null, + InternalBroadcastMessageName.ProcessorRefresh + ); } } diff --git a/src/broadcast/messages/reader-broadcast.message.ts b/src/broadcast/messages/reader-broadcast.message.ts index 58a83ac..a1d7cbf 100644 --- a/src/broadcast/messages/reader-broadcast.message.ts +++ b/src/broadcast/messages/reader-broadcast.message.ts @@ -1,10 +1,13 @@ -import { BroadcastTcpMessageType } from '@alien-worlds/api-core'; -import { Mode } from '../../common/common.enums'; +import { BroadcastMessage } from '@alien-worlds/aw-broadcast'; import { InternalBroadcastChannel, InternalBroadcastMessageName, } from '../internal-broadcast.enums'; +import { Mode } from '../../common'; +/** + * Data structure for the reader broadcast message. + */ export type ReaderBroadcastMessageData = { startBlock?: bigint; endBlock?: bigint; @@ -13,42 +16,63 @@ export type ReaderBroadcastMessageData = { }; /** - * Message content + * Represents a class for reader broadcast messages. */ export class ReaderBroadcastMessage { - public static newReplayModeTask(data: ReaderBroadcastMessageData) { + /** + * Creates a new replay mode task broadcast message. + * + * @param {ReaderBroadcastMessageData} data - The data for the message. + * @returns {BroadcastMessage} The new replay mode task broadcast message. + */ + public static newReplayModeTask(data: ReaderBroadcastMessageData): BroadcastMessage { data.mode = Mode.Replay; - return { - channel: InternalBroadcastChannel.ReplayModeReader, - name: InternalBroadcastMessageName.ReaderTask, - type: BroadcastTcpMessageType.Data, + return BroadcastMessage.create( + null, + InternalBroadcastChannel.ReplayModeReader, data, - }; + InternalBroadcastMessageName.ReaderTask + ); } - - public static newDefaultModeTask(data: ReaderBroadcastMessageData) { + /** + * Creates a new default mode task broadcast message. 
+ * + * @param {ReaderBroadcastMessageData} data - The data for the message. + * @returns {BroadcastMessage} The new default mode task broadcast message. + */ + public static newDefaultModeTask(data: ReaderBroadcastMessageData): BroadcastMessage { data.mode = Mode.Default; - return { - channel: InternalBroadcastChannel.DefaultModeReader, - name: InternalBroadcastMessageName.ReaderTask, - type: BroadcastTcpMessageType.Data, + return BroadcastMessage.create( + null, + InternalBroadcastChannel.DefaultModeReader, data, - }; + InternalBroadcastMessageName.ReaderTask + ); } - - public static defaultModeReady() { - return { - channel: InternalBroadcastChannel.Bootstrap, - name: InternalBroadcastMessageName.DefaultModeReaderReady, - type: BroadcastTcpMessageType.Data, - }; + /** + * Creates a default mode ready broadcast message. + * + * @returns {BroadcastMessage} The default mode ready broadcast message. + */ + public static defaultModeReady(): BroadcastMessage { + return BroadcastMessage.create( + null, + InternalBroadcastChannel.Bootstrap, + null, + InternalBroadcastMessageName.DefaultModeReaderReady + ); } - - public static replayModeReady() { - return { - channel: InternalBroadcastChannel.Bootstrap, - name: InternalBroadcastMessageName.ReplayModeReaderReady, - type: BroadcastTcpMessageType.Data, - }; + /** + * Creates a replay mode ready broadcast message. + * + * @returns {BroadcastMessage} The replay mode ready broadcast message. 
+ */ + public static replayModeReady(): BroadcastMessage { + return BroadcastMessage.create( + null, + InternalBroadcastChannel.Bootstrap, + null, + InternalBroadcastMessageName.ReplayModeReaderReady + ); } } diff --git a/src/broadcast/start-broadcast.ts b/src/broadcast/start-broadcast.ts new file mode 100644 index 0000000..bc14f1e --- /dev/null +++ b/src/broadcast/start-broadcast.ts @@ -0,0 +1,10 @@ +import { ConfigVars } from '@alien-worlds/aw-core'; +import { BroadcastTcpServer, buildBroadcastConfig } from '@alien-worlds/aw-broadcast'; + +export const startBroadcast = async () => { + const vars = new ConfigVars(); + const config = buildBroadcastConfig(vars); + const server = new BroadcastTcpServer(config); + + await server.start(); +}; diff --git a/src/common/__tests__/common.utils.unit.test.ts b/src/common/__tests__/common.utils.unit.test.ts new file mode 100644 index 0000000..8818893 --- /dev/null +++ b/src/common/__tests__/common.utils.unit.test.ts @@ -0,0 +1,23 @@ +import { isSetAbiAction } from '../common.utils'; + +describe('isSetAbiAction', () => { + it('should return true for eosio setabi action', () => { + const result = isSetAbiAction('eosio', 'setabi'); + expect(result).toBe(true); + }); + + it('should return false for other contract or action', () => { + const result = isSetAbiAction('eosio', 'otherAction'); + expect(result).toBe(false); + }); + + it('should return false for other contract and setabi action', () => { + const result = isSetAbiAction('otherContract', 'setabi'); + expect(result).toBe(false); + }); + + it('should return false for other contract and action', () => { + const result = isSetAbiAction('otherContract', 'otherAction'); + expect(result).toBe(false); + }); +}); diff --git a/src/common/abis/__tests__/abi.repository-impl.unit.test.ts b/src/common/abis/__tests__/abi.repository-impl.unit.test.ts new file mode 100644 index 0000000..04be6d2 --- /dev/null +++ b/src/common/abis/__tests__/abi.repository-impl.unit.test.ts @@ -0,0 +1,108 @@ 
+import { AbisRepositoryImpl } from '../abis.repository-impl'; +import { Result, CountParams, ContractEncodedAbi } from '@alien-worlds/aw-core'; +import { AbisCache } from '../abis.cache'; + +jest.mock('../abis.cache'); + +const mockDataSource = { + find: jest.fn(), + count: jest.fn(), + aggregate: jest.fn(), + update: jest.fn(), + insert: jest.fn(), + remove: jest.fn(), + startTransaction: jest.fn(), + commitTransaction: jest.fn(), + rollbackTransaction: jest.fn(), +} as any; + +const mockMapper = { + toEntity: jest.fn(), + fromEntity: jest.fn(), + getEntityKeyMapping: jest.fn(), +} as any; + +const mockQueryBuilders = { + buildFindQuery: jest.fn(), + buildCountQuery: jest.fn(), + buildUpdateQuery: jest.fn(), + buildRemoveQuery: jest.fn(), + buildAggregationQuery: jest.fn(), +} as any; + +describe('AbisRepositoryImpl', () => { + let repository: AbisRepositoryImpl; + let mockCache: jest.Mocked<AbisCache>; + + beforeEach(() => { + mockCache = new AbisCache() as jest.Mocked<AbisCache>; + repository = new AbisRepositoryImpl(mockDataSource, mockMapper, mockQueryBuilders); + }); + + afterEach(() => { + jest.resetAllMocks(); + }); + + describe('getAbis', () => { + it('should return abis from cache if present', async () => { + const mockAbis: ContractEncodedAbi[] = [ + /* mock data here */ + ]; + mockCache.getAbis.mockReturnValue(mockAbis); + + const result = await repository.getAbis({ contracts: ['contract1'] }); + + expect(result).toEqual(Result.withContent(mockAbis)); + expect(mockCache.getAbis).toHaveBeenCalledWith({ contracts: ['contract1'] }); + }); + }); + + describe('getAbi', () => { + it('should return abi from cache if present', async () => { + const mockAbi: ContractEncodedAbi = {} as any; + const blockNumber = BigInt(10); + const contract = 'contract1'; + + mockCache.getAbi.mockReturnValue(mockAbi); + + const result = await repository.getAbi(blockNumber, contract); + + expect(result).toEqual(Result.withContent(mockAbi)); + expect(mockCache.getAbi).toHaveBeenCalledWith(blockNumber, 
contract); + }); + }); + + describe('insertAbis', () => { + it('should insert abis into the cache and the database', async () => { + const mockAbis: ContractEncodedAbi[] = [ + /* mock data here */ + ]; + const addSpy = jest.spyOn(repository, 'add'); + addSpy.mockResolvedValue(Result.withContent(mockAbis)); + + const result = await repository.insertAbis(mockAbis); + + expect(result).toEqual(Result.withContent(mockAbis.length > 0)); + expect(mockCache.insertAbis).toHaveBeenCalledWith(mockAbis); + expect(addSpy).toHaveBeenCalledWith(mockAbis); + }); + }); + + describe('countAbis', () => { + it('should count abis based on the startBlock and endBlock', async () => { + const countSpy = jest.spyOn(repository, 'count'); + const mockCount = 5; + countSpy.mockResolvedValue(Result.withContent(mockCount)); + + const startBlock = BigInt(10); + const endBlock = BigInt(20); + + const result = await repository.countAbis(startBlock, endBlock); + + expect(result).toEqual(Result.withContent(mockCount)); + expect(countSpy).toHaveBeenCalledWith( + CountParams.create({ where: expect.anything() }) + ); + }); + }); +}); diff --git a/src/common/abis/__tests__/abi.repository.unit.test.ts b/src/common/abis/__tests__/abi.repository.unit.test.ts deleted file mode 100644 index 07cf619..0000000 --- a/src/common/abis/__tests__/abi.repository.unit.test.ts +++ /dev/null @@ -1,18 +0,0 @@ -/* eslint-disable @typescript-eslint/no-empty-function */ -/* eslint-disable @typescript-eslint/no-explicit-any */ - -import { Long } from 'mongodb'; -import { ContractEncodedAbi } from '../contract-encoded-abi'; - -const document = { - block_number: Long.fromBigInt(100n), - contract: 'foo', - hex: 'foo_hex', -}; - -describe('Abi Repository Unit tests', () => { - it('"create" should create Abi instance', async () => { - const abi = ContractEncodedAbi.create(100, 'foo', 'foo_hex'); - expect(abi).toBeInstanceOf(ContractEncodedAbi); - }); -}); diff --git a/src/common/abis/__tests__/abi.unit.test.ts 
b/src/common/abis/__tests__/abi.unit.test.ts deleted file mode 100644 index 20f440e..0000000 --- a/src/common/abis/__tests__/abi.unit.test.ts +++ /dev/null @@ -1,48 +0,0 @@ -/* eslint-disable @typescript-eslint/no-empty-function */ -/* eslint-disable @typescript-eslint/no-explicit-any */ - -import { Long } from 'mongodb'; -import { ContractEncodedAbi } from '../contract-encoded-abi'; - -const document = { - block_number: Long.fromBigInt(100n), - contract: 'foo', - hex: 'foo_hex', -}; - -describe('Abi Unit tests', () => { - it('"create" should create Abi instance', async () => { - const abi = ContractEncodedAbi.create(100, 'foo', 'foo_hex'); - expect(abi).toBeInstanceOf(ContractEncodedAbi); - }); - - it('"fromDocument" should create Abi instance based on docuemnt data', async () => { - const abi = ContractEncodedAbi.fromDocument({ - block_number: Long.fromBigInt(100n), - contract: 'foo', - hex: 'foo_hex', - }); - expect(abi).toBeInstanceOf(ContractEncodedAbi); - expect(abi.blockNumber).toEqual(100n); - expect(abi.contract).toEqual('foo'); - expect(abi.hex).toEqual('foo_hex'); - }); - - it('"toJson" should return Abi JSON object', async () => { - const abi = ContractEncodedAbi.create(100, 'foo', 'foo_hex'); - expect(abi.toJson()).toEqual({ - blockNumber: 100n, - contract: 'foo', - hex: 'foo_hex', - }); - }); - - it('"toDocument" should return mongo document based on Abi data', async () => { - const abi = ContractEncodedAbi.create(100, 'foo', 'foo_hex'); - expect(abi.toDocument()).toEqual({ - block_number: Long.fromBigInt(100n), - contract: 'foo', - hex: 'foo_hex', - }); - }); -}); diff --git a/src/common/abis/__tests__/abis.cache.unit.test.ts b/src/common/abis/__tests__/abis.cache.unit.test.ts new file mode 100644 index 0000000..56e0293 --- /dev/null +++ b/src/common/abis/__tests__/abis.cache.unit.test.ts @@ -0,0 +1,104 @@ +import { AbisCache } from '../abis.cache'; + +describe('AbisCache', () => { + let abisCache: AbisCache; + + beforeEach(() => { + abisCache = new 
AbisCache(); + }); + + describe('getAbis', () => { + it('should return an empty array when cache is empty', () => { + const result = abisCache.getAbis({ contracts: ['eosio'] }); + expect(result).toEqual([]); + }); + + it('should return an empty array when contracts array is empty', () => { + abisCache.insertAbis([ + { contract: 'eosio', blockNumber: 1n } as any, + { contract: 'eosio', blockNumber: 2n } as any, + { contract: 'eosio', blockNumber: 3n } as any, + ]); + + const result = abisCache.getAbis({ contracts: [] }); + expect(result).toEqual([]); + }); + + it('should return all ABIs when no startBlock or endBlock is provided', () => { + abisCache.insertAbis([ + { contract: 'eosio', blockNumber: 1n } as any, + { contract: 'eosio', blockNumber: 2n } as any, + { contract: 'eosio', blockNumber: 3n } as any, + ]); + + const result = abisCache.getAbis({ contracts: ['eosio'] }); + expect(result).toEqual([ + { contract: 'eosio', blockNumber: 1n } as any, + { contract: 'eosio', blockNumber: 2n } as any, + { contract: 'eosio', blockNumber: 3n } as any, + ]); + }); + + it('should return matching ABIs within the specified range', () => { + abisCache.insertAbis([ + { contract: 'eosio', blockNumber: 1n } as any, + { contract: 'eosio', blockNumber: 2n } as any, + { contract: 'eosio', blockNumber: 3n } as any, + ]); + + const result = abisCache.getAbis({ + contracts: ['eosio'], + startBlock: 2n, + endBlock: 3n, + }); + expect(result).toEqual([ + { contract: 'eosio', blockNumber: 2n } as any, + { contract: 'eosio', blockNumber: 3n } as any, + ]); + }); + }); + + describe('getAbi', () => { + it('should return null when cache is empty', () => { + const result = abisCache.getAbi(1n, 'eosio'); + expect(result).toBeNull(); + }); + + it('should return the latest ABI that matches the block number', () => { + abisCache.insertAbis([ + { contract: 'eosio', blockNumber: 1n } as any, + { contract: 'eosio', blockNumber: 2n } as any, + { contract: 'eosio', blockNumber: 3n } as any, + ]); + + 
const result = abisCache.getAbi(2n, 'eosio'); + expect(result).toEqual({ contract: 'eosio', blockNumber: 2n } as any); + }); + + it('should return null when no matching ABI is found', () => { + abisCache.insertAbis([ + { contract: 'eosio', blockNumber: 1n } as any, + { contract: 'eosio', blockNumber: 2n } as any, + { contract: 'eosio', blockNumber: 3n } as any, + ]); + + const result = abisCache.getAbi(4n, 'undefined'); + expect(result).toBeNull(); + }); + }); + + describe('insertAbis', () => { + it('should insert ABIs into the cache', () => { + const abis = [ + { contract: 'eosio', blockNumber: 1n } as any, + { contract: 'eosio', blockNumber: 2n } as any, + { contract: 'eosio', blockNumber: 3n } as any, + ]; + + abisCache.insertAbis(abis); + + const result = abisCache.getAbis({ contracts: ['eosio'] }); + expect(result).toEqual(abis); + }); + }); +}); diff --git a/src/common/abis/__tests__/abis.unit.test.ts b/src/common/abis/__tests__/abis.unit.test.ts new file mode 100644 index 0000000..90082e0 --- /dev/null +++ b/src/common/abis/__tests__/abis.unit.test.ts @@ -0,0 +1,161 @@ +import { Abis } from '../abis'; +import { AbisRepositoryImpl } from '../abis.repository-impl'; +import { Failure, Result, AbiService } from '@alien-worlds/aw-core'; +import { AbiNotFoundError, AbisServiceNotSetError } from '../abis.errors'; + +// Mock dependencies +jest.mock('../abis.repository-impl'); +jest.mock('../abis.service'); + +const mockAbisRepository = { + cacheAbis: jest.fn(), + getAbis: jest.fn(), + getAbi: jest.fn(), + insertAbis: jest.fn(), + countAbis: jest.fn(), +} as any; +const mockAbisService = { fetchAbis: jest.fn() } as any; + +describe('Abis', () => { + let abis: Abis; + + beforeEach(() => { + // Reset mocks and create a new instance of Abis + jest.resetAllMocks(); + abis = new Abis(new mockAbisRepository(), new mockAbisService()); + }); + + describe('getAbis', () => { + it('should return ABIs from repository when available', async () => { + const mockAbis = [ + /* 
Mocked ABIs */ + ]; + mockAbisRepository.prototype.getAbis.mockResolvedValue( + Result.withContent(mockAbis) + ); + + const result = await abis.getAbis(); + + expect(mockAbisRepository.prototype.getAbis).toHaveBeenCalledTimes(1); + expect(result.isFailure).toBe(false); + expect(result.content).toEqual(mockAbis); + }); + + it('should fetch ABIs when none are available in the repository and fetch option is enabled', async () => { + const mockAbis = [ + /* Mocked ABIs */ + ]; + mockAbisRepository.prototype.getAbis.mockResolvedValue(Result.withContent([])); + abis.fetchAbis = jest.fn().mockResolvedValue(Result.withContent(mockAbis)); + + const result = await abis.getAbis({ fetch: true }); + + expect(mockAbisRepository.prototype.getAbis).toHaveBeenCalledTimes(1); + expect(abis.fetchAbis).toHaveBeenCalledTimes(1); + expect(result.isFailure).toBe(false); + expect(result.content).toEqual(mockAbis); + }); + + // Add more test cases for different scenarios + }); + + describe('getAbi', () => { + it('should return ABI from repository when available', async () => { + const mockAbi = '1234567890'; + mockAbisRepository.prototype.getAbi.mockResolvedValue(Result.withContent(mockAbi)); + + const result = await abis.getAbi(123n, '0x123'); + + expect(mockAbisRepository.prototype.getAbi).toHaveBeenCalledTimes(1); + expect(result.isFailure).toBe(false); + expect(result.content).toEqual(mockAbi); + }); + + it('should fetch ABI when not available in the repository and fetch option is enabled', async () => { + const mockAbi = '1234567890'; + mockAbisRepository.prototype.getAbi.mockResolvedValue( + Result.withFailure(Failure.fromError(new AbiNotFoundError())) + ); + abis.fetchAbis = jest.fn().mockResolvedValue(Result.withContent([mockAbi])); + + const result = await abis.getAbi(123n, '0x123', true); + + expect(mockAbisRepository.prototype.getAbi).toHaveBeenCalledTimes(1); + expect(abis.fetchAbis).toHaveBeenCalledTimes(1); + expect(result.isFailure).toBe(false); + 
expect(result.content).toEqual(mockAbi); + }); + + // Add more test cases for different scenarios + }); + + describe('storeAbi', () => { + it('should insert the ABI into the repository', async () => { + const blockNumber = 123n; + const contract = '0x123'; + const hex = '0xabcdef'; + + mockAbisRepository.prototype.insertAbis.mockResolvedValue(Result.withContent(true)); + + const result = await abis.storeAbi(blockNumber, contract, hex); + + expect(mockAbisRepository.prototype.insertAbis).toHaveBeenCalledTimes(1); + expect(mockAbisRepository.prototype.insertAbis).toHaveBeenCalledWith([ + expect.objectContaining({ blockNumber, contract, hex }), + ]); + expect(result.isFailure).toBe(false); + expect(result.content).toBe(true); + }); + + // Add more test cases for different scenarios + }); + + describe('fetchAbis', () => { + it('should throw AbisServiceNotSetError when service is not set', async () => { + abis = new Abis(new mockAbisRepository()); // Create instance without AbisService + + await expect(abis.fetchAbis()).rejects.toThrow(AbisServiceNotSetError); + }); + + it('should fetch ABIs using the service', async () => { + const mockAbis = [ + /* Mocked ABIs */ + ]; + const mockContracts = ['0x123', '0x456']; + const mockServiceResponse = Result.withContent(mockAbis); + + abis = new Abis(new mockAbisRepository(), new mockAbisService()); + mockAbisService.prototype.fetchAbis.mockResolvedValue(mockServiceResponse); + + const result = await abis.fetchAbis(mockContracts); + + expect(mockAbisService.prototype.fetchAbis).toHaveBeenCalledTimes( + mockContracts.length + ); + expect(mockAbisService.prototype.fetchAbis).toHaveBeenCalledWith( + expect.any(String) + ); + expect(result.isFailure).toBe(false); + expect(result.content).toEqual(mockAbis); + }); + + // Add more test cases for different scenarios + }); + + describe('cacheAbis', () => { + it('should cache ABIs in the repository', async () => { + const mockContracts = ['0x123', '0x456']; + + 
mockAbisRepository.prototype.cacheAbis.mockResolvedValue(); + + const result = await abis.cacheAbis(mockContracts); + + expect(mockAbisRepository.prototype.cacheAbis).toHaveBeenCalledTimes(1); + expect(mockAbisRepository.prototype.cacheAbis).toHaveBeenCalledWith(mockContracts); + expect(result.isFailure).toBe(false); + expect(result.content).toBeUndefined(); + }); + + // Add more test cases for different scenarios + }); +}); diff --git a/src/common/abis/abis.cache.ts b/src/common/abis/abis.cache.ts index d31bdb8..5e7bb4e 100644 --- a/src/common/abis/abis.cache.ts +++ b/src/common/abis/abis.cache.ts @@ -1,21 +1,50 @@ /* eslint-disable @typescript-eslint/no-unused-vars */ -import { ContractEncodedAbi } from './contract-encoded-abi'; +import { ContractEncodedAbi } from '@alien-worlds/aw-core'; +/** + * A function that filters ABI entries based on the block number being greater than or equal to the start block. + * @param startBlock - The start block number. + * @param endBlock - The end block number. + * @returns A filter function. + */ const filterFromStartBlock = (startBlock: bigint, endBlock: bigint) => (abi: ContractEncodedAbi) => abi.blockNumber >= startBlock; +/** + * A function that filters ABI entries based on the block number being less than or equal to the end block. + * @param startBlock - The start block number. + * @param endBlock - The end block number. + * @returns A filter function. + */ const filterTillEndBlock = (startBlock: bigint, endBlock: bigint) => (abi: ContractEncodedAbi) => abi.blockNumber <= endBlock; +/** + * A function that filters ABI entries based on the block number being within the specified range. + * @param startBlock - The start block number. + * @param endBlock - The end block number. + * @returns A filter function. 
+ */ const filterInRange = (startBlock: bigint, endBlock: bigint) => (abi: ContractEncodedAbi) => abi.blockNumber >= startBlock && abi.blockNumber <= endBlock; +/** + * Class representing the cache for storing and retrieving contract ABIs. + */ export class AbisCache { private cache: Map<string, Set<ContractEncodedAbi>> = new Map(); + /** + * Retrieves contract ABIs from the cache based on the specified options. + * @param options - Options for retrieving the contract ABIs. + * @param options.startBlock - The start block number to filter the ABIs. + * @param options.endBlock - The end block number to filter the ABIs. + * @param options.contracts - An array of contract addresses to filter the ABIs. + * @returns An array of matching contract ABIs. + */ public getAbis(options: { startBlock?: bigint; endBlock?: bigint; @@ -63,6 +92,12 @@ export class AbisCache { return abis; } + /** + * Retrieves the ABI for the specified block number and contract address. + * @param blockNumber - The block number to find the ABI. + * @param contract - The contract address. + * @returns The matching ABI or null if not found. + */ public getAbi(blockNumber: bigint, contract: string): ContractEncodedAbi { if (this.cache.has(contract)) { const abis = this.cache.get(contract); @@ -80,6 +115,10 @@ export class AbisCache { return null; } + /** + * Inserts an array of ABIs into the cache. + * @param abis - An array of ABIs to insert into the cache. 
+ */ public insertAbis(abis: ContractEncodedAbi[]): void { abis.forEach(abi => { let set = this.cache.get(abi.contract); diff --git a/src/common/abis/abis.errors.ts b/src/common/abis/abis.errors.ts index 27ec8bc..70da4cc 100644 --- a/src/common/abis/abis.errors.ts +++ b/src/common/abis/abis.errors.ts @@ -5,3 +5,9 @@ export class AbisServiceNotSetError extends Error { ); } } + +export class AbiNotFoundError extends Error { + constructor() { + super(`ABI data not found`); + } +} diff --git a/src/common/abis/abis.repository-impl.ts b/src/common/abis/abis.repository-impl.ts new file mode 100644 index 0000000..b5bb511 --- /dev/null +++ b/src/common/abis/abis.repository-impl.ts @@ -0,0 +1,147 @@ +import { + ContractEncodedAbi, + CountParams, + DataSourceError, + FindParams, + log, + RepositoryImpl, + Result, + Where, +} from '@alien-worlds/aw-core'; +import { AbisCache } from './abis.cache'; +import { AbisRepository } from './abis.repository'; + +/** + * Implements the AbisRepository with caching functionality. + * This class manages ContractEncodedAbi entities, providing CRUD operations and additional functionalities such as caching. + * It extends the base RepositoryImpl and implements the AbisRepository interface. + */ +export class AbisRepositoryImpl + extends RepositoryImpl + implements AbisRepository +{ + /** + * Cache instance for caching the ABI objects. + * @private + */ + private cache: AbisCache = new AbisCache(); + + /** + * Cache the ABIs. + * @param {string[]} [contracts] - List of contracts. + */ + public async cacheAbis(contracts?: string[]) { + const abis = await this.getAbis({ contracts }); + if (Array.isArray(abis)) { + this.cache.insertAbis(abis); + } + } + + /** + * Retrieve the ABIs. + * @param options - Filter options for retrieving ABIs. + * @returns Promise that resolves with the Result of an array of ContractEncodedAbi. 
+ */ + public async getAbis(options: { + startBlock?: bigint; + endBlock?: bigint; + contracts?: string[]; + }): Promise<Result<ContractEncodedAbi[]>> { + try { + const { startBlock, endBlock, contracts } = options || {}; + + const cachedAbis = this.cache.getAbis(options); + + if (cachedAbis.length > 0) { + return Result.withContent(cachedAbis); + } + const where = Where.bind(); + + if (startBlock && endBlock) { + where.props().blockNumber.isGte(startBlock).isLte(endBlock); + } + + if (contracts) { + where.props().contract.isIn(contracts); + } + + return this.find(FindParams.create({ where })); + } catch (error) { + log(error); + return Result.withContent([]); + } + } + + /** + * Retrieve a specific ABI. + * @param blockNumber - The block number. + * @param contract - The contract name. + * @returns Promise that resolves with the Result of ContractEncodedAbi. + */ + public async getAbi( + blockNumber: bigint, + contract: string + ): Promise<Result<ContractEncodedAbi>> { + const cachedAbi = this.cache.getAbi(blockNumber, contract); + + if (cachedAbi) { + return Result.withContent(cachedAbi); + } + + const where = Where.bind(); + where.props().blockNumber.isLte(blockNumber).props().contract.isEq(contract); + const { content, failure } = await this.find( + FindParams.create({ where, limit: 1, sort: { block_number: -1 } }) + ); + + if (content) { + return Result.withContent(content[0]); + } + + if (failure) { + return Result.withFailure(failure); + } + } + + /** + * Inserts an array of ABIs. + * @param abis - The ABIs to be inserted. + * @returns Promise that resolves with the Result of boolean indicating the success of the operation. 
+ */ + public async insertAbis(abis: ContractEncodedAbi[]): Promise<Result<boolean>> { + this.cache.insertAbis(abis); + const { content, failure } = await this.add(abis); + + if (failure) { + if ((<DataSourceError>failure.error).isDuplicateError === false) { + log(failure.error); + } + return Result.withContent(false); + } + + return Result.withContent(content.length > 0); + } + + /** + * Counts the number of ABIs between the provided block numbers. + * @param startBlock - The start block number. + * @param endBlock - The end block number. + * @returns Promise that resolves with the Result of number indicating the count of ABIs. + */ + public async countAbis( + startBlock?: bigint, + endBlock?: bigint + ): Promise<Result<number>> { + const where = Where.bind(); + + if (typeof startBlock === 'bigint') { + where.props().blockNumber.isGte(startBlock); + } + + if (typeof endBlock === 'bigint') { + where.props().blockNumber.isLte(endBlock); + } + + return this.count(CountParams.create({ where })); + } +} diff --git a/src/common/abis/abis.repository.ts b/src/common/abis/abis.repository.ts index e4f4314..569e84b 100644 --- a/src/common/abis/abis.repository.ts +++ b/src/common/abis/abis.repository.ts @@ -1,158 +1,53 @@ -/* eslint-disable @typescript-eslint/unbound-method */ -/* eslint-disable @typescript-eslint/no-unsafe-member-access */ -/* eslint-disable @typescript-eslint/no-unsafe-assignment */ -import { ContractEncodedAbiDocument } from './abis.types'; -import { ContractEncodedAbi } from './contract-encoded-abi'; -import { - CollectionMongoSource, - DataSourceBulkWriteError, - DataSourceOperationError, - log, - MongoDB, - MongoSource, - OperationErrorType, -} from '@alien-worlds/api-core'; -import { AbisCache } from './abis.cache'; - -export class AbisCollection extends CollectionMongoSource<ContractEncodedAbiDocument> { - constructor(source: MongoSource) { - super(source, 'history_tools.abis', { - indexes: [ - { key: { block_number: 1, hex: 1, contract: 1 }, unique: true, background: true }, - ], - }); - } -} -export class AbisRepository { - 
private cache: AbisCache = new AbisCache();
-
-  constructor(private collection: AbisCollection) {}
-
-  public async cacheAbis(contracts?: string[]) {
-    const abis = await this.getAbis({ contracts });
-    if (Array.isArray(abis)) {
-      this.cache.insertAbis(abis);
-    }
-  }
-
-  public async getAbis(options: {
+import { ContractEncodedAbi, Result } from '@alien-worlds/aw-core';
+
+/**
+ * Defines the contract for managing ContractEncodedAbi entities, providing CRUD operations
+ * and additional functionalities such as caching. Concrete implementations extend the base
+ * RepositoryImpl and implement this abstract class.
+ */
+export abstract class AbisRepository {
+  /**
+   * Cache the ABIs.
+   * @param {string[]} [contracts] - List of contracts.
+   */
+  public abstract cacheAbis(contracts?: string[]);
+
+  /**
+   * Retrieve the ABIs.
+   * @param options - Filter options for retrieving ABIs.
+   * @returns Promise that resolves with the Result of an array of ContractEncodedAbi.
+   */
+  public abstract getAbis(options: {
     startBlock?: bigint;
     endBlock?: bigint;
     contracts?: string[];
-  }): Promise<ContractEncodedAbi[]> {
-    try {
-      const { startBlock, endBlock, contracts } = options || {};
-
-      const cachedAbis = this.cache.getAbis(options);
-
-      if (cachedAbis.length > 0) {
-        return cachedAbis;
-      }
-
-      const filter: { block_number?: unknown; contract?: unknown } = {};
-
-      if (startBlock && endBlock) {
-        filter.block_number = {
-          $gte: MongoDB.Long.fromBigInt(startBlock),
-          $lte: MongoDB.Long.fromBigInt(endBlock),
-        };
-      }
-
-      if (contracts) {
-        filter.contract = { $in: contracts };
-      }
-
-      const documents = await this.collection.find({ filter });
-      const entities = documents.map(ContractEncodedAbi.fromDocument);
-
-      return entities;
-    } catch (error) {
-      log(error);
-      return [];
-    }
-  }
-
-  public async getAbi(
+  }): Promise<Result<ContractEncodedAbi[]>>;
+
+  /**
+   * Retrieve a specific ABI.
+   * @param blockNumber - The block number.
+   * @param contract - The contract name.
+   * @returns Promise that resolves with the Result of ContractEncodedAbi.
+   */
+  public abstract getAbi(
     blockNumber: bigint,
     contract: string
-  ): Promise<ContractEncodedAbi> {
-    try {
-      const cachedAbi = this.cache.getAbi(blockNumber, contract);
-
-      if (cachedAbi) {
-        return cachedAbi;
-      }
-
-      const filter: { block_number: unknown; contract: unknown } = {
-        block_number: {
-          $lte: MongoDB.Long.fromBigInt(blockNumber),
-        },
-        contract: { $eq: contract },
-      };
-
-      const document = await this.collection.findOne({
-        filter,
-        options: { sort: { block_number: -1 }, limit: 1 },
-      });
-
-      return document ? ContractEncodedAbi.fromDocument(document) : null;
-    } catch (error) {
-      log(error);
-      return null;
-    }
-  }
-
-  public async insertAbi(abi: ContractEncodedAbi): Promise<boolean> {
-    try {
-      this.cache.insertAbis([abi]);
-      const result = await this.collection.insert(abi.toDocument());
-      return !!result;
-    } catch (error) {
-      const { type } = error as DataSourceOperationError;
-      if (type !== OperationErrorType.Duplicate) {
-        log(error);
-      }
-      return false;
-    }
-  }
-
-  public async insertAbis(abis: ContractEncodedAbi[]): Promise<boolean> {
-    try {
-      this.cache.insertAbis(abis);
-      const documents = abis.map(abi => abi.toDocument());
-      const result = await this.collection.insertMany(documents);
-      return result.length > 0;
-    } catch (error) {
-      const { writeErrors } = error as DataSourceBulkWriteError;
-
-      if (writeErrors && writeErrors.length > 0) {
-        writeErrors.forEach(error => {
-          if (error.type !== OperationErrorType.Duplicate) {
-            log(error);
-          }
-        });
-      }
-
-      return false;
-    }
-  }
-
-  public async countAbis(startBlock?: bigint, endBlock?: bigint): Promise<number> {
-    try {
-      const filter: MongoDB.Filter<ContractEncodedAbiDocument> = {};
-      if (typeof startBlock === 'bigint') {
-        filter['block_number'] = { $gte: MongoDB.Long.fromBigInt(startBlock) };
-      }
-
-      if (typeof endBlock === 'bigint') {
-        filter['block_number'] = { $lte: MongoDB.Long.fromBigInt(endBlock) };
-      }
-
-      const count = await this.collection.count({ filter });
-
-      return count;
-    } catch (error) {
-      return 0;
-    }
-  }
+  ): Promise<Result<ContractEncodedAbi>>;
+
+  /**
+   *
Inserts an array of ABIs.
+   * @param abis - The ABIs to be inserted.
+   * @returns Promise that resolves with the Result of boolean indicating the success of the operation.
+   */
+  public abstract insertAbis(abis: ContractEncodedAbi[]): Promise<Result<boolean>>;
+
+  /**
+   * Counts the number of ABIs between the provided block numbers.
+   * @param startBlock - The start block number.
+   * @param endBlock - The end block number.
+   * @returns Promise that resolves with the Result of number indicating the count of ABIs.
+   */
+  public abstract countAbis(
+    startBlock?: bigint,
+    endBlock?: bigint
+  ): Promise<Result<number>>;
 }
diff --git a/src/common/abis/abis.serialize.ts b/src/common/abis/abis.serialize.ts
deleted file mode 100644
index 8b89835..0000000
--- a/src/common/abis/abis.serialize.ts
+++ /dev/null
@@ -1,121 +0,0 @@
-/* eslint-disable @typescript-eslint/no-unsafe-member-access */
-/* eslint-disable @typescript-eslint/no-unsafe-assignment */
-import { Serialize } from 'eosjs';
-import { Authorization, hexToUint8Array } from 'eosjs/dist/eosjs-serialize';
-import { Abi } from 'eosjs/dist/eosjs-rpc-interfaces';
-import { log } from '@alien-worlds/api-core';
-import { AbiTableJson } from '../blockchain/abi';
-
-export type SerializeUtil = {
-  deserializeAction: <T>(
-    account: string,
-    action: string,
-    data: Uint8Array,
-    hex: string
-  ) => T;
-
-  deserializeTable: <T>(
-    account: string,
-    table: string,
-    data: Uint8Array,
-    hex: string
-  ) => T;
-};
-
-export class AbisSerialize {
-  public static deserializeAction = <T>(
-    account: string,
-    action: string,
-    data: Uint8Array,
-    hex: string
-  ): T => {
-    try {
-      const authorization: Authorization[] = [];
-      const textEncoder = new TextEncoder();
-      const textDecoder = new TextDecoder();
-      const bytes = hexToUint8Array(hex);
-      const abiTypes = Serialize.getTypesFromAbi(Serialize.createAbiTypes());
-      const buffer = new Serialize.SerialBuffer({
-        textEncoder,
-        textDecoder,
-        array: bytes,
-      });
-      buffer.restartRead();
-      const abi: Abi =
abiTypes.get('abi_def').deserialize(buffer); - const types = Serialize.getTypesFromAbi(Serialize.createInitialTypes(), abi); - const actions = new Map(); - for (const { name, type } of abi.actions) { - actions.set(name, Serialize.getType(types, type)); - } - const contract = { types, actions }; - const deserializedAction = Serialize.deserializeAction( - contract, - account, - action, - authorization, - data, - new TextEncoder(), - new TextDecoder() - ); - - if (!deserializedAction.data) { - log( - `Serialized object is empty check the result of "Serialize.deserializeAction"` - ); - log(deserializedAction); - } - - return deserializedAction.data as T; - } catch (error) { - log(error); - return null; - } - }; - - public static deserializeTable = ( - account: string, - table: string, - data: Uint8Array, - hex: string - ): T => { - try { - const textEncoder = new TextEncoder(); - const textDecoder = new TextDecoder(); - const bytes = hexToUint8Array(hex); - const abiTypes = Serialize.getTypesFromAbi(Serialize.createAbiTypes()); - const buffer = new Serialize.SerialBuffer({ - textEncoder, - textDecoder, - array: bytes, - }); - buffer.restartRead(); - const abi: Abi = abiTypes.get('abi_def').deserialize(buffer); - const types = Serialize.getTypesFromAbi(Serialize.createInitialTypes(), abi); - const actions = new Map(); - for (const { name, type } of abi.actions) { - actions.set(name, Serialize.getType(types, type)); - } - const contract = { types, actions }; - - let this_table: AbiTableJson, type: string; - for (const t of abi.tables) { - if (t.name === table) { - this_table = t; - break; - } - } - - if (this_table) { - type = this_table.type; - } else { - return null; - } - - const sb = new Serialize.SerialBuffer({ textEncoder, textDecoder, array: data }); - - return contract.types.get(type).deserialize(sb) as T; - } catch (e) { - return null; - } - }; -} diff --git a/src/common/abis/abis.service.ts b/src/common/abis/abis.service.ts deleted file mode 100644 index 
0d5f848..0000000 --- a/src/common/abis/abis.service.ts +++ /dev/null @@ -1,32 +0,0 @@ -/* eslint-disable @typescript-eslint/no-unsafe-member-access */ -/* eslint-disable @typescript-eslint/no-unsafe-assignment */ -import { AbisServiceConfig } from './abis.types'; -import fetch from 'node-fetch'; -import { ContractEncodedAbi } from './contract-encoded-abi'; - -export class AbisService { - constructor(private config: AbisServiceConfig) {} - - public async fetchAbis(contract: string): Promise { - try { - const list: ContractEncodedAbi[] = []; - const { url, limit, filter } = this.config; - - const res = await fetch( - `${url}/v2/history/get_actions?account=${contract}&filter=${ - filter || 'eosio:setabi' - }&limit=${limit || 100}&sort=-1` - ); - const json = await res.json(); - for (let i = 0; i < json.actions.length; i++) { - const act = json.actions[i]; - list.push( - ContractEncodedAbi.create(act.block_num, contract, String(act.act.data.abi)) - ); - } - return list; - } catch (error) { - return []; - } - } -} diff --git a/src/common/abis/abis.ts b/src/common/abis/abis.ts index 8f8c520..b137c70 100644 --- a/src/common/abis/abis.ts +++ b/src/common/abis/abis.ts @@ -1,101 +1,92 @@ -import { MongoConfig, MongoSource, log } from '@alien-worlds/api-core'; -import { FeaturedConfig } from '../featured'; -import { ContractEncodedAbi } from './contract-encoded-abi'; -import { AbisServiceNotSetError } from './abis.errors'; -import { AbisCollection, AbisRepository } from './abis.repository'; -import { AbisService } from './abis.service'; -import { AbisServiceConfig } from './abis.types'; - +import { + AbiService, + ContractEncodedAbi, + Failure, + Result, + log, +} from '@alien-worlds/aw-core'; +import { AbiNotFoundError, AbisServiceNotSetError } from './abis.errors'; +import { AbisRepository } from './abis.repository'; + +/** + * Represents a collection of ABIs (Application Binary Interfaces) for smart contracts. 
+ */
 export class Abis {
-  public static async create(
-    mongo: MongoSource | MongoConfig,
-    abisConfig?: AbisServiceConfig,
-    featured?: FeaturedConfig,
-    setCache?: boolean
-  ): Promise<Abis> {
-    let mongoSource: MongoSource;
-
-    log(` * Abis ... [starting]`);
-
-    if (mongo instanceof MongoSource) {
-      mongoSource = mongo;
-    } else {
-      mongoSource = await MongoSource.create(mongo);
-    }
-    const collection = new AbisCollection(mongoSource);
-    const repository = new AbisRepository(collection);
-    const service = abisConfig ? new AbisService(abisConfig) : null;
-    const abis = new Abis(repository, service, featured);
-
-    if (setCache) {
-      await abis.cacheAbis();
-      log(` * Abis cache restored`);
-    }
-
-    log(` * Abis ... [ready]`);
-
-    return abis;
-  }
-
   private contracts: Set<string> = new Set();
-  private constructor(
+  /**
+   * Constructs a new instance of the Abis class.
+   *
+   * @param {AbisRepository} repository - The repository for accessing ABIs.
+   * @param {AbiService} [service] - The service for fetching ABIs.
+   * @param {string[]} [contracts] - The names of the contracts whose ABIs should be tracked.
+   */
+  constructor(
     private repository: AbisRepository,
-    private service?: AbisService,
-    featuredConfig?: FeaturedConfig
+    private service?: AbiService,
+    contracts?: string[]
   ) {
-    if (featuredConfig) {
-      const { traces, deltas } = featuredConfig;
-
-      traces.forEach(trace => {
-        const { contract } = trace;
-        contract.forEach(value => {
-          this.contracts.add(value);
-        });
-      });
-
-      deltas.forEach(delta => {
-        const { code } = delta;
-        // apply if it is not a "match" object { match: "", processor:"" }
-        if (code) {
-          code.forEach(value => {
-            this.contracts.add(value);
-          });
-        }
+    if (contracts) {
+      contracts.forEach(contract => {
+        this.contracts.add(contract);
       });
     }
   }

+  /**
+   * Retrieves the ABIs (Application Binary Interfaces) for the specified options.
+   *
+   * @param {Object} [options] - The options for retrieving the ABIs.
+   * @param {bigint} [options.startBlock] - The starting block number.
+   * @param {bigint} [options.endBlock] - The ending block number.
+   * @param {string[]} [options.contracts] - The contract addresses to filter the ABIs.
+   * @param {boolean} [options.fetch] - Indicates whether to fetch ABIs if not found in the database.
+   * @returns {Promise<Result<ContractEncodedAbi[]>>} A promise that resolves to a Result object containing the retrieved ABIs.
+   */
   public async getAbis(options?: {
     startBlock?: bigint;
     endBlock?: bigint;
     contracts?: string[];
     fetch?: boolean;
-  }): Promise<ContractEncodedAbi[]> {
+  }): Promise<Result<ContractEncodedAbi[]>> {
     const { startBlock, endBlock, contracts, fetch } = options || {};
-    let abis = await this.repository.getAbis(options);
+    const { content: abis } = await this.repository.getAbis(options);

     if (abis.length === 0 && fetch) {
       log(
         `No contract ABIs (${startBlock}-${endBlock}) were found in the database. Trying to fetch ABIs...`
       );
-      abis = await this.fetchAbis(contracts);
+      return this.fetchAbis(contracts);
     }

-    return abis;
+    return Result.withContent(abis || []);
   }

+  /**
+   * Retrieves the ABI (Application Binary Interface) for the specified block number and contract address.
+   *
+   * @param {bigint} blockNumber - The block number.
+   * @param {string} contract - The contract address.
+   * @param {boolean} [fetch=false] - Indicates whether to fetch the ABI if not found in the database.
+   * @returns {Promise<Result<ContractEncodedAbi>>} A promise that resolves to a Result object containing the retrieved ABI.
+   */
   public async getAbi(
     blockNumber: bigint,
     contract: string,
     fetch = false
-  ): Promise<ContractEncodedAbi> {
-    let abi = await this.repository.getAbi(blockNumber, contract);
+  ): Promise<Result<ContractEncodedAbi>> {
+    const getAbiResult = await this.repository.getAbi(blockNumber, contract);
+
+    if (fetch && getAbiResult.isFailure) {
+      const { content: abis, failure } = await this.fetchAbis([contract]);
+
+      if (failure) {
+        return Result.withFailure(failure);
+      }

-    if (fetch && !abi) {
-      const abis = await this.fetchAbis([contract]);
-      abi = abis.reduce((result, abi) => {
+      const abi = abis.reduce((result, abi) => {
         if (abi.blockNumber <= blockNumber) {
           if (!result || result.blockNumber < abi.blockNumber) {
             result = abi;
@@ -103,22 +94,44 @@
           }
         return result;
       }, null);
+
+      if (abi) {
+        return Result.withContent(abi);
+      }
+
+      return Result.withFailure(Failure.fromError(new AbiNotFoundError()));
     }

-    return abi;
+    return getAbiResult;
   }

+  /**
+   * Stores the ABI (Application Binary Interface) with the specified block number, contract address, and hex code.
+   *
+   * @param {unknown} blockNumber - The block number.
+   * @param {string} contract - The contract address.
+   * @param {string} hex - The hex code representing the ABI.
+   * @returns {Promise<Result<boolean>>} A promise that resolves to a Result object indicating the success or failure of the operation.
+   */
   public async storeAbi(
     blockNumber: unknown,
     contract: string,
     hex: string
-  ): Promise<boolean> {
-    return this.repository.insertAbi(
-      ContractEncodedAbi.create(blockNumber, contract, hex)
-    );
+  ): Promise<Result<boolean>> {
+    return this.repository.insertAbis([
+      ContractEncodedAbi.create(blockNumber, contract, hex),
+    ]);
   }

-  public async fetchAbis(contracts?: string[]): Promise<ContractEncodedAbi[]> {
+  /**
+   * Fetches the ABIs (Application Binary Interfaces) for the specified contracts.
+   *
+   * @param {string[]} [contracts] - The contract addresses to fetch ABIs for.
+   * @returns {Promise<Result<ContractEncodedAbi[]>>} A promise that resolves to a Result object containing the fetched ABIs.
+   * @throws {AbisServiceNotSetError} Thrown if the AbisService is not set.
+   */
+  public async fetchAbis(contracts?: string[]): Promise<Result<ContractEncodedAbi[]>> {
     if (!this.service) {
       throw new AbisServiceNotSetError();
     }
@@ -137,14 +150,23 @@
     } catch (error) {
       log(error.message);
     }
-    return abis;
+    return Result.withContent(abis);
   }

-  public async cacheAbis(contracts?: string[]): Promise<void> {
+  /**
+   * Caches the ABIs (Application Binary Interfaces) for the specified contracts.
+   *
+   * @param {string[]} [contracts] - The contract addresses to cache ABIs for.
+   * @returns {Promise<Result<void>>} A promise that resolves to a Result object indicating the success or failure of the operation.
+   */
+  public async cacheAbis(contracts?: string[]): Promise<Result<void>> {
     try {
       await this.repository.cacheAbis(contracts);
+      return Result.withoutContent();
     } catch (error) {
       log(error.message);
+      return Result.withFailure(Failure.fromError(error));
     }
   }
 }
diff --git a/src/common/abis/abis.types.ts b/src/common/abis/abis.types.ts
index f1bf80a..c6c8201 100644
--- a/src/common/abis/abis.types.ts
+++ b/src/common/abis/abis.types.ts
@@ -1,26 +1,13 @@
-import { MongoDB, MongoConfig } from '@alien-worlds/api-core';
-import { FeaturedConfig } from '../featured';
-
-export type ContractEncodedAbiJson = {
-  blockNumber: bigint;
-  contract: string;
-  hex: string;
-};
-
-export type ContractEncodedAbiDocument = {
-  block_number: MongoDB.Long;
-  contract: string;
-  hex: string;
-};
+import { UnknownObject } from '@alien-worlds/aw-core';

 export type AbisServiceConfig = {
   url: string;
   limit?: number;
   filter?: string;
+  [key: string]: unknown;
 };

-export type AbisConfig = {
+export type AbisConfig<DatabaseConfig = UnknownObject> = {
   service: AbisServiceConfig;
-  mongo: MongoConfig;
-  featured: FeaturedConfig;
+  database: DatabaseConfig;
 };
diff --git a/src/common/abis/contract-encoded-abi.ts b/src/common/abis/contract-encoded-abi.ts
deleted file mode 100644
index c6f1d7e..0000000
--- a/src/common/abis/contract-encoded-abi.ts
+++
/dev/null @@ -1,43 +0,0 @@ -import { MongoDB, parseToBigInt } from '@alien-worlds/api-core'; -import { ContractEncodedAbiDocument, ContractEncodedAbiJson } from './abis.types'; - -export class ContractEncodedAbi { - public static fromDocument(document: ContractEncodedAbiDocument): ContractEncodedAbi { - const { block_number, contract, hex } = document; - return new ContractEncodedAbi(parseToBigInt(block_number), contract, hex); - } - - public static create( - blockNumber: unknown, - contract: string, - hex: string - ): ContractEncodedAbi { - return new ContractEncodedAbi(parseToBigInt(blockNumber), contract, hex); - } - - private constructor( - public readonly blockNumber: bigint, - public readonly contract: string, - public readonly hex: string - ) {} - - public toDocument(): ContractEncodedAbiDocument { - const { blockNumber, hex, contract } = this; - - return { - block_number: MongoDB.Long.fromBigInt(blockNumber), - hex, - contract, - }; - } - - public toJson(): ContractEncodedAbiJson { - const { blockNumber, hex, contract } = this; - - return { - blockNumber, - hex, - contract, - }; - } -} diff --git a/src/common/abis/index.ts b/src/common/abis/index.ts index a4c8f59..5599dc6 100644 --- a/src/common/abis/index.ts +++ b/src/common/abis/index.ts @@ -1,8 +1,6 @@ -export * from './contract-encoded-abi'; -export * from './abis'; export * from './abis.cache'; +export * from './abis.errors'; +export * from './abis.repository-impl'; export * from './abis.repository'; -export * from './abis.service'; export * from './abis.types'; -export * from './abis.errors'; -export * from './abis.serialize'; +export * from './abis'; diff --git a/src/reader/block-range-scanner/block-range-scan.repository.ts b/src/common/block-range-scanner/block-range-scan.repository.ts similarity index 78% rename from src/reader/block-range-scanner/block-range-scan.repository.ts rename to src/common/block-range-scanner/block-range-scan.repository.ts index d51ba98..3b05fb2 100644 --- 
a/src/reader/block-range-scanner/block-range-scan.repository.ts +++ b/src/common/block-range-scanner/block-range-scan.repository.ts @@ -1,6 +1,6 @@ import { BlockRangeScan } from './block-range-scan'; -import { BlockRangeScanMongoSource } from './block-range-scan.mongo.source'; -import { BlockRangeScanConfig } from './block-range-scanner.config'; +import { BlockRangeScanSource } from './block-range-scan.source'; +import { Mapper } from '@alien-worlds/aw-core'; export type ScanRequest = { error?: Error; @@ -8,14 +8,15 @@ export type ScanRequest = { export class BlockRangeScanRepository { constructor( - private readonly source: BlockRangeScanMongoSource, - private readonly config: BlockRangeScanConfig + private readonly source: BlockRangeScanSource, + private readonly mapper: Mapper, + private readonly maxChunkSize: number ) {} public async startNextScan(scanKey: string): Promise { try { const document = await this.source.startNextScan(scanKey); - return document ? BlockRangeScan.fromDocument(document) : null; + return document ? 
this.mapper.toEntity(document) : null; } catch (error) { return null; } @@ -27,15 +28,15 @@ export class BlockRangeScanRepository { endBlock: bigint ): Promise { try { - const { maxChunkSize } = this.config; + const { maxChunkSize } = this; const rootRange = BlockRangeScan.create(startBlock, endBlock, scanKey, 0); const rangesToPersist = [rootRange]; const childRanges = BlockRangeScan.createChildRanges(rootRange, maxChunkSize); childRanges.forEach(range => rangesToPersist.push(range)); - const documents = rangesToPersist.map(range => range.toDocument()); - await this.source.insertMany(documents); + const documents = rangesToPersist.map(range => this.mapper.fromEntity(range)); + await this.source.insert(documents); return {}; } catch (error) { diff --git a/src/common/block-range-scanner/block-range-scan.source.ts b/src/common/block-range-scanner/block-range-scan.source.ts new file mode 100644 index 0000000..31bf41b --- /dev/null +++ b/src/common/block-range-scanner/block-range-scan.source.ts @@ -0,0 +1,29 @@ +/* eslint-disable @typescript-eslint/no-unsafe-return */ + +import { DataSource } from '@alien-worlds/aw-core'; + +export abstract class BlockRangeScanSource extends DataSource { + public abstract startNextScan(scanKey: string): Promise; + public abstract countScanNodes( + scanKey: string, + startBlock: bigint, + endBlock: bigint + ): Promise; + public abstract removeAll(scanKey: string); + public abstract hasScanKey( + scanKey: string, + startBlock?: bigint, + endBlock?: bigint + ): Promise; + public abstract hasUnscannedNodes( + scanKey: string, + startBlock?: bigint, + endBlock?: bigint + ): Promise; + public abstract findRangeForBlockNumber(blockNumber: bigint, scanKey: string); + public abstract findCompletedParentNode(document: T); + public abstract updateProcessedBlockNumber( + scanKey: string, + blockNumber: bigint + ): Promise; +} diff --git a/src/reader/block-range-scanner/block-range-scan.ts b/src/common/block-range-scanner/block-range-scan.ts 
similarity index 65% rename from src/reader/block-range-scanner/block-range-scan.ts rename to src/common/block-range-scanner/block-range-scan.ts index 391f642..78aec29 100644 --- a/src/reader/block-range-scanner/block-range-scan.ts +++ b/src/common/block-range-scanner/block-range-scan.ts @@ -1,12 +1,7 @@ /* eslint-disable @typescript-eslint/no-explicit-any */ import crypto from 'crypto'; -import { parseToBigInt, removeUndefinedProperties } from '@alien-worlds/api-core'; -import { Long } from 'mongodb'; +import { parseToBigInt, removeUndefinedProperties } from '@alien-worlds/aw-core'; import { serialize } from 'v8'; -import { - BlockRangeScanDocument, - BlockRangeScanIdDocument, -} from './block-range-scanner.dtos'; export class BlockRangeScanParent { /** @@ -16,7 +11,7 @@ export class BlockRangeScanParent { * @param {bigint} end * @param {string} scanKey */ - private constructor( + constructor( public readonly start: bigint, public readonly end: bigint, public readonly scanKey: string, @@ -27,29 +22,6 @@ export class BlockRangeScanParent { return new BlockRangeScanParent(start, end, scanKey, treeDepth); } - public static fromDocument(document: BlockRangeScanIdDocument) { - const { start, end, scan_key, tree_depth } = document; - - return new BlockRangeScanParent( - parseToBigInt(start), - parseToBigInt(end), - scan_key, - tree_depth - ); - } - - public toDocument() { - const { start, end, scanKey, treeDepth } = this; - const doc = { - start: Long.fromString(start.toString()), - end: Long.fromString(end.toString()), - scan_key: scanKey, - tree_depth: treeDepth, - }; - - return doc; - } - public toJson() { const { start, end, scanKey, treeDepth } = this; @@ -61,7 +33,7 @@ export class BlockRangeScanParent { * @class */ export class BlockRangeScan { - protected constructor( + constructor( public readonly hash: string, public readonly start: bigint, public readonly end: bigint, @@ -138,41 +110,6 @@ export class BlockRangeScan { ); } - public static 
fromDocument(document: BlockRangeScanDocument) { - const { - _id: { start, end, scan_key, tree_depth }, - hash, - processed_block, - timestamp, - start_timestamp, - end_timestamp, - parent_id, - is_leaf_node, - } = document; - - const parent = parent_id ? BlockRangeScanParent.fromDocument(parent_id) : null; - - let processedBlock: bigint; - - if (processed_block) { - processedBlock = parseToBigInt(processed_block); - } - - return new BlockRangeScan( - hash, - parseToBigInt(start), - parseToBigInt(end), - scan_key, - tree_depth, - parent, - is_leaf_node, - processedBlock, - timestamp, - start_timestamp, - end_timestamp - ); - } - public static createChildRanges( blockRange: BlockRangeScan, maxChunkSize: number @@ -212,45 +149,6 @@ export class BlockRangeScan { this.isLeafNode = true; } - public toDocument() { - const { start, scanKey, end, treeDepth, hash } = this; - const doc: BlockRangeScanDocument = { - _id: { - start: Long.fromString(start.toString()), - end: Long.fromString(end.toString()), - scan_key: scanKey, - tree_depth: treeDepth, - }, - hash, - }; - - if (typeof this.processedBlock == 'bigint') { - doc.processed_block = Long.fromString(this.processedBlock.toString()); - } - - if (typeof this.isLeafNode == 'boolean') { - doc.is_leaf_node = this.isLeafNode; - } - - if (this.parent) { - doc.parent_id = this.parent.toDocument(); - } - - if (this.timestamp) { - doc.timestamp = this.timestamp; - } - - if (this.startTimestamp) { - doc.start_timestamp = this.startTimestamp; - } - - if (this.endTimestamp) { - doc.end_timestamp = this.endTimestamp; - } - - return doc; - } - public toJson(): BlockRangeScan { const { start, diff --git a/src/reader/block-range-scanner/block-range-scanner.config.ts b/src/common/block-range-scanner/block-range-scanner.config.ts similarity index 100% rename from src/reader/block-range-scanner/block-range-scanner.config.ts rename to src/common/block-range-scanner/block-range-scanner.config.ts diff --git 
a/src/reader/block-range-scanner/block-range-scanner.errors.ts b/src/common/block-range-scanner/block-range-scanner.errors.ts similarity index 100% rename from src/reader/block-range-scanner/block-range-scanner.errors.ts rename to src/common/block-range-scanner/block-range-scanner.errors.ts diff --git a/src/reader/block-range-scanner/block-range-scanner.ts b/src/common/block-range-scanner/block-range-scanner.ts similarity index 60% rename from src/reader/block-range-scanner/block-range-scanner.ts rename to src/common/block-range-scanner/block-range-scanner.ts index c19c16a..1bbeaf6 100644 --- a/src/reader/block-range-scanner/block-range-scanner.ts +++ b/src/common/block-range-scanner/block-range-scanner.ts @@ -1,35 +1,11 @@ -import { MongoConfig, MongoSource, log } from '@alien-worlds/api-core'; import { BlockRangeScan } from './block-range-scan'; import { BlockRangeScanRepository, ScanRequest } from './block-range-scan.repository'; import { DuplicateBlockRangeScanError } from './block-range-scanner.errors'; -import { BlockRangeScanConfig } from './block-range-scanner.config'; -import { BlockRangeScanMongoSource } from './block-range-scan.mongo.source'; /** * @class */ export class BlockRangeScanner { - public static async create( - mongo: MongoSource | MongoConfig, - config: BlockRangeScanConfig - ): Promise { - let mongoSource: MongoSource; - - log(` * Block Range Scanner ... [starting]`); - - if (mongo instanceof MongoSource) { - mongoSource = mongo; - } else { - mongoSource = await MongoSource.create(mongo); - } - const source = new BlockRangeScanMongoSource(mongoSource); - const repository = new BlockRangeScanRepository(source, config); - const scanner: BlockRangeScanner = new BlockRangeScanner(repository); - - log(` * Block Range Scanner ... 
[ready]`); - return scanner; - } - constructor(private blockRangeScanRepository: BlockRangeScanRepository) {} public async createScanNodes( diff --git a/src/reader/block-range-scanner/index.ts b/src/common/block-range-scanner/index.ts similarity index 69% rename from src/reader/block-range-scanner/index.ts rename to src/common/block-range-scanner/index.ts index 6432c68..e20b474 100644 --- a/src/reader/block-range-scanner/index.ts +++ b/src/common/block-range-scanner/index.ts @@ -1,7 +1,6 @@ -export * from './block-range-scan.mongo.source'; export * from './block-range-scan.repository'; +export * from './block-range-scan.source'; export * from './block-range-scan'; export * from './block-range-scanner.config'; -export * from './block-range-scanner.dtos'; export * from './block-range-scanner.errors'; export * from './block-range-scanner'; diff --git a/src/common/block-state/__tests__/block-state.unit.test.ts b/src/common/block-state/__tests__/block-state.unit.test.ts new file mode 100644 index 0000000..ac7e823 --- /dev/null +++ b/src/common/block-state/__tests__/block-state.unit.test.ts @@ -0,0 +1,112 @@ +import { RepositoryImpl, Failure, Result } from '@alien-worlds/aw-core'; +import { BlockState } from '../block-state'; + +jest.mock('@alien-worlds/aw-core'); +jest.mock('./block-state.types'); + +describe('BlockState', () => { + let blockState; + let dataSourceMock; + let mapperMock; + let queryBuildersMock; + let queryBuilderMock; + + beforeEach(() => { + dataSourceMock = { + find: jest.fn(), + count: jest.fn(), + aggregate: jest.fn(), + update: jest.fn(), + insert: jest.fn(), + remove: jest.fn(), + startTransaction: jest.fn(), + commitTransaction: jest.fn(), + rollbackTransaction: jest.fn(), + } as any; + mapperMock = { + toEntity: jest.fn(), + fromEntity: jest.fn(), + getEntityKeyMapping: jest.fn(), + } as any; + queryBuildersMock = { + buildFindQuery: jest.fn(), + buildCountQuery: jest.fn(), + buildUpdateQuery: jest.fn(), + buildRemoveQuery: jest.fn(), + 
buildAggregationQuery: jest.fn(), + } as any; + queryBuilderMock = { + with: jest.fn(), + build: jest.fn(), + } as any; + + blockState = new BlockState( + dataSourceMock, + mapperMock, + queryBuilderMock, + queryBuilderMock + ); + }); + + afterEach(() => { + jest.resetAllMocks(); + }); + + it('should be a valid instance', () => { + expect(blockState).toBeInstanceOf(BlockState); + expect(blockState).toBeInstanceOf(RepositoryImpl); + }); + + describe('getState()', () => { + it('should return block state data', async () => { + const mockData = { + content: [ + { + lastModifiedTimestamp: new Date(), + actions: [], + tables: [], + blockNumber: 0n, + }, + ], + }; + blockState.find = jest.fn().mockResolvedValue(mockData); + + const result = await blockState.getState(); + + expect(blockState.find).toHaveBeenCalled(); + expect(result).toEqual(Result.withContent(mockData.content[0])); + }); + + it('should handle error properly', async () => { + const mockError = new Error('Database error'); + blockState.find = jest.fn().mockRejectedValue(mockError); + + const result = await blockState.getState(); + + expect(blockState.find).toHaveBeenCalled(); + expect(result).toEqual(Result.withFailure(Failure.fromError(mockError))); + }); + }); + + describe('updateBlockNumber()', () => { + it('should update block number and return true if successful', async () => { + const mockValue = 10n; + const mockData = { + content: { + modifiedCount: 1, + upsertedCount: 0, + }, + }; + blockState.update = jest.fn().mockResolvedValue(mockData); + + const result = await blockState.updateBlockNumber(mockValue); + + expect(blockState.update).toHaveBeenCalledWith( + queryBuilderMock.with({ blockNumber: mockValue }) + ); + expect(result).toEqual(Result.withContent(true)); + }); + + // Write similar tests for getBlockNumber() here... 
+ }); +}); diff --git a/src/common/block-state/block-state.source.ts b/src/common/block-state/block-state.source.ts deleted file mode 100644 index e5b66d9..0000000 --- a/src/common/block-state/block-state.source.ts +++ /dev/null @@ -1,116 +0,0 @@ -import { - CollectionMongoSource, - MongoDB, - MongoSource, - parseToBigInt, -} from '@alien-worlds/api-core'; -import { BlockStateDocument } from './block-state.types'; - -export class BlockStateSource extends CollectionMongoSource { - constructor(mongoSource: MongoSource) { - super(mongoSource, 'history_tools.block_state'); - } - - public async getState(): Promise { - const state: BlockStateDocument = await this.findOne({ filter: {} }); - - return state; - } - - public async newState( - blockNumber: bigint, - actions: string[] = [], - tables: string[] = [] - ): Promise { - const result = await this.update( - { - block_number: MongoDB.Long.fromBigInt(blockNumber), - actions, - tables, - last_modified_timestamp: new Date(), - }, - { options: { upsert: true } } - ); - if (result) { - await this.update( - { last_modified_timestamp: new Date() }, - { options: { upsert: true } } - ); - } - } - - /** - * Updates block number. 
- * (Only if given value is higher than the one currently stored in the database) - * - * @param {bigint} value - */ - public async updateBlockNumber(value: bigint): Promise { - const result = await this.update( - { - $max: { block_number: MongoDB.Long.fromBigInt(value) }, - $set: { last_modified_timestamp: new Date() }, - }, - { options: { upsert: true } } - ); - return !!result; - } - - public async removeActions(labels: string[]): Promise { - await this.update( - { $pull: { actions: { $in: labels } }, last_modified_timestamp: new Date() }, - { options: { upsert: true } } - ); - } - - public async setActions(labels: string[]): Promise { - await this.update( - { actions: labels, last_modified_timestamp: new Date() }, - { options: { upsert: true } } - ); - } - - public async removeTables(labels: string[]): Promise { - await this.update( - { $pull: { tables: { $in: labels } }, last_modified_timestamp: new Date() }, - { options: { upsert: true } } - ); - } - - public async setTables(labels: string[]): Promise { - await this.update( - { tables: labels, last_modified_timestamp: new Date() }, - { options: { upsert: true } } - ); - } - - public async getBlockNumber(): Promise { - const state: BlockStateDocument = await this.findOne({ filter: {} }); - - return parseToBigInt(state?.block_number ? 
state.block_number : MongoDB.Long.NEG_ONE); - } - - public async getActions(): Promise { - const state: BlockStateDocument = await this.findOne({ filter: {} }); - - return state?.actions || []; - } - - public async getTables(): Promise { - const state: BlockStateDocument = await this.findOne({ filter: {} }); - - return state?.tables || []; - } - - public async includesAction(label: string): Promise { - const state: BlockStateDocument = await this.findOne({ filter: {} }); - - return state?.actions.includes(label) === true; - } - - public async includesTable(label: string): Promise { - const state: BlockStateDocument = await this.findOne({ filter: {} }); - - return state?.tables.includes(label) === true; - } -} diff --git a/src/common/block-state/block-state.ts b/src/common/block-state/block-state.ts index 3db814e..a27ae7a 100644 --- a/src/common/block-state/block-state.ts +++ b/src/common/block-state/block-state.ts @@ -1,104 +1,102 @@ import { + DataSource, Failure, - log, - MongoConfig, - MongoSource, - parseToBigInt, + Mapper, + QueryBuilder, + QueryBuilders, + RepositoryImpl, Result, -} from '@alien-worlds/api-core'; -import { BlockStateSource } from './block-state.source'; -import { BlockStateData } from './block-state.types'; +} from '@alien-worlds/aw-core'; +import { BlockStateModel } from './block-state.types'; -export class BlockState { - public static async create(mongo: MongoSource | MongoConfig) { - log(` * Block State ... [starting]`); - - let state: BlockState; - - if (mongo instanceof MongoSource) { - state = new BlockState(mongo); - } else { - const mongoSource = await MongoSource.create(mongo); - state = new BlockState(mongoSource); - } - - log(` * Block State ... [ready]`); - return state; +/** + * A class representing a block state. + */ +export class BlockState extends RepositoryImpl { + /** + * Creates an instance of the BlockState class. + * + * @param {DataSource} source - The data source. 
+ * @param {BlockStateMongoMapper} mapper - The data mapper. + * @param {QueryBuilders} queryBuilders - The query builders. + * @param {QueryBuilder} updateBlockNumberQueryBuilder - The query builder to update block number. + */ + constructor( + source: DataSource, + mapper: Mapper, + queryBuilders: QueryBuilders, + private updateBlockNumberQueryBuilder: QueryBuilder + ) { + super(source, mapper, queryBuilders); } - private source: BlockStateSource; + /** + * Fetches the current state of the data source. + * + * @returns {Promise>} - The result of the operation. + */ + public async getState(): Promise> { + try { + const { content: states } = await this.find(); - private constructor(mongo: MongoSource) { - this.source = new BlockStateSource(mongo); - } + if (states) { + const state = states[0]; + const { lastModifiedTimestamp, actions, tables, blockNumber } = state; - public async getState(): Promise> { - try { - const state = await this.source.getState(); - let data: BlockStateData; - if (state) { - const { last_modified_timestamp, actions, tables, block_number } = state; - data = { - lastModifiedTimestamp: last_modified_timestamp || new Date(), + return Result.withContent({ + lastModifiedTimestamp: lastModifiedTimestamp || new Date(), actions: actions || [], tables: tables || [], - blockNumber: parseToBigInt(block_number) || 0n, - }; + blockNumber: blockNumber || 0n, + }); } - data = { + return Result.withContent({ lastModifiedTimestamp: new Date(), actions: [], tables: [], blockNumber: 0n, - }; - return Result.withContent(data); + }); } catch (error) { return Result.withFailure(Failure.fromError(error)); } } /** - * Updates block number. - * (Only if given value is higher than the one currently stored in the database) + * Updates the block number in the current state. * - * @param {bigint} value + * @param {bigint} value - The new block number. + * @returns {Promise>} - The result of the operation. 
*/ - public async newState(blockNumber: bigint): Promise { - try { - await this.source.updateBlockNumber(blockNumber); - return Result.withoutContent(); - } catch (error) { - return Result.withFailure(Failure.fromError(error)); + public async updateBlockNumber(value: bigint): Promise> { + this.updateBlockNumberQueryBuilder.with({ blockNumber: value }); + const { content, failure } = await this.update(this.updateBlockNumberQueryBuilder); + + if (failure) { + return Result.withFailure(failure); } + + return Result.withContent(content.modifiedCount + content.upsertedCount > 0); } /** - * Updates block number. - * (Only if given value is higher than the one currently stored in the database) + * Fetches the current block number from the current state. * - * @param {bigint} value + * @returns {Promise>} - The result of the operation. */ - public async updateBlockNumber(value: bigint): Promise> { - try { - const isUpdated = await this.source.updateBlockNumber(value); + public async getBlockNumber(): Promise> { + const { content: states, failure } = await this.find(); - return Result.withContent(isUpdated); - } catch (error) { - return Result.withFailure(Failure.fromError(error)); + if (failure) { + return Result.withFailure(failure); } - } - /** - * Returns current block number or -1 - * @returns - */ - public async getBlockNumber(): Promise> { - try { - const currentBlockNumber = await this.source.getBlockNumber(); - return Result.withContent(currentBlockNumber); - } catch (error) { - return Result.withFailure(Failure.fromError(error)); + if (states.length > 0) { + const state = states[0]; + + return Result.withContent(state.blockNumber); } + + return Result.withContent(-1n); } } diff --git a/src/common/block-state/block-state.types.ts b/src/common/block-state/block-state.types.ts index ffbdb0f..b7c8e9e 100644 --- a/src/common/block-state/block-state.types.ts +++ b/src/common/block-state/block-state.types.ts @@ -1,16 +1,6 @@ -import { MongoDB } from 
"@alien-worlds/api-core" - -export type BlockStateDocument = { - _id: MongoDB.ObjectId; - last_modified_timestamp: Date; - block_number: MongoDB.Long; - actions: string[]; - tables: string[]; -} - -export type BlockStateData = { - lastModifiedTimestamp: Date; - blockNumber: bigint; - actions: string[]; - tables: string[]; -} \ No newline at end of file +export type BlockStateModel = { + lastModifiedTimestamp: Date; + blockNumber: bigint; + actions: string[]; + tables: string[]; +}; diff --git a/src/common/block-state/index.ts b/src/common/block-state/index.ts index 8bb1413..42a8bdc 100644 --- a/src/common/block-state/index.ts +++ b/src/common/block-state/index.ts @@ -1,2 +1,2 @@ export * from './block-state'; -export * from './block-state.source'; +export * from './block-state.types'; diff --git a/src/common/blockchain/abi/__tests__/abi.unit.test.ts b/src/common/blockchain/abi/__tests__/abi.unit.test.ts deleted file mode 100644 index d92eb00..0000000 --- a/src/common/blockchain/abi/__tests__/abi.unit.test.ts +++ /dev/null @@ -1,185 +0,0 @@ -import { Abi } from '../abi'; -import { AbiExtension } from '../abi-extension'; -import { AbiStruct, StructField } from '../abi-struct'; -import { AbiTable } from '../abi-table'; -import { AbiType } from '../abi-type'; -import { AbiVariant } from '../abi-variant'; -import { AbiErrorMessage } from '../abi-error-message'; -import { RicardianClause } from '../ricardian-clause'; - -jest.mock('eosjs/dist/eosjs-serialize'); - -const typeDto = { - new_type_name: 'some_name', - type: 'some_type', -}; - -const variantDto = { - name: 'SOME_NAME', - types: ['TYPE_1', 'TYPE_2'], -}; - -const customTypeDto = { - new_type_name: 'SOME_TYPE_NAME', - type: 'SOME_TYPE', -}; - -const structDto = { - name: 'foo.name', - base: 'FOO', - fields: [{ name: 'FIELD_NAME', type: 'FIELD_TYPE' }], -}; - -const structFieldDto = { - name: 'FIELD_NAME', - type: 'FIELD_TYPE', -}; - -const ricardianClauseDto = { - id: 'SOME_ID', - body: 'SOME_BODY', -}; - -const 
abiExtensionDto = { - tag: 12345, - value: 'SOME_BODY', -}; - -const errorMessageDto = { - error_code: 200, - error_msg: 'SOME_MESSAGE', -}; - -const tableDto = { - name: 'accounts', - type: 'account', - index_type: 'i64', - key_names: ['KEY_NAME_1'], - key_types: ['KEY_TYPE_1'], -}; - -const actionDto = { - name: 'create', - type: 'create', - ricardian_contract: 'contract', -}; - -const abiDto = { - version: 'version_1', - types: [typeDto], - structs: [structDto], - tables: [tableDto], - actions: [actionDto], - ricardian_clauses: [ricardianClauseDto], - abi_extensions: [abiExtensionDto], - error_messages: [errorMessageDto], - variants: [variantDto], -}; - -describe('CustomType Unit tests', () => { - it('"fromDto" should create entity', () => { - const entity = AbiType.fromDto(customTypeDto); - expect(entity.toDto()).toEqual(customTypeDto); - }); - - it('"toDto" should create dto from entity', () => { - const entity = AbiType.fromDto(customTypeDto); - expect(entity.toDto()).toEqual(customTypeDto); - }); -}); - -describe('Struct Unit tests', () => { - it('"fromDto" should create entity', () => { - const entity = AbiStruct.fromDto(structDto); - expect(entity.toDto()).toEqual(structDto); - }); - - it('"toDto" should create dto from entity', () => { - const entity = AbiStruct.fromDto(structDto); - expect(entity.toDto()).toEqual(structDto); - }); -}); - -describe('StructField Unit tests', () => { - it('"fromDto" should create entity', () => { - const entity = StructField.fromDto(structFieldDto); - expect(entity.toDto()).toEqual(structFieldDto); - }); - - it('"toDto" should create dto from entity', () => { - const entity = StructField.fromDto(structFieldDto); - expect(entity.toDto()).toEqual(structFieldDto); - }); -}); - -describe('RicardianClause Unit tests', () => { - it('"fromDto" should create entity', () => { - const entity = RicardianClause.fromDto(ricardianClauseDto); - expect(entity.toDto()).toEqual(ricardianClauseDto); - }); - - it('"toDto" should create dto 
from entity', () => { - const entity = RicardianClause.fromDto(ricardianClauseDto); - expect(entity.toDto()).toEqual(ricardianClauseDto); - }); -}); - -describe('AbiExtension Unit tests', () => { - it('"fromDto" should create entity', () => { - const entity = AbiExtension.fromDto(abiExtensionDto); - expect(entity.toDto()).toEqual(abiExtensionDto); - }); - - it('"toDto" should create dto from entity', () => { - const entity = AbiExtension.fromDto(abiExtensionDto); - expect(entity.toDto()).toEqual(abiExtensionDto); - }); -}); - -describe('AbiErrorMessage Unit tests', () => { - it('"fromDto" should create entity', () => { - const entity = AbiErrorMessage.fromDto(errorMessageDto); - expect(entity.toDto()).toEqual(errorMessageDto); - }); - - it('"toDto" should create dto from entity', () => { - const entity = AbiErrorMessage.fromDto(errorMessageDto); - expect(entity.toDto()).toEqual(errorMessageDto); - }); -}); - -describe('Variant Unit tests', () => { - it('"fromDto" should create entity', () => { - const entity = AbiVariant.fromDto(variantDto); - expect(entity.toDto()).toEqual(variantDto); - }); - - it('"toDto" should create dto from entity', () => { - const entity = AbiVariant.fromDto(variantDto); - expect(entity.toDto()).toEqual(variantDto); - }); -}); - -describe('Table Unit tests', () => { - it('"fromDto" should create entity', () => { - const entity = AbiTable.fromDto(tableDto); - expect(entity.toDto()).toEqual(tableDto); - }); - - it('"toDto" should create dto from entity', () => { - const entity = AbiTable.fromDto(tableDto); - expect(entity.toDto()).toEqual(tableDto); - }); -}); - -describe('Abi Unit tests', () => { - it('"fromDto" should create entity', () => { - const entity = Abi.fromJson(abiDto); - expect(entity.toJson()).toEqual(abiDto); - }); - - it('"toDto" should create dto from entity', () => { - const entity = Abi.fromJson(abiDto); - expect(entity.toJson()).toEqual(abiDto); - }); -}); diff --git a/src/common/blockchain/abi/abi-action.ts 
b/src/common/blockchain/abi/abi-action.ts deleted file mode 100644 index 80be518..0000000 --- a/src/common/blockchain/abi/abi-action.ts +++ /dev/null @@ -1,40 +0,0 @@ -import { AbiActionJson } from './abi.dtos'; - -/** - * @class - */ -export class AbiAction { - /** - * - * @param {string} name - The name of the action as defined in the contract - * @param {string} type - The name of the implicit struct as described in the ABI - * @param {string=} ricardianContract - An optional ricardian clause to associate to this action describing its intended functionality. - */ - private constructor( - public readonly name: string, - public readonly type: string, - public readonly ricardianContract?: string - ) {} - - /** - * @returns {AbiActionJson} - */ - public toDto(): AbiActionJson { - const { name, type, ricardianContract } = this; - return { - name, - type, - ricardian_contract: ricardianContract, - }; - } - - /** - * @static - * @param {AbiActionJson} dto - * @returns {AbiAction} - */ - public static fromDto(dto: AbiActionJson): AbiAction { - const { name, type, ricardian_contract } = dto; - return new AbiAction(name, type, ricardian_contract); - } -} diff --git a/src/common/blockchain/abi/abi-error-message.ts b/src/common/blockchain/abi/abi-error-message.ts deleted file mode 100644 index 1408241..0000000 --- a/src/common/blockchain/abi/abi-error-message.ts +++ /dev/null @@ -1,34 +0,0 @@ -import { AbiErrorMessageJson } from './abi.dtos'; - -/** - * @class - */ -export class AbiErrorMessage { - /** - * - * @param {number} errorCode - * @param {string} message - */ - private constructor( - public readonly errorCode: number, - public readonly message: string - ) {} - - /** - * @returns {AbiErrorMessageJson} - */ - public toDto(): AbiErrorMessageJson { - const { errorCode, message } = this; - return { error_code: errorCode, error_msg: message }; - } - - /** - * @static - * @param {AbiErrorMessageJson} dto - * @returns {AbiErrorMessage} - */ - public static fromDto(dto: 
AbiErrorMessageJson): AbiErrorMessage { - const { error_code, error_msg } = dto; - return new AbiErrorMessage(error_code, error_msg); - } -} diff --git a/src/common/blockchain/abi/abi-extension.ts b/src/common/blockchain/abi/abi-extension.ts deleted file mode 100644 index eccb994..0000000 --- a/src/common/blockchain/abi/abi-extension.ts +++ /dev/null @@ -1,31 +0,0 @@ -import { AbiExtensionJson } from './abi.dtos'; - -/** - * @class - */ -export class AbiExtension { - /** - * - * @param {number} tag - * @param {string} value - */ - private constructor(public readonly tag: number, public readonly value: string) {} - - /** - * @returns {AbiExtensionJson} - */ - public toDto(): AbiExtensionJson { - const { tag, value } = this; - return { tag, value }; - } - - /** - * @static - * @param {AbiExtensionJson} dto - * @returns {AbiExtension} - */ - public static fromDto(dto: AbiExtensionJson): AbiExtension { - const { tag, value } = dto; - return new AbiExtension(tag, value); - } -} diff --git a/src/common/blockchain/abi/abi-struct.ts b/src/common/blockchain/abi/abi-struct.ts deleted file mode 100644 index 501b226..0000000 --- a/src/common/blockchain/abi/abi-struct.ts +++ /dev/null @@ -1,70 +0,0 @@ -/* eslint-disable @typescript-eslint/unbound-method */ -import { AbiStructJson, FieldJson } from './abi.dtos'; - -/** - * @class - */ -export class StructField { - /** - * - * @param {string} name - The field's name - * @param {string} type - The field's type - */ - private constructor(public readonly name: string, public readonly type: string) {} - - /** - * @returns {FieldJson} - */ - public toDto(): FieldJson { - const { name, type } = this; - return { name, type }; - } - - /** - * @static - * @param {FieldJson} dto - * @returns {StructField} - */ - public static fromDto(dto: FieldJson): StructField { - const { name, type } = dto; - return new StructField(name, type); - } -} - -/** - * @class - */ -export class AbiStruct { - /** - * - * @param {string} name - * @param {string} 
base - Inheritance, parent struct - * @param {StructField[]} fields - Array of field objects describing the struct's fields - */ - private constructor( - public readonly name: string, - public readonly base: string, - public readonly fields: StructField[] - ) {} - - /** - * @returns {AbiStructJson} - */ - public toDto(): AbiStructJson { - return { - name: this.name, - base: this.base, - fields: this.fields.map(field => field.toDto()), - }; - } - - /** - * @static - * @param {AbiStructJson} dto - * @returns {AbiStruct} - */ - public static fromDto(dto: AbiStructJson): AbiStruct { - const { name, base, fields } = dto; - return new AbiStruct(name, base, fields.map(StructField.fromDto)); - } -} diff --git a/src/common/blockchain/abi/abi-table.ts b/src/common/blockchain/abi/abi-table.ts deleted file mode 100644 index 7392898..0000000 --- a/src/common/blockchain/abi/abi-table.ts +++ /dev/null @@ -1,46 +0,0 @@ -import { AbiTableJson } from './abi.dtos'; - -/** - * @class - */ -export class AbiTable { - /** - * - * @param {string} name - The name of the table, determined during instantiation. - * @param {string} type - The table's corresponding struct - * @param {string} indexType - The type of primary index of this table - * @param {string[]} keyNames - An array of key names, length must equal length of key_types member - * @param {string[]} keyTypes - An array of key types that correspond to key names array member, length of array must equal length of key names array. 
- */ - private constructor( - public readonly name: string, - public readonly type: string, - public readonly indexType: string, - public readonly keyNames: string[], - public readonly keyTypes: string[] - ) {} - - /** - * @returns {AbiTableJson} - */ - public toDto(): AbiTableJson { - const { name, type, indexType, keyNames, keyTypes } = this; - return { - name, - type, - index_type: indexType, - key_names: keyNames, - key_types: keyTypes, - }; - } - - /** - * @static - * @param {AbiTableJson} dto - * @returns {AbiTable} - */ - public static fromDto(dto: AbiTableJson): AbiTable { - const { name, type, index_type, key_names, key_types } = dto; - return new AbiTable(name, type, index_type, key_names, key_types); - } -} diff --git a/src/common/blockchain/abi/abi-type.ts b/src/common/blockchain/abi/abi-type.ts deleted file mode 100644 index 70352ed..0000000 --- a/src/common/blockchain/abi/abi-type.ts +++ /dev/null @@ -1,40 +0,0 @@ -import { AbiTypeJson } from './abi.dtos'; - -/** - * Type entity - * @class - */ -export class AbiType { - /** - * - * @param {string} newTypeName - * @param {string} type - */ - private constructor( - public readonly newTypeName: string, - public readonly type: string - ) {} - - /** - * Parse Type entity to DTO - * @returns {AbiTypeJson} - */ - public toDto(): AbiTypeJson { - return { - new_type_name: this.newTypeName, - type: this.type, - }; - } - - /** - * Create ABI entity based on provided DTO - * - * @static - * @param {AbiTypeJson} dto - * @returns {AbiType} - */ - public static fromDto(dto: AbiTypeJson): AbiType { - const { new_type_name, type } = dto; - return new AbiType(new_type_name, type); - } -} diff --git a/src/common/blockchain/abi/abi-variant.ts b/src/common/blockchain/abi/abi-variant.ts deleted file mode 100644 index b042e06..0000000 --- a/src/common/blockchain/abi/abi-variant.ts +++ /dev/null @@ -1,31 +0,0 @@ -import { AbiVariantJson } from './abi.dtos'; - -/** - * @class - */ -export class AbiVariant { - /** - * - * 
@param {string} name - * @param {string[]} types - */ - private constructor(public readonly name: string, public readonly types: string[]) {} - - /** - * @returns {AbiVariantJson} - */ - public toDto(): AbiVariantJson { - const { name, types } = this; - return { name, types }; - } - - /** - * @static - * @param {AbiVariantJson} dto - * @returns {AbiVariant} - */ - public static fromDto(dto: AbiVariantJson): AbiVariant { - const { name, types } = dto; - return new AbiVariant(name, types); - } -} diff --git a/src/common/blockchain/abi/abi.dtos.ts b/src/common/blockchain/abi/abi.dtos.ts deleted file mode 100644 index e313ddb..0000000 --- a/src/common/blockchain/abi/abi.dtos.ts +++ /dev/null @@ -1,136 +0,0 @@ -export type AbiActionJson = { - name: string; // The name of the action as defined in the contract - type: string; // The name of the implicit struct as described in the ABI - ricardian_contract: string; // An optional ricardian clause to associate to this action describing its intended functionality. 
-}; - -export type AbiJson = { - version: string; - types: AbiTypeJson[]; - structs: AbiStructJson[]; - tables: AbiTableJson[]; - actions: AbiActionJson[]; - ricardian_clauses: RicardianClauseJson[]; - abi_extensions: AbiExtensionJson[]; - error_messages: AbiErrorMessageJson[]; - variants?: AbiVariantJson[]; -}; - -export type AbiErrorMessageJson = { - error_code: number; - error_msg: string; -}; - -export type AbiVariantJson = { - name: string; - types: string[]; -}; - -export type RicardianClauseJson = { - id: string; - body: string; -}; - -export type AbiExtensionJson = { - tag: number; - value: string; -}; - -export type AbiTypeJson = { - new_type_name: string; - type: string; -}; - -export type AbiStructFieldJson = { - name: string; - type: string; -}; - -export type AbiStructJson = { - name: string; - base: string; - fields: AbiStructFieldJson[]; -}; - -export type CreateStructJson = { - name: 'create'; - base: ''; - fields: [IssuerFieldJson, MaximumSupplyFieldJson]; -}; - -export type IssueStructJson = { - name: 'issue'; - base: ''; - fields: [ToFieldJson, QuantityFieldJson, MemoFieldJson]; -}; - -export type RetireStructJson = { - name: 'retire'; - base: ''; - fields: [QuantityFieldJson, MemoFieldJson]; -}; - -export type TransfereStructJson = { - name: 'transfer'; - base: ''; - fields: [FromFieldJson, ToFieldJson, QuantityFieldJson, MemoFieldJson]; -}; - -export type CloseStructJson = { - name: 'close'; - base: ''; - fields: [SymbolFieldJson, OwnerFieldJson]; -}; - -export type FieldJson = { - name: string; - type: string; -}; - -export type OwnerFieldJson = { - name: 'owner'; - type: 'name'; -}; - -export type SymbolFieldJson = { - name: 'symbol'; - type: 'symbol'; -}; - -export type MemoFieldJson = { - name: 'memo'; - type: 'string'; -}; - -export type QuantityFieldJson = { - name: 'quantity'; - type: 'asset'; -}; - -export type ToFieldJson = { - name: 'to'; - type: 'name'; -}; - -export type FromFieldJson = { - name: 'from'; - type: 'name'; -}; - 
-export type IssuerFieldJson = { - name: 'issuer'; - type: 'name'; -}; - -export type MaximumSupplyFieldJson = { - name: 'maximum_supply'; - type: 'asset'; -}; - -export type AbiTableJson = { - name: string; // 'accounts' | 'stats' - type: string; // 'account' | 'currency_stats' ... Corresponds to previously defined struct - index_type: string; // 'i64' - key_names: string[]; - key_types: string[]; -}; diff --git a/src/common/blockchain/abi/abi.ts b/src/common/blockchain/abi/abi.ts deleted file mode 100644 index a2de49e..0000000 --- a/src/common/blockchain/abi/abi.ts +++ /dev/null @@ -1,138 +0,0 @@ -/* eslint-disable @typescript-eslint/no-unsafe-return */ -import { getTypesFromAbi } from 'eosjs/dist/eosjs-serialize'; - -import { AbiJson } from './abi.dtos'; -import { Serialize } from 'eosjs'; -import { AbiType } from './abi-type'; -import { AbiStruct } from './abi-struct'; -import { AbiTable } from './abi-table'; -import { AbiAction } from './abi-action'; -import { RicardianClause } from './ricardian-clause'; -import { AbiExtension } from './abi-extension'; -import { AbiErrorMessage } from './abi-error-message'; -import { AbiVariant } from './abi-variant'; -import { deserialize, serialize } from 'v8'; - -/** - * ABI entity - * @class - */ -export class Abi { - private typesMap: Map; - /** - * - * @param {string} version - * @param {AbiType[]} types - * @param {AbiStruct[]} structs - * @param {AbiAction[]} actions - * @param {AbiTable[]} tables - * @param {RicardianClause[]} ricardianClauses - * @param {AbiExtension[]} abiExtensions - * @param {AbiErrorMessage[]} errorMessages - * @param {string} comment - * @param {AbiVariant[]} variants - */ - private constructor( - public readonly version: string, - public readonly types: AbiType[], - public readonly structs: AbiStruct[], - public readonly tables: AbiTable[], - public readonly actions: AbiAction[], - public readonly ricardianClauses: RicardianClause[], - public readonly abiExtensions: AbiExtension[], - public 
readonly errorMessages: AbiErrorMessage[], - public readonly variants?: AbiVariant[] - ) { - this.typesMap = getTypesFromAbi(Serialize.createInitialTypes(), this.toJson()); - } - - /** - * Parse ABI entity to DTO - * @returns {AbiJson} - */ - public toJson(): AbiJson { - const { - version, - types, - structs, - actions, - tables, - ricardianClauses, - abiExtensions, - errorMessages: AbierrorMessages, - variants, - } = this; - - const dto: AbiJson = { - version, - types: types.map(item => item.toDto()), - structs: structs.map(item => item.toDto()), - tables: tables.map(item => item.toDto()), - actions: actions ? actions.map(item => item.toDto()) : [], - ricardian_clauses: ricardianClauses - ? ricardianClauses.map(item => item.toDto()) - : [], - abi_extensions: abiExtensions ? abiExtensions.map(item => item.toDto()) : [], - error_messages: AbierrorMessages ? AbierrorMessages.map(item => item.toDto()) : [], - variants: variants ? variants.map(item => item.toDto()) : [], - }; - - return dto; - } - - public toBuffer(): Buffer { - return serialize(this.toJson()); - } - - public toHex(): string { - return serialize(this.toJson()).toString('hex'); - } - - public getTypesMap(): Map { - return this.typesMap; - } - - /** - * Create ABI entity based on provided DTO - * - * @static - * @param {AbiJson} dto - * @returns {Abi} - */ - public static fromJson(dto: AbiJson): Abi { - const { version, types, structs, tables } = dto; - const actions = dto.actions ? dto.actions.map(dto => AbiAction.fromDto(dto)) : []; - const ricardian_clauses = dto.ricardian_clauses - ? dto.ricardian_clauses.map(dto => RicardianClause.fromDto(dto)) - : []; - const abi_extensions = dto.abi_extensions - ? dto.abi_extensions.map(dto => AbiExtension.fromDto(dto)) - : []; - const error_messages = dto.error_messages - ? dto.error_messages.map(dto => AbiErrorMessage.fromDto(dto)) - : []; - const variants = dto.variants ? 
dto.variants.map(dto => AbiVariant.fromDto(dto)) : []; - - return new Abi( - version, - types.map(dto => AbiType.fromDto(dto)), - structs.map(dto => AbiStruct.fromDto(dto)), - tables.map(dto => AbiTable.fromDto(dto)), - actions, - ricardian_clauses, - abi_extensions, - error_messages, - variants - ); - } - - public static fromBuffer(buffer: Buffer): Abi { - const json = deserialize(buffer); - return Abi.fromJson(json); - } - - public static fromHex(value: string): Abi { - const buf = Buffer.from(value, 'hex'); - return Abi.fromBuffer(buf); - } -} diff --git a/src/common/blockchain/abi/index.ts b/src/common/blockchain/abi/index.ts deleted file mode 100644 index 545c409..0000000 --- a/src/common/blockchain/abi/index.ts +++ /dev/null @@ -1,10 +0,0 @@ -export { RicardianClause } from "./ricardian-clause"; -export { Abi } from "./abi"; -export { AbiAction } from "./abi-action"; -export { AbiErrorMessage } from "./abi-error-message"; -export { AbiExtension } from "./abi-extension"; -export { AbiStruct } from "./abi-struct"; -export { AbiTable } from "./abi-table"; -export { AbiType } from "./abi-type"; -export { AbiVariant } from "./abi-variant"; -export * from "./abi.dtos"; diff --git a/src/common/blockchain/abi/ricardian-clause.ts b/src/common/blockchain/abi/ricardian-clause.ts deleted file mode 100644 index e248fcb..0000000 --- a/src/common/blockchain/abi/ricardian-clause.ts +++ /dev/null @@ -1,31 +0,0 @@ -import { RicardianClauseJson } from './abi.dtos'; - -/** - * @class - */ -export class RicardianClause { - /** - * - * @param {string} id - * @param {string} body - */ - private constructor(public readonly id: string, public readonly body: string) {} - - /** - * @returns {RicardianClauseJson} - */ - public toDto(): RicardianClauseJson { - const { id, body } = this; - return { id, body }; - } - - /** - * @static - * @param {RicardianClauseJson} dto - * @returns {RicardianClause} - */ - public static fromDto(dto: RicardianClauseJson): RicardianClause { - const { id, 
body } = dto; - return new RicardianClause(id, body); - } -} diff --git a/src/common/blockchain/block-reader/block-reader.config.ts b/src/common/blockchain/block-reader/block-reader.config.ts deleted file mode 100644 index 61745be..0000000 --- a/src/common/blockchain/block-reader/block-reader.config.ts +++ /dev/null @@ -1,10 +0,0 @@ -import { MongoConfig } from '@alien-worlds/api-core'; - -export type BlockReaderConfig = { - mongo: MongoConfig; - endpoints: string[]; - reconnectInterval?: number; - shouldFetchDeltas?: boolean; - shouldFetchTraces?: boolean; - shouldFetchBlock?: boolean; -}; diff --git a/src/common/blockchain/block-reader/block-reader.enums.ts b/src/common/blockchain/block-reader/block-reader.enums.ts deleted file mode 100644 index 41676cf..0000000 --- a/src/common/blockchain/block-reader/block-reader.enums.ts +++ /dev/null @@ -1,6 +0,0 @@ -export enum BlockReaderConnectionState { - Connecting = 'connecting', - Connected = 'connected', - Idle = 'idle', - Disconnecting = 'disconnecting', -} diff --git a/src/common/blockchain/block-reader/block-reader.errors.ts b/src/common/blockchain/block-reader/block-reader.errors.ts deleted file mode 100644 index 84d278a..0000000 --- a/src/common/blockchain/block-reader/block-reader.errors.ts +++ /dev/null @@ -1,37 +0,0 @@ -export class AbiNotFoundError extends Error { - constructor() { - super(`ABI data not found`); - } -} - -export class MissingHandlersError extends Error { - constructor() { - super('Set "onReceivedBlock" handler before calling readOneBlock/readBlocks'); - } -} - -export class ServiceNotConnectedError extends Error { - constructor() { - super(`Client is not connected, requestBlocks cannot be called`); - } -} - -export class UnhandledBlockRequestError extends Error { - constructor(start: bigint, end: bigint) { - super( - `Error sending the block_range request ${start.toString()}-${end.toString()}. 
The current request was not completed or canceled.` - ); - } -} - -export class UnhandledMessageTypeError extends Error { - constructor(public readonly type: string) { - super(`Unhandled message type: ${type}`); - } -} - -export class UnhandledMessageError extends Error { - constructor(public readonly message, public readonly error) { - super('Received a message while no block range is being processed'); - } -} diff --git a/src/common/blockchain/block-reader/block-reader.message.ts b/src/common/blockchain/block-reader/block-reader.message.ts deleted file mode 100644 index 4c63146..0000000 --- a/src/common/blockchain/block-reader/block-reader.message.ts +++ /dev/null @@ -1,56 +0,0 @@ -/* eslint-disable @typescript-eslint/no-unsafe-assignment */ -import { Abi } from '../abi'; -import { GetBlocksResultMessageContent } from './block-reader.types'; -import { deserializeMessage } from './block-reader.utils'; -import { Block } from './block/block'; -import { BlockJson } from './block/block.types'; - -export class BlockReaderMessage { - public static readonly version = 'v0'; - - private static isGetBlocksResultPongMessage(data: GetBlocksResultMessageContent): boolean { - return ( - typeof data.head === 'object' && - typeof data.last_irreversible === 'object' && - !data.prev_block && - !data.this_block && - !data.block && - !data.traces && - !data.deltas - ); - } - - public static create( - dto: Uint8Array, - abi: Abi - ) { - const result = deserializeMessage('result', dto, abi.getTypesMap()); - const [resultType, resultJson]: [string, MessageContentType] = result || []; - - if (resultType) { - if (resultType === `get_blocks_result_${this.version}`) { - if ( - BlockReaderMessage.isGetBlocksResultPongMessage( - resultJson - ) - ) { - return new BlockReaderMessage(resultType, null, true); - } - - (resultJson).abi_version = abi.version; - return new BlockReaderMessage( - resultType, - Block.fromJson(resultJson) - ); - } - } - - return null; - } - - private constructor( - 
public readonly type: string, - public readonly content: MessageContentType, - public readonly isPongMessage = false - ) {} -} diff --git a/src/common/blockchain/block-reader/block-reader.requests.ts b/src/common/blockchain/block-reader/block-reader.requests.ts deleted file mode 100644 index 38c334c..0000000 --- a/src/common/blockchain/block-reader/block-reader.requests.ts +++ /dev/null @@ -1,69 +0,0 @@ -import { Serialize } from 'eosjs'; -import { BlockReaderOptions } from './block-reader.types'; -import { serializeMessage } from './block-reader.utils'; - -export class GetBlocksRequest { - public readonly version = 'v0'; - - public static create( - startBlock: bigint, - endBlock: bigint, - options: BlockReaderOptions, - types: Map<string, Serialize.Type> - ) { - const { shouldFetchDeltas, shouldFetchTraces } = options; - - return new GetBlocksRequest( - startBlock, - endBlock, - shouldFetchTraces, - shouldFetchDeltas, - types - ); - } - - private constructor( - public readonly startBlock: bigint, - public readonly endBlock: bigint, - public readonly shouldFetchTraces: boolean, - public readonly shouldFetchDeltas: boolean, - public readonly types: Map<string, Serialize.Type> - ) {} - - public toUint8Array(): Uint8Array { - return serializeMessage( - 'request', - [ - `get_blocks_request_${this.version}`, - { - irreversible_only: false, - start_block_num: Number(this.startBlock.toString()), - end_block_num: Number(this.endBlock.toString()), - max_messages_in_flight: 1, - have_positions: [], - fetch_block: true, - fetch_traces: this.shouldFetchTraces, - fetch_deltas: this.shouldFetchDeltas, - }, - ], - this.types - ); - } -} - -export class GetBlocksAckRequest { - public readonly version = 'v0'; - - constructor( - public readonly messagesCount: number, - public readonly types: Map<string, Serialize.Type> - ) {} - - public toUint8Array() { - return serializeMessage( - 'request', - [`get_blocks_ack_request_${this.version}`, { num_messages: this.messagesCount }], - this.types - ); - } -} diff --git 
a/src/common/blockchain/block-reader/block-reader.source.ts b/src/common/blockchain/block-reader/block-reader.source.ts deleted file mode 100644 index fb0fb37..0000000 --- a/src/common/blockchain/block-reader/block-reader.source.ts +++ /dev/null @@ -1,135 +0,0 @@ -/* eslint-disable @typescript-eslint/no-unsafe-assignment */ -/* eslint-disable @typescript-eslint/no-unsafe-call */ -/* eslint-disable @typescript-eslint/no-unsafe-member-access */ -/* eslint-disable @typescript-eslint/no-unsafe-return */ -import { log } from '@alien-worlds/api-core'; -import WebSocket from 'ws'; -import { BlockReaderConfig } from './block-reader.config'; -import { BlockReaderConnectionState } from './block-reader.enums'; -import { ConnectionChangeHandler } from './block-reader.types'; - -export class BlockReaderSource { - private messageHandler: (...args: unknown[]) => void; - private errorHandler: (...args: unknown[]) => void; - private client: WebSocket; - private endpoint: string; - private connectionState = BlockReaderConnectionState.Idle; - private connectionChangeHandlers: Map< - BlockReaderConnectionState, - ConnectionChangeHandler - > = new Map(); - private socketIndex = -1; - private reconnectDelay: number; - - constructor(private readonly config: BlockReaderConfig) {} - - private async updateConnectionState(state: BlockReaderConnectionState, data?: string) { - // capture the previous state before it is overwritten - const previousState = this.connectionState; - this.connectionState = state; - this.reconnectDelay = this.config.reconnectInterval || 1000; - const handler = this.connectionChangeHandlers.get(state); - if (handler) { - return handler({ previousState, state, data }); - } - } - - private getNextEndpoint() { - let nextIndex = ++this.socketIndex; - - if (nextIndex >= this.config.endpoints.length) { - nextIndex = 0; - } - this.socketIndex = nextIndex; - - return this.config.endpoints[this.socketIndex]; - } - - private waitUntilConnectionIsOpen() { - log(`BlockReader plugin connecting to: ${this.endpoint}`); - return new Promise(resolve => { - 
this.client.once('open', () => { - log(`BlockReader plugin connection open.`); - resolve(true); - }); - }); - } - - private async onConnectionClosed(code: number) { - this.client = null; - log(`BlockReader plugin connection closed with code #${code}.`); - await this.updateConnectionState(BlockReaderConnectionState.Idle); - } - - private receiveAbi() { - return new Promise(resolve => this.client.once('message', resolve)); - } - - public onError(handler: (error: Error) => void) { - this.errorHandler = handler; - } - - public onMessage(handler: (dto: Uint8Array) => void) { - this.messageHandler = handler; - } - - public addConnectionStateHandler( - state: BlockReaderConnectionState, - handler: ConnectionChangeHandler - ) { - if (this.connectionChangeHandlers.has(state)) { - console.warn(`A handler is already assigned to the "${state}" state, skipping the new one`); - } else { - this.connectionChangeHandlers.set(state, handler); - } - } - - public get isConnected() { - return this.connectionState === BlockReaderConnectionState.Connected; - } - - public async connect() { - if (this.connectionState === BlockReaderConnectionState.Idle) { - log(`BlockReader plugin connecting...`); - try { - await this.updateConnectionState(BlockReaderConnectionState.Connecting); - this.endpoint = this.getNextEndpoint(); - this.client = new WebSocket(this.endpoint, { - perMessageDeflate: false, - }); - this.client.on('close', code => this.onConnectionClosed(code)); - this.client.on('error', error => this.errorHandler(error)); - await this.waitUntilConnectionIsOpen(); - // receive ABI - first message from WS is always ABI - const abi = await this.receiveAbi(); - // set message handler - this.client.on('message', message => this.messageHandler(message)); - - await this.updateConnectionState(BlockReaderConnectionState.Connected, abi); - } catch (error) { - setTimeout( - () => this.updateConnectionState(BlockReaderConnectionState.Idle), - this.reconnectDelay - ); - this.connectionState = 
BlockReaderConnectionState.Idle; - this.errorHandler(error); - } - } - } - - public async disconnect() { - if (this.connectionState === BlockReaderConnectionState.Connected) { - log(`BlockReader plugin disconnecting...`); - try { - await this.updateConnectionState(BlockReaderConnectionState.Disconnecting); - this.client.removeAllListeners(); - this.client.close(); - } catch (error) { - this.errorHandler(error); - } - } - } - - public send(message: Uint8Array) { - this.client.send(message); - } -} diff --git a/src/common/blockchain/block-reader/block-reader.ts b/src/common/blockchain/block-reader/block-reader.ts deleted file mode 100644 index 412f17e..0000000 --- a/src/common/blockchain/block-reader/block-reader.ts +++ /dev/null @@ -1,243 +0,0 @@ -/* eslint-disable @typescript-eslint/restrict-template-expressions */ -import { MongoSource, log } from '@alien-worlds/api-core'; -import { BlockReaderConfig } from './block-reader.config'; -import { BlockReaderConnectionState } from './block-reader.enums'; -import { - AbiNotFoundError, - MissingHandlersError, - UnhandledMessageError, - UnhandledMessageTypeError, -} from './block-reader.errors'; -import { BlockReaderSource } from './block-reader.source'; -import { BlockReaderOptions, ConnectionChangeHandlerOptions } from './block-reader.types'; -import { BlockReaderMessage } from './block-reader.message'; -import { GetBlocksAckRequest, GetBlocksRequest } from './block-reader.requests'; -import { Block } from './block/block'; -import { ShipAbiSource } from '../../ship/ship-abi.source'; -import { Abi, AbiJson } from '../abi'; - -export class BlockReader { - public static async create(config: BlockReaderConfig): Promise<BlockReader> { - const mongoSource = await MongoSource.create(config.mongo); - const shipAbiSource = new ShipAbiSource(mongoSource); - const source = new BlockReaderSource(config); - source.onError(error => log(error)); - - return new BlockReader(source, shipAbiSource); - } - - private errorHandler: (error: Error) => void; 
- private warningHandler: (...args: unknown[]) => void; - private receivedBlockHandler: (content: Block) => Promise<void> | void; - private blockRangeCompleteHandler: ( - startBlock: bigint, - endBlock: bigint - ) => Promise<void>; - private _blockRangeRequest: GetBlocksRequest; - private _abi: Abi; - private _paused = false; - private isLastBlock = false; - - constructor(private source: BlockReaderSource, private shipAbi: ShipAbiSource) { - this.source.onMessage(message => this.onMessage(message)); - this.source.onError(error => { - this.handleError(error); - }); - this.source.addConnectionStateHandler(BlockReaderConnectionState.Connected, options => - this.onConnected(options) - ); - this.source.addConnectionStateHandler(BlockReaderConnectionState.Idle, options => - this.onDisconnected(options) - ); - } - - private async onConnected({ data }: ConnectionChangeHandlerOptions) { - log(`BlockReader plugin connected`); - - const abi = Abi.fromJson(JSON.parse(data) as AbiJson); - if (abi) { - const result = await this.shipAbi.getAbi(abi.version); - - if (result.isFailure) { - await this.shipAbi.updateAbi(abi); - } - this._abi = abi; - } - } - - private onDisconnected({ previousState }: ConnectionChangeHandlerOptions) { - log(`BlockReader plugin disconnected`); - if (previousState === BlockReaderConnectionState.Disconnecting) { - this._abi = null; - } - this.connect(); - } - - public get abi(): Abi { - return this._abi; - } - - public onMessage(dto: Uint8Array): void { - const { abi } = this; - - if (!abi) { - this.handleError(new AbiNotFoundError()); - return; - } - - const message = BlockReaderMessage.create(dto, abi); - - if (message && message.isPongMessage === false) { - this.handleBlocksResultContent(message.content); - } else if (!message) { - // "message" is null when the payload could not be matched to a known type - this.handleError(new UnhandledMessageTypeError('unknown')); - } - } - - private async handleBlocksResultContent(result: Block) { - const { thisBlock } = result; - const { abi } = this; - - // skip any extra result messages - if 
(this.isLastBlock) { - return; - } - - if (!abi) { - this.handleError(new AbiNotFoundError()); - return; - } - - try { - if (thisBlock) { - const { - _blockRangeRequest: { startBlock, endBlock }, - } = this; - this.isLastBlock = thisBlock.blockNumber === endBlock - 1n; - - if (this.isLastBlock) { - await this.receivedBlockHandler(result); - this.blockRangeCompleteHandler(startBlock, endBlock); - } else { - this.receivedBlockHandler(result); - // The state history plugin will answer every ack_request call; even after - // processing the full range, it will send messages containing only the head. - // After the block has been processed, the connection should be closed, so - // there is no need to ack the request. - if (this.source.isConnected && this._paused === false) { - // Acknowledge the request so that the source can send the next one. - this.source.send( - new GetBlocksAckRequest(1, abi.getTypesMap()).toUint8Array() - ); - } - } - } else { - this.handleWarning(`the received message does not contain this_block`); - } - } catch (error) { - this.handleError(new UnhandledMessageError(result, error)); - } - } - - private handleError(error: Error) { - if (this.errorHandler) { - return this.errorHandler(error); - } - } - - private handleWarning(...args: unknown[]) { - if (this.warningHandler) { - return this.warningHandler(...args); - } - } - - public async connect(): Promise<void> { - if (this.source.isConnected === false) { - await this.source.connect(); - } else { - log(`Service already connected`); - } - } - - public async disconnect(): Promise<void> { - if (this.source.isConnected) { - await this.source.disconnect(); - } else { - log(`Service not connected`); - } - } - - public pause(): void { - if (this._paused === false) { - this._paused = true; - } - } - - public resume(): void { - if (this._paused && !this.isLastBlock) { - this._paused = false; - this.source.send(new GetBlocksAckRequest(1, this.abi.getTypesMap()).toUint8Array()); - } - } - - public readBlocks( - startBlock: bigint, - 
endBlock: bigint, - options?: BlockReaderOptions - ): void { - this.sendRequest(startBlock, endBlock, options); - log(`BlockReader plugin: read blocks`, { startBlock, endBlock }); - } - - public readOneBlock(block: bigint, options?: BlockReaderOptions): void { - this.sendRequest(block, block + 1n, options); - log(`BlockReader plugin: read single block ${block}`); - } - - private sendRequest( - startBlock: bigint, - endBlock: bigint, - options?: BlockReaderOptions - ): void { - const requestOptions = options || { - shouldFetchDeltas: true, - shouldFetchTraces: true, - }; - - this.isLastBlock = false; - this.resume(); - - const { abi, receivedBlockHandler, source } = this; - if (!receivedBlockHandler) { - throw new MissingHandlersError(); - } - - if (!abi) { - throw new AbiNotFoundError(); - } - - this._blockRangeRequest = GetBlocksRequest.create( - startBlock, - endBlock, - requestOptions, - abi.getTypesMap() - ); - source.send(this._blockRangeRequest.toUint8Array()); - } - - public onReceivedBlock(handler: (content: Block) => Promise | void) { - this.receivedBlockHandler = handler; - } - - public onComplete(handler: (startBlock: bigint, endBlock: bigint) => Promise) { - this.blockRangeCompleteHandler = handler; - } - - public onError(handler: (error: Error) => void) { - this.errorHandler = handler; - } - - public onWarning(handler: (...args: unknown[]) => void) { - this.warningHandler = handler; - } -} diff --git a/src/common/blockchain/block-reader/block-reader.types.ts b/src/common/blockchain/block-reader/block-reader.types.ts deleted file mode 100644 index 230a818..0000000 --- a/src/common/blockchain/block-reader/block-reader.types.ts +++ /dev/null @@ -1,40 +0,0 @@ -import { BlockReaderConnectionState } from './block-reader.enums'; - -export type ConnectionChangeHandlerOptions = { - previousState: BlockReaderConnectionState; - state: BlockReaderConnectionState; - data: string; -}; - -export type ConnectionChangeHandler = ( - options: 
ConnectionChangeHandlerOptions -) => void | Promise<void>; - -export type BlockReaderOptions = { - shouldFetchDeltas?: boolean; - shouldFetchTraces?: boolean; - shouldFetchBlock?: boolean; -}; - -export type GetBlocksResultMessageContent = { - head?: { - block_num: number; - block_id: string; - }; - last_irreversible?: { - block_num: number; - block_id: string; - }; - this_block?: { - block_num: number; - block_id: string; - }; - prev_block?: { - block_num: number; - block_id: string; - }; - block?: Uint8Array; - traces?: Uint8Array; - deltas?: Uint8Array; - [key: string]: unknown; -}; diff --git a/src/common/blockchain/block-reader/block-reader.utils.ts b/src/common/blockchain/block-reader/block-reader.utils.ts deleted file mode 100644 index f990211..0000000 --- a/src/common/blockchain/block-reader/block-reader.utils.ts +++ /dev/null @@ -1,37 +0,0 @@ -/* eslint-disable @typescript-eslint/no-unsafe-call */ -/* eslint-disable @typescript-eslint/no-unsafe-assignment */ -import { Serialize } from 'eosjs'; -import { TextDecoder, TextEncoder } from 'text-encoding'; - -export const serializeMessage = ( - type: string, - value: unknown, - types: Map<string, Serialize.Type> -) => { - const buffer = new Serialize.SerialBuffer({ - textEncoder: new TextEncoder(), - textDecoder: new TextDecoder(), - }); - Serialize.getType(types, type).serialize(buffer, value); - return buffer.asUint8Array(); -}; - -// eslint-disable-next-line @typescript-eslint/no-explicit-any -export const deserializeMessage = <T>( - type: string, - array: Uint8Array, - types: Map<string, Serialize.Type> -): T => { - const buffer = new Serialize.SerialBuffer({ - textEncoder: new TextEncoder(), - textDecoder: new TextDecoder(), - array, - }); - const result = Serialize.getType(types, type).deserialize( - buffer, - new Serialize.SerializerState({ bytesAsUint8Array: true }) - ); - - if (buffer.readPos !== array.length) { - throw new Error(`Deserialization of type "${type}" did not consume the entire buffer`); // todo: remove check - } - return result; -}; diff --git a/src/common/blockchain/block-reader/block/block.ts 
b/src/common/blockchain/block-reader/block/block.ts deleted file mode 100644 index af6086f..0000000 --- a/src/common/blockchain/block-reader/block/block.ts +++ /dev/null @@ -1,145 +0,0 @@ -import { MongoDB, parseToBigInt } from '@alien-worlds/api-core'; -import { - BlockDocument, - BlockJson, - BlockNumberWithIdDocument, - BlockNumberWithIdJson, -} from './block.types'; - -export class BlockNumberWithId { - public static fromJson(dto: BlockNumberWithIdJson) { - const { block_id, block_num } = dto; - return new BlockNumberWithId(parseToBigInt(block_num), block_id); - } - - public static fromDocument(dto: BlockNumberWithIdDocument) { - const { block_id, block_num } = dto; - return new BlockNumberWithId(parseToBigInt(block_num), block_id); - } - - private constructor( - public readonly blockNumber: bigint, - public readonly blockId: string - ) {} - - public toJson() { - return { - block_num: this.blockNumber.toString(), - block_id: this.blockId, - }; - } - - public toDocument() { - return { - block_num: MongoDB.Long.fromBigInt(this.blockNumber), - block_id: this.blockId, - }; - } -} - -export class Block { - public static fromJson(json: BlockJson): Block { - const { block, traces, deltas, abi_version } = json; - const head = BlockNumberWithId.fromJson(json.head); - const lastIrreversible = BlockNumberWithId.fromJson(json.last_irreversible); - const prevBlock = BlockNumberWithId.fromJson(json.prev_block); - const thisBlock = BlockNumberWithId.fromJson(json.this_block); - - return new Block( - head, - lastIrreversible, - prevBlock, - thisBlock, - block, - traces, - deltas, - abi_version - ); - } - - public static fromDocument(content: BlockDocument): Block { - const { block, traces, deltas, _id, abi_version } = content; - const head = BlockNumberWithId.fromDocument(content.head); - const lastIrreversible = BlockNumberWithId.fromDocument(content.last_irreversible); - const prevBlock = BlockNumberWithId.fromDocument(content.prev_block); - const thisBlock = 
BlockNumberWithId.fromDocument(content.this_block); - - return new Block( - head, - lastIrreversible, - prevBlock, - thisBlock, - block.buffer, - traces.buffer, - deltas.buffer, - abi_version, - _id.toString() - ); - } - - private constructor( - public readonly head: BlockNumberWithId, - public readonly lastIrreversible: BlockNumberWithId, - public readonly prevBlock: BlockNumberWithId, - public readonly thisBlock: BlockNumberWithId, - public readonly block: Uint8Array, - public readonly traces: Uint8Array, - public readonly deltas: Uint8Array, - public readonly abiVersion?: string, - public readonly id?: string - ) {} - - public toJson(): BlockJson { - const { head, thisBlock, prevBlock, lastIrreversible, block, traces, deltas, abiVersion } = this; - - const json: BlockJson = { - head: head.toJson(), - this_block: thisBlock.toJson(), - prev_block: prevBlock.toJson(), - last_irreversible: lastIrreversible.toJson(), - block, - traces, - deltas, - }; - - if (abiVersion) { - json.abi_version = abiVersion; - } - - return json; - } - - public toDocument(): BlockDocument { - const { - head, - thisBlock, - prevBlock, - lastIrreversible, - block, - traces, - deltas, - id, - abiVersion, - } = this; - - const document: BlockDocument = { - head: head.toDocument(), - this_block: thisBlock.toDocument(), - prev_block: prevBlock.toDocument(), - last_irreversible: lastIrreversible.toDocument(), - block: new MongoDB.Binary(block), - traces: new MongoDB.Binary(traces), - deltas: new MongoDB.Binary(deltas), - }; - - if (abiVersion) { - document.abi_version = abiVersion; - } - - if (id) { - document._id = new MongoDB.ObjectId(id); - } - - return document; - } -} diff --git a/src/common/blockchain/block-reader/block/block.types.ts b/src/common/blockchain/block-reader/block/block.types.ts deleted file mode 100644 index 356c5b0..0000000 --- a/src/common/blockchain/block-reader/block/block.types.ts +++ /dev/null @@ -1,35 +0,0 @@ -import { MongoDB } from '@alien-worlds/api-core'; - -export 
type BlockNumberWithIdJson = { - block_num: string; - block_id: string; -}; - -export type BlockJson = { - head?: BlockNumberWithIdJson; - this_block?: BlockNumberWithIdJson; - last_irreversible?: BlockNumberWithIdJson; - prev_block?: BlockNumberWithIdJson; - block?: Uint8Array; - traces?: Uint8Array; - deltas?: Uint8Array; - abi_version?: string; -}; - -export type BlockDocument = { - head?: BlockNumberWithIdDocument; - this_block?: BlockNumberWithIdDocument; - last_irreversible?: BlockNumberWithIdDocument; - prev_block?: BlockNumberWithIdDocument; - block?: MongoDB.Binary; - traces?: MongoDB.Binary; - deltas?: MongoDB.Binary; - _id?: MongoDB.ObjectId; - abi_version?: string; - [key: string]: unknown; -}; - -export type BlockNumberWithIdDocument = { - block_num?: MongoDB.Long; - block_id?: string; -}; diff --git a/src/common/blockchain/block-reader/block/index.ts b/src/common/blockchain/block-reader/block/index.ts deleted file mode 100644 index 788a4b9..0000000 --- a/src/common/blockchain/block-reader/block/index.ts +++ /dev/null @@ -1,2 +0,0 @@ -export * from './block'; -export * from './block.types'; diff --git a/src/common/blockchain/block-reader/index.ts b/src/common/blockchain/block-reader/index.ts deleted file mode 100644 index 6475d7c..0000000 --- a/src/common/blockchain/block-reader/index.ts +++ /dev/null @@ -1,9 +0,0 @@ -export * from './block-reader.config'; -export * from './block-reader.enums'; -export * from './block-reader.errors'; -export * from './block-reader.message'; -export * from './block-reader.requests'; -export * from './block-reader.source'; -export * from './block-reader'; -export * from './block-reader.types'; -export * from './block-reader.utils'; diff --git a/src/common/blockchain/blockchain.ts b/src/common/blockchain/blockchain.ts deleted file mode 100644 index dd8f661..0000000 --- a/src/common/blockchain/blockchain.ts +++ /dev/null @@ -1,45 +0,0 @@ -import fetch from 'node-fetch'; -import { parseToBigInt } from 
'@alien-worlds/api-core'; -import { Api, JsonRpc } from 'eosjs'; -import { GetInfoResult } from 'eosjs/dist/eosjs-rpc-interfaces'; -import { BlockchainConfig } from './blockchain.types'; - -export class Blockchain { - public static create(config: BlockchainConfig): Blockchain { - const { endpoint, chainId } = config; - const api = new Api({ - rpc: new JsonRpc(endpoint, { fetch }), - chainId, - signatureProvider: null, - textDecoder: new TextDecoder(), - textEncoder: new TextEncoder(), - }); - - return new Blockchain(endpoint, chainId, api); - } - - private constructor( - protected endpoint: string, - protected chainId: string, - protected api: Api - ) {} - - public getInfo = async (): Promise<GetInfoResult> => { - return this.api.rpc.get_info(); - }; - - public async getHeadBlockNumber(): Promise<bigint> { - const info = await this.api.rpc.get_info(); - const value = parseToBigInt(info.head_block_num); - return value; - } - - public async getLastIrreversibleBlockNumber(): Promise<bigint> { - const info = await this.api.rpc.get_info(); - const value = parseToBigInt(info.last_irreversible_block_num); - return value; - } -} - -//log(`Head block number: ${value.toString()}`); -//log(`Last irreversible block number: ${value.toString()}`); \ No newline at end of file diff --git a/src/common/blockchain/blockchain.types.ts b/src/common/blockchain/blockchain.types.ts deleted file mode 100644 index d3479cb..0000000 --- a/src/common/blockchain/blockchain.types.ts +++ /dev/null @@ -1,4 +0,0 @@ -export type BlockchainConfig = { - endpoint: string; - chainId: string; -}; diff --git a/src/common/blockchain/contract-reader/contract-reader.config.ts b/src/common/blockchain/contract-reader/contract-reader.config.ts deleted file mode 100644 index 3d52795..0000000 --- a/src/common/blockchain/contract-reader/contract-reader.config.ts +++ /dev/null @@ -1,3 +0,0 @@ -export type ContractReaderConfig = { - url: string; -}; diff --git a/src/common/blockchain/contract-reader/contract-reader.dtos.ts 
b/src/common/blockchain/contract-reader/contract-reader.dtos.ts deleted file mode 100644 index 61cf327..0000000 --- a/src/common/blockchain/contract-reader/contract-reader.dtos.ts +++ /dev/null @@ -1,17 +0,0 @@ -import { MongoDB } from '@alien-worlds/api-core'; - -export type FeaturedContractDocument = { - _id?: MongoDB.ObjectId; - account?: string; - initial_block_number?: MongoDB.Long; -}; - -export type FeaturedContractModel = { - account: string; - initialBlockNumber: bigint; -}; - -export type FetchContractResponse = { - account: string; - block_num: string | number; -}; diff --git a/src/common/blockchain/contract-reader/contract-reader.ts b/src/common/blockchain/contract-reader/contract-reader.ts deleted file mode 100644 index 4ccdc4c..0000000 --- a/src/common/blockchain/contract-reader/contract-reader.ts +++ /dev/null @@ -1,95 +0,0 @@ -import fetch from 'node-fetch'; -import { MongoConfig, MongoSource, log } from '@alien-worlds/api-core'; -import { ContractReaderConfig } from './contract-reader.config'; -import { FetchContractResponse } from './contract-reader.dtos'; -import { FeaturedContract } from './featured-contract'; -import { FeaturedContractSource } from './featured-contract.source'; - -export abstract class ContractReader { - public static async create( - config: ContractReaderConfig, - mongo: MongoSource | MongoConfig - ): Promise<ContractReader> { - let mongoSource: MongoSource; - - log(` * Contract Reader ... [starting]`); - - if (mongo instanceof MongoSource) { - mongoSource = mongo; - } else { - mongoSource = await MongoSource.create(mongo); - } - const source = new FeaturedContractSource(mongoSource); - const contractReader = new ContractReaderService(source, config); - - log(` * Contract Reader ... 
[ready]`); - - return contractReader; - } - - public abstract getInitialBlockNumber(contract: string): Promise<bigint>; - public abstract readContracts(contracts: string[]): Promise<FeaturedContract[]>; -} - -export class ContractReaderService implements ContractReader { - private cache: Map<string, FeaturedContract> = new Map(); - - constructor( - private source: FeaturedContractSource, - private config: ContractReaderConfig - ) {} - - private async fetchContract(account: string): Promise<FetchContractResponse> { - try { - const { url } = this.config; - - const res = await fetch( - `${url}/v2/history/get_actions?account=eosio&act.name=setabi&act.authorization.actor=${account}&limit=1&sort=asc` - ); - const json = await res.json(); - - const block_num = json.actions[0].block_num; - return { account, block_num }; - } catch (error) { - log(`An error occurred while retrieving contract data. ${error.message}`); - return null; - } - } - - public async getInitialBlockNumber(contract: string): Promise<bigint> { - try { - const list = await this.readContracts([contract]); - return list[0].initialBlockNumber; - } catch (error) { - return -1n; - } - } - - public async readContracts(contracts: string[]): Promise<FeaturedContract[]> { - const list: FeaturedContract[] = []; - for (const contract of contracts) { - let entity: FeaturedContract; - if (this.cache.has(contract)) { - list.push(this.cache.get(contract)); - } else { - const document = await this.source.findOne({ filter: { account: contract } }); - - if (document) { - entity = FeaturedContract.fromDocument(document); - this.cache.set(entity.account, entity); - list.push(entity); - } else { - const resp = await this.fetchContract(contract); - if (resp) { - entity = FeaturedContract.create(resp.account, resp.block_num); - this.cache.set(entity.account, entity); - this.source.insert(entity.toDocument()); - list.push(entity); - } - } - } - } - - return list; - } -} diff --git a/src/common/blockchain/contract-reader/featured-contract.source.ts b/src/common/blockchain/contract-reader/featured-contract.source.ts deleted file mode 
100644 index 235c43c..0000000 --- a/src/common/blockchain/contract-reader/featured-contract.source.ts +++ /dev/null @@ -1,30 +0,0 @@ -import { CollectionMongoSource, MongoDB, MongoSource } from '@alien-worlds/api-core'; -import { FeaturedContractDocument } from './contract-reader.dtos'; - -export class FeaturedContractSource extends CollectionMongoSource<FeaturedContractDocument> { - constructor(mongoSource: MongoSource) { - super(mongoSource, 'history_tools.featured_contracts', { - indexes: [ - { key: { account: 1 }, background: true }, - { key: { initial_block_number: 1, account: 1 }, unique: true, background: true }, - ], - }); - } - - public async getInitialBlockNumber(account: string): Promise<MongoDB.Long> { - const contract: FeaturedContractDocument = await this.findOne({ - filter: { account }, - }); - return contract ? contract.initial_block_number : MongoDB.Long.MIN_VALUE; - } - - public async newState(account: string, initialBlockNumber: bigint): Promise<void> { - await this.update( - { - initial_block_number: MongoDB.Long.fromBigInt(initialBlockNumber), - account, - }, - { options: { upsert: true } } - ); - } -} diff --git a/src/common/blockchain/contract-reader/featured-contract.ts b/src/common/blockchain/contract-reader/featured-contract.ts deleted file mode 100644 index 73e7548..0000000 --- a/src/common/blockchain/contract-reader/featured-contract.ts +++ /dev/null @@ -1,49 +0,0 @@ -import { - MongoDB, - parseToBigInt, - removeUndefinedProperties, -} from '@alien-worlds/api-core'; -import { FeaturedContractDocument } from './contract-reader.dtos'; - -export class FeaturedContract { - /** - * @constructor - * @private - * @param {string} id - * @param {bigint} initialBlockNumber - * @param {string} account - */ - private constructor( - public readonly id: string, - public readonly initialBlockNumber: bigint, - public readonly account: string - ) {} - - public static create(account: string, initialBlockNumber: string | number) { - return new FeaturedContract('', parseToBigInt(initialBlockNumber), 
account); - } - - public static fromDocument(document: FeaturedContractDocument) { - const { initial_block_number, _id, account } = document; - - return new FeaturedContract( - _id ? _id.toString() : '', - parseToBigInt(initial_block_number), - account - ); - } - - public toDocument() { - const { id, initialBlockNumber, account } = this; - const doc: FeaturedContractDocument = { - initial_block_number: MongoDB.Long.fromBigInt(initialBlockNumber), - account, - }; - - if (id) { - doc._id = new MongoDB.ObjectId(id); - } - - return removeUndefinedProperties(doc); - } -} diff --git a/src/common/blockchain/contract-reader/index.ts b/src/common/blockchain/contract-reader/index.ts deleted file mode 100644 index fd79e8c..0000000 --- a/src/common/blockchain/contract-reader/index.ts +++ /dev/null @@ -1,5 +0,0 @@ -export * from './contract-reader'; -export * from './contract-reader.config'; -export * from './contract-reader.dtos'; -export * from './featured-contract.source'; -export * from './featured-contract'; diff --git a/src/common/blockchain/contract/action-trace/__tests__/action-trace.unit.test.ts b/src/common/blockchain/contract/action-trace/__tests__/action-trace.unit.test.ts deleted file mode 100644 index 4c82cef..0000000 --- a/src/common/blockchain/contract/action-trace/__tests__/action-trace.unit.test.ts +++ /dev/null @@ -1,63 +0,0 @@ -/* eslint-disable @typescript-eslint/no-explicit-any */ -import { Act, ActionTrace, Receipt } from '../action-trace'; -import { ActJson, ActionTraceDto } from '../action-trace.dtos'; - -const actDto: ActJson = { - account: 'foo.account', - name: 'foo.name', - authorization: { - actor: 'foo.actor', - permission: 'foo.permission', - }, - data: [] as any, -}; - -const receiptDto = { - receiver: 'foo', - act_digest: 'act', - global_sequence: '100', - recv_sequence: '100', - auth_sequence: [{ account: 'foo', sequence: 'foo_sequence' }], - code_sequence: 100, - abi_sequence: 100, -}; - -describe('Act Unit tests', () => { - it('"create" 
should create entity', () => { - const entity = Act.create(actDto); - expect(entity.account).toEqual(actDto.account); - expect(entity.name).toEqual(actDto.name); - expect(entity.authorization).toEqual(actDto.authorization); - expect(entity.data).toEqual(actDto.data); - }); -}); - -describe('Receipt Unit tests', () => { - it('"create" should create entity', () => { - const entity = Receipt.create('foo', receiptDto); - - expect(entity).not.toBeUndefined(); - }); -}); - -const actionTraceDto: ActionTraceDto = { - action_ordinal: 1, - creator_action_ordinal: 1, - receipt: ['foo', receiptDto], - receiver: 'receiver', - act: actDto, - context_free: true, - elapsed: 'elapsed', - console: 'foo_console', - account_ram_deltas: [], - except: '', - error_code: '200', -}; - -describe('ActionTrace Unit tests', () => { - it('"create" should create entity', () => { - const entity = ActionTrace.create('foo', actionTraceDto); - - expect(entity).not.toBeUndefined(); - }); -}); diff --git a/src/common/blockchain/contract/action-trace/action-trace.dtos.ts b/src/common/blockchain/contract/action-trace/action-trace.dtos.ts deleted file mode 100644 index a080d68..0000000 --- a/src/common/blockchain/contract/action-trace/action-trace.dtos.ts +++ /dev/null @@ -1,80 +0,0 @@ -export type AuthSequenceJson = { - account: string; - sequence: string; -}; - -export type ReceiptJson = { - receiver: string; - act_digest: string; - global_sequence: string; - recv_sequence: string; - auth_sequence: AuthSequenceJson[]; - code_sequence: number; - abi_sequence: number; -}; - -export type ReceiptByNameDto = [string, ReceiptJson]; - -export type ActAuthJson = { - actor: string; - permission: string; -}; - -export type ActJson = { - account: string; - name: string; - authorization: ActAuthJson; - data: Uint8Array; -}; - -export type ActionTraceDto = { - ship_message_name?: string; - action_ordinal?: number; - creator_action_ordinal?: number; - receipt?: ReceiptByNameDto; - receiver?: string; - act?: 
ActJson; - context_free?: boolean; - elapsed?: string; - console?: string; - account_ram_deltas?: unknown[]; - except?: unknown; - error_code?: string | number; -}; - -export type ActionTraceByNameDto = [string, ActionTraceDto]; - -export type ActionTraceModel = { - shipTraceMessageName?: string; - actionOrdinal?: number; - creatorActionOrdinal?: number; - receipt?: { - shipMessageName?: string; - receiver?: string; - actDigest?: string; - globalSequence?: bigint; - recvSequence?: bigint; - authSequence?: { - account?: string; - sequence?: string; - }[]; - codeSequence?: number; - abiSequence?: number; - }; - receiver?: string; - act?: { - account?: string; - name?: string; - authorization?: { - actor?: string; - permission?: string; - }; - data?: Uint8Array; - }; - isContextFree?: boolean; - elapsed?: string; - console?: string; - accountRamDeltas?: unknown[]; - except?: unknown; - errorCode?: number; -}; diff --git a/src/common/blockchain/contract/action-trace/action-trace.ts b/src/common/blockchain/contract/action-trace/action-trace.ts deleted file mode 100644 index d461625..0000000 --- a/src/common/blockchain/contract/action-trace/action-trace.ts +++ /dev/null @@ -1,127 +0,0 @@ -import { parseToBigInt } from '@alien-worlds/api-core'; -import { ActAuthJson, ActJson, ActionTraceDto, ReceiptJson } from './action-trace.dtos'; - -export class ActAuth { - public static create(dto: ActAuthJson): ActAuth { - const { actor, permission } = dto; - - return new ActAuth(actor, permission); - } - private constructor( - public readonly actor: string, - public readonly permission: string - ) {} -} - -export class Act { - public static create(dto: ActJson): Act { - const { account, name, data } = dto; - - //parse DATA - let authorization: ActAuth; - - if (dto.authorization) { - authorization = ActAuth.create(dto.authorization); - } - - return new Act(account, name, authorization, data); - } - private constructor( - public readonly account: string, - public readonly name: 
string, - public readonly authorization: ActAuth, - public readonly data: Uint8Array - ) {} -} - -export type AuthSequence = { - account: string; - sequence: string; -}; - -export class Receipt { - public static create(shipMessageName: string, dto: ReceiptJson): Receipt { - const { - receiver, - act_digest, - global_sequence, - recv_sequence, - auth_sequence, - code_sequence, - abi_sequence, - } = dto; - return new Receipt( - shipMessageName, - receiver, - act_digest, - parseToBigInt(global_sequence), - parseToBigInt(recv_sequence), - auth_sequence, - code_sequence, - abi_sequence - ); - } - private constructor( - public readonly shipMessageName: string, - public readonly receiver: string, - public readonly actDigest: string, - public readonly globalSequence: bigint, - public readonly recvSequence: bigint, - public readonly authSequence: AuthSequence[], - public readonly codeSequence: number, - public readonly abiSequence: number - ) {} -} - -export class ActionTrace { - public static create(shipMessageName: string, dto: ActionTraceDto): ActionTrace { - const { - action_ordinal, - creator_action_ordinal, - receiver, - act, - context_free, - elapsed, - console, - account_ram_deltas, - except, - error_code, - } = dto; - - let receipt: Receipt; - if (dto.receipt && dto.receipt.length) { - const [receiptType, receiptContent] = dto.receipt; - receipt = Receipt.create(receiptType, receiptContent); - } - - return new ActionTrace( - shipMessageName, - action_ordinal, - creator_action_ordinal, - receipt, - receiver, - Act.create(act), - context_free, - elapsed, - console, - account_ram_deltas, - except, - Number(error_code) - ); - } - - private constructor( - public readonly shipMessageName: string, - public readonly actionOrdinal: number, - public readonly creatorActionOrdinal: number, - public readonly receipt: Receipt | null, - public readonly receiver: string, - public readonly act: Act, - public readonly isContextFree: boolean, - public readonly elapsed: string, - 
public readonly console: string, - public readonly accountRamDeltas: unknown[], - public readonly except: unknown, - public readonly errorCode: number - ) {} -} diff --git a/src/common/blockchain/contract/action-trace/index.ts b/src/common/blockchain/contract/action-trace/index.ts deleted file mode 100644 index 986e248..0000000 --- a/src/common/blockchain/contract/action-trace/index.ts +++ /dev/null @@ -1,2 +0,0 @@ -export * from './action-trace'; -export * from './action-trace.dtos'; \ No newline at end of file diff --git a/src/common/blockchain/contract/delta/__tests__/delta.unit.test.ts b/src/common/blockchain/contract/delta/__tests__/delta.unit.test.ts deleted file mode 100644 index 41a198f..0000000 --- a/src/common/blockchain/contract/delta/__tests__/delta.unit.test.ts +++ /dev/null @@ -1,14 +0,0 @@ -/* eslint-disable @typescript-eslint/no-unused-vars */ -/* eslint-disable @typescript-eslint/no-explicit-any */ - -import { Delta } from '../delta'; - -describe('Delta Unit tests', () => { - it('"create" should create Delta entity based on given DTO', async () => { - const entity = Delta.create('foo', { - name: 'delta', - rows: [{ present: 1, data: Uint8Array.from([]) }], - }); - expect(entity).toBeInstanceOf(Delta); - }); -}); diff --git a/src/common/blockchain/contract/delta/delta.dtos.ts b/src/common/blockchain/contract/delta/delta.dtos.ts deleted file mode 100644 index f4a013f..0000000 --- a/src/common/blockchain/contract/delta/delta.dtos.ts +++ /dev/null @@ -1,22 +0,0 @@ -export type DeltaRowDto = { - present?: number; - data?: Uint8Array; -}; - -export type DeltaJson = { - name?: string; - rows?: DeltaRowDto[]; -}; - -export type DeltaByNameDto = [string, DeltaJson]; - -export type DeltaRowModel = { - present?: number; - data?: Uint8Array; -}; - -export type DeltaModel = { - shipDeltaMessageName?: string; - name?: string; - rows?: DeltaRowModel[]; -}; diff --git a/src/common/blockchain/contract/delta/delta.ts b/src/common/blockchain/contract/delta/delta.ts 
deleted file mode 100644 index cd0981b..0000000 --- a/src/common/blockchain/contract/delta/delta.ts +++ /dev/null @@ -1,31 +0,0 @@ -import { DeltaJson, DeltaRowDto } from './delta.dtos'; - -export class DeltaRow { - public static create(dto: DeltaRowDto): DeltaRow { - const { present, data } = dto; - return new DeltaRow(present, data); - } - - private constructor( - public readonly present: number, - public readonly data: Uint8Array - ) {} -} - -export class Delta { - public static create(shipMessageName: string, dto: DeltaJson): Delta { - const { name, rows } = dto; - - return new Delta( - shipMessageName, - name, - rows.map(dto => DeltaRow.create(dto)) - ); - } - - private constructor( - public readonly shipDeltaMessageName: string, - public readonly name: string, - public readonly rows: DeltaRow[] - ) {} -} diff --git a/src/common/blockchain/contract/delta/index.ts b/src/common/blockchain/contract/delta/index.ts deleted file mode 100644 index 248ee3a..0000000 --- a/src/common/blockchain/contract/delta/index.ts +++ /dev/null @@ -1,2 +0,0 @@ -export * from './delta'; -export * from './delta.dtos'; \ No newline at end of file diff --git a/src/common/blockchain/contract/index.ts b/src/common/blockchain/contract/index.ts deleted file mode 100644 index 4c0314e..0000000 --- a/src/common/blockchain/contract/index.ts +++ /dev/null @@ -1,5 +0,0 @@ -export * from './action-trace/action-trace'; -export * from './signed-block'; -export * from './delta/delta'; -export * from './trace/trace'; -export * from './transaction/transaction'; diff --git a/src/common/blockchain/contract/signed-block/__tests__/block.unit.test.ts b/src/common/blockchain/contract/signed-block/__tests__/block.unit.test.ts deleted file mode 100644 index 2afd95a..0000000 --- a/src/common/blockchain/contract/signed-block/__tests__/block.unit.test.ts +++ /dev/null @@ -1,47 +0,0 @@ -/* eslint-disable @typescript-eslint/no-unused-vars */ -/* eslint-disable @typescript-eslint/no-explicit-any */ - -import { 
SignedBlock } from '../signed-block'; -const dto = { - timestamp: '2022-06-30T12:28:32.900Z', - producer: 'some_producer', - confirmed: 0, - previous: 'previous_value', - transaction_mroot: 'mroot', - action_mroot: 'action', - schedule_version: 1, - new_producers: '', - header_extensions: [], - producer_signature: 'prod', - transactions: [ - { - status: 0, - cpu_usage_us: 1, - net_usage_words: 1, - trx: ['', ''], - }, - ] as any, -}; - -describe('Block Unit tests', () => { - beforeAll(() => { - jest.useFakeTimers(); - jest.setSystemTime(new Date(2022, 4, 5)); - }); - - afterAll(() => { - jest.useRealTimers(); - }); - - it('"create" should create Block entity based on given DTO', async () => { - const entity = SignedBlock.create(dto); - expect(entity).toBeInstanceOf(SignedBlock); - }); - - it('"create" should use system current timestamp if DTO does not have one', async () => { - dto.timestamp = ''; - const entity = SignedBlock.create(dto); - expect(entity.timestamp.toISOString()).toEqual('2022-05-04T22:00:00.000Z'); - expect(entity).toBeInstanceOf(SignedBlock); - }); -}); diff --git a/src/common/blockchain/contract/signed-block/index.ts b/src/common/blockchain/contract/signed-block/index.ts deleted file mode 100644 index 0a24efc..0000000 --- a/src/common/blockchain/contract/signed-block/index.ts +++ /dev/null @@ -1,2 +0,0 @@ -export * from './signed-block'; -export * from './signed-block.dtos'; diff --git a/src/common/blockchain/contract/signed-block/signed-block.ts b/src/common/blockchain/contract/signed-block/signed-block.ts deleted file mode 100644 index 86c2074..0000000 --- a/src/common/blockchain/contract/signed-block/signed-block.ts +++ /dev/null @@ -1,52 +0,0 @@ -/* eslint-disable @typescript-eslint/no-empty-function */ - -import { parseDateToMs } from '@alien-worlds/api-core'; -import { Transaction } from '../transaction/transaction'; -import { SignedBlockJson } from './signed-block.dtos'; - -export class SignedBlock { - public static create(dto: 
SignedBlockJson): SignedBlock { - const { - producer, - confirmed, - previous, - transaction_mroot, - action_mroot, - schedule_version, - new_producers, - header_extensions, - producer_signature, - transactions, - } = dto; - - const timestamp = dto.timestamp ? new Date(parseDateToMs(dto.timestamp)) : new Date(); - - return new SignedBlock( - timestamp, - producer, - confirmed, - previous, - transaction_mroot, - action_mroot, - schedule_version, - new_producers, - header_extensions, - producer_signature, - transactions.map(dto => Transaction.create(dto)) - ); - } - - private constructor( - public readonly timestamp: Date, - public readonly producer: string, - public readonly confirmed: number, - public readonly previous: string, - public readonly transactionMroot: string, - public readonly actionMroot: string, - public readonly scheduleVersion: number, - public readonly newProducers: unknown, - public readonly headerExtensions: unknown[], - public readonly producerSignature: string, - public readonly transactions: Transaction[] - ) {} -} diff --git a/src/common/blockchain/contract/trace/__tests__/trace.unit.test.ts b/src/common/blockchain/contract/trace/__tests__/trace.unit.test.ts deleted file mode 100644 index 0431de4..0000000 --- a/src/common/blockchain/contract/trace/__tests__/trace.unit.test.ts +++ /dev/null @@ -1,68 +0,0 @@ -/* eslint-disable @typescript-eslint/no-unused-vars */ -/* eslint-disable @typescript-eslint/no-explicit-any */ - -import { Trace } from '../trace'; - -const dto = { - id: 'foo', - status: 0, - cpu_usage_us: 100, - net_usage_words: 100, - elapsed: '', - net_usage: '100', - scheduled: false, - action_traces: [ - [ - 'action', - { - action_ordinal: 0, - creator_action_ordinal: 0, - receipt: [ - 'foo_receipt', - { - receiver: 'receiver', - act_digest: '', - global_sequence: '', - recv_sequence: '', - auth_sequence: [], - code_sequence: 0, - abi_sequence: 10, - }, - ], - receiver: 'receiver', - act: {}, - context_free: false, - elapsed: 
'elapsed', - console: 'console', - account_ram_deltas: [], - except: '', - error_code: '200', - }, - ], - ] as any, - account_ram_delta: '', - except: '', - error_code: 0, - failed_dtrx_trace: '', - partial: [ - 'foo', - { - expiration: '10000', - ref_block_num: 0, - ref_block_prefix: 0, - max_net_usage_words: 0, - max_cpu_usage_ms: 0, - delay_sec: 1, - transaction_extensions: [], - signatures: [], - context_free_data: [], - }, - ] as any, -}; - -describe('Trace Unit tests', () => { - it('"create" should create Trace entity based on given DTO', async () => { - const entity = Trace.create('foo', dto); - expect(entity).toBeInstanceOf(Trace); - }); -}); diff --git a/src/common/blockchain/contract/trace/index.ts b/src/common/blockchain/contract/trace/index.ts deleted file mode 100644 index 17d6a1a..0000000 --- a/src/common/blockchain/contract/trace/index.ts +++ /dev/null @@ -1,2 +0,0 @@ -export * from './trace'; -export * from './trace.dtos'; \ No newline at end of file diff --git a/src/common/blockchain/contract/trace/trace.dtos.ts b/src/common/blockchain/contract/trace/trace.dtos.ts deleted file mode 100644 index b257134..0000000 --- a/src/common/blockchain/contract/trace/trace.dtos.ts +++ /dev/null @@ -1,33 +0,0 @@ -import { ActionTraceByNameDto } from '../action-trace'; - -export type PartialDto = { - expiration: string; - ref_block_num: number; - ref_block_prefix: number; - max_net_usage_words: number; - max_cpu_usage_ms: number; - delay_sec: number; - transaction_extensions: unknown[]; - signatures: unknown[]; - context_free_data: unknown[]; -}; - -export type PartialByTypeDto = [string, PartialDto]; - -export type TraceJson = { - id: string; - status: number; - cpu_usage_us: number; - net_usage_words: number; - elapsed: string; - net_usage: string; - scheduled: boolean; - action_traces: ActionTraceByNameDto[]; - account_ram_delta: unknown; - except: unknown; - error_code: number | string; - failed_dtrx_trace: unknown; - partial: PartialByTypeDto; -}; - -export 
type TraceByNameDto = [string, TraceJson]; diff --git a/src/common/blockchain/contract/trace/trace.ts b/src/common/blockchain/contract/trace/trace.ts deleted file mode 100644 index e0d96ff..0000000 --- a/src/common/blockchain/contract/trace/trace.ts +++ /dev/null @@ -1,107 +0,0 @@ -/* eslint-disable @typescript-eslint/no-empty-function */ - -import { ActionTrace } from '../action-trace/action-trace'; -import { PartialDto, TraceJson } from './trace.dtos'; - -export class Partial { - public static create(type: string, dto: PartialDto): Partial { - const { - expiration, - ref_block_num, - ref_block_prefix, - max_net_usage_words, - max_cpu_usage_ms, - delay_sec, - transaction_extensions, - signatures, - context_free_data, - } = dto; - return new Partial( - type, - expiration, - ref_block_num, - ref_block_prefix, - max_net_usage_words, - max_cpu_usage_ms, - delay_sec, - transaction_extensions, - signatures, - context_free_data - ); - } - private constructor( - public readonly name: string, - public readonly expiration: string, - public readonly refBlockNumber: number, - public readonly refBlockPrefix: number, - public readonly maxNetUsageWords: number, - public readonly maxCpuUsageMs: number, - public readonly delayInSeconds: number, - public readonly transactionExtensions: unknown[], - public readonly signatures: unknown[], - public readonly contextFreeData: unknown[] - ) {} -} - -export class Trace { - public static create(shipMessageName: string, traceDto: TraceJson): Trace { - const { - id, - status, - cpu_usage_us, - net_usage_words, - elapsed, - net_usage, - scheduled, - action_traces, - account_ram_delta, - except, - error_code, - failed_dtrx_trace, - } = traceDto; - - const actionTraces = action_traces.map(item => { - const [actionTraceType, actionTraceDto] = item; - return ActionTrace.create(actionTraceType, actionTraceDto); - }); - let partial: Partial; - if (traceDto.partial) { - const [partialType, partialContent] = traceDto.partial; - partial = 
Partial.create(partialType, partialContent); - } - - return new Trace( - shipMessageName, - id, - status, - cpu_usage_us, - net_usage_words, - elapsed, - net_usage, - scheduled, - actionTraces, - account_ram_delta, - except, - Number(error_code), - failed_dtrx_trace, - partial - ); - } - - private constructor( - public readonly shipTraceMessageName: string, - public readonly id: string, - public readonly status: number, - public readonly cpuUsageUs: number, - public readonly netUsageWords: number, - public readonly elapsed: string, - public readonly netUsage: string, - public readonly scheduled: boolean, - public readonly actionTraces: ActionTrace[], - public readonly accountRamDelta: unknown, - public readonly except: unknown, - public readonly errorCode: number, - public readonly failedDtrxTrace: unknown, - public readonly partial: Partial | null - ) {} -} diff --git a/src/common/blockchain/contract/transaction/__tests__/transaction.unit.test.ts b/src/common/blockchain/contract/transaction/__tests__/transaction.unit.test.ts deleted file mode 100644 index c65bad9..0000000 --- a/src/common/blockchain/contract/transaction/__tests__/transaction.unit.test.ts +++ /dev/null @@ -1,44 +0,0 @@ -/* eslint-disable @typescript-eslint/no-unused-vars */ -/* eslint-disable @typescript-eslint/no-explicit-any */ - -import { Transaction } from '../transaction'; - -const dto = { - status: 0, - cpu_usage_us: 1, - net_usage_words: 2, - trx: ['', ''] as any, -}; - -describe('Transaction Unit tests', () => { - it('"create" should create Transaction entity based on given DTO', async () => { - const entity = Transaction.create(dto); - expect(entity).toBeInstanceOf(Transaction); - }); - - it('"create" should log warning on unknown trx type', async () => { - dto.trx = ['foo', '']; - const entity = Transaction.create(dto); - expect(entity).toBeInstanceOf(Transaction); - }); - - it('"create" should create Trx entity when trx type is "transaction_id"', async () => { - dto.trx = 
['transaction_id', 'content']; - const entity = Transaction.create(dto); - expect(entity).toBeInstanceOf(Transaction); - }); - - it('"create" should create PackedTrx entity when trx type is "transaction_id"', async () => { - dto.trx = [ - 'packed_transaction', - { - signatures: [], - compression: 0, - packed_context_free_data: '', - packed_trx: Uint8Array.from([]), - }, - ]; - const entity = Transaction.create(dto); - expect(entity).toBeInstanceOf(Transaction); - }); -}); diff --git a/src/common/blockchain/contract/transaction/index.ts b/src/common/blockchain/contract/transaction/index.ts deleted file mode 100644 index 709ffa3..0000000 --- a/src/common/blockchain/contract/transaction/index.ts +++ /dev/null @@ -1,2 +0,0 @@ -export * from './transaction'; -export * from './transaction.dtos'; diff --git a/src/common/blockchain/contract/transaction/transaction.ts b/src/common/blockchain/contract/transaction/transaction.ts deleted file mode 100644 index 843c90a..0000000 --- a/src/common/blockchain/contract/transaction/transaction.ts +++ /dev/null @@ -1,63 +0,0 @@ -/* eslint-disable @typescript-eslint/no-empty-function */ - -import { PackedTrxDto, TransactionDto } from "./transaction.dtos"; - -export class PackedTrx { - public static create(type: string, dto: PackedTrxDto): PackedTrx { - const { signatures, compression, packed_context_free_data, packed_trx } = dto; - return new PackedTrx( - type, - signatures, - compression, - packed_context_free_data, - packed_trx - ); - } - - private constructor( - public readonly type: string, - public readonly signatures: string[], - public readonly compression: number, - public readonly packedContextFreeData: unknown, - public readonly content: unknown //TODO: we should deserialize "packed_trx" - ) {} -} - -export class Trx { - public static create(type: string, dto: string): Trx { - return new Trx(type, dto); - } - - private constructor(public readonly type: string, public readonly content: string) {} -} - -export class Transaction 
{ - public static create(dto: TransactionDto): Transaction { - const { status, cpu_usage_us, net_usage_words } = dto; - - const [type, content] = dto.trx; - let trx; - - switch (type) { - case 'transaction_id': { - trx = Trx.create(type, content); - break; - } - case 'packed_transaction': { - trx = PackedTrx.create(type, content); - break; - } - default: { - console.warn(`Unknown trx type "${type}"`); - } - } - return new Transaction(status, cpu_usage_us, net_usage_words, trx); - } - - private constructor( - public readonly status: number, - public readonly cpuUsageUs: number, - public readonly netUsageWords: number, - public readonly trx: Trx | PackedTrx | unknown - ) {} -} diff --git a/src/common/blockchain/index.ts b/src/common/blockchain/index.ts deleted file mode 100644 index 93d5885..0000000 --- a/src/common/blockchain/index.ts +++ /dev/null @@ -1,6 +0,0 @@ -export * from './abi'; -export * from './blockchain'; -export * from './blockchain.types'; -export * from './block-reader'; -export * from './contract'; -export * from './contract-reader'; diff --git a/src/common/common.types.ts b/src/common/common.types.ts new file mode 100644 index 0000000..8765b57 --- /dev/null +++ b/src/common/common.types.ts @@ -0,0 +1,6 @@ +import { ConfigVars, UnknownObject } from '@alien-worlds/aw-core'; + +export type DatabaseConfigBuilder = ( + vars: ConfigVars, + ...args: unknown[] +) => UnknownObject; diff --git a/src/common/common.utils.ts b/src/common/common.utils.ts index 493cb86..d49f7a8 100644 --- a/src/common/common.utils.ts +++ b/src/common/common.utils.ts @@ -1,11 +1,9 @@ /** - * Suspends execution of the current process for a given number of milliseconds - * @async - * @param {number} ms - * @returns {Promise} + * Checks if a given contract and action represent the 'setabi' action of the 'eosio' contract. + * + * @param {string} contract The contract name. + * @param {string} action The action name. 
+ * @returns {boolean} Returns `true` if the contract and action match the 'eosio' 'setabi' action; otherwise, returns `false`.
+ */
-export const wait = async (ms: number) => new Promise(resolve => setTimeout(resolve, ms));
-
 export const isSetAbiAction = (contract: string, action: string) =>
   contract === 'eosio' && action === 'setabi';
-
diff --git a/src/common/dependencies.ts b/src/common/dependencies.ts
new file mode 100644
index 0000000..71c81f3
--- /dev/null
+++ b/src/common/dependencies.ts
@@ -0,0 +1,16 @@
+import { Result } from '@alien-worlds/aw-core';
+
+/**
+ * An abstract class representing a process's dependencies.
+ *
+ * @abstract
+ * @class Dependencies
+ */
+export abstract class Dependencies {
+  /**
+   * Initializes and configures the Dependencies instance.
+   * @abstract
+   * @returns {Promise<Result>} A promise that resolves when the initialization is complete.
+   */
+  public abstract initialize(...args: unknown[]): Promise<Result>;
+}
diff --git a/src/common/featured/__tests__/featured.unit.test.ts b/src/common/featured/__tests__/featured.unit.test.ts
deleted file mode 100644
index b220309..0000000
--- a/src/common/featured/__tests__/featured.unit.test.ts
+++ /dev/null
@@ -1,113 +0,0 @@
-/* eslint-disable @typescript-eslint/no-empty-function */
-/* eslint-disable @typescript-eslint/no-explicit-any */
-
-import { FeaturedContractContent } from "../featured";
-
-const externalTraceData = {
-  shipTraceMessageName: ['external_foo_type'],
-  shipActionTraceMessageName: ['external_foo_name'],
-  contract: ['external_foo_contract'],
-  action: ['external_foo_action'],
-  processor: 'external_foo_processor',
-}
-
-export const matchers = {
-  traces: new Map([
-    [
-      'external',
-      async (data) => {
-        const result =
-          data['shipTraceMessageName'].includes('external_foo_type') &&
-          data['contract'].includes('external_foo_contract');
-        return result;
-      }
-    ]
-  ])
-}
-
-const config = {
-  traces: [{
-    shipTraceMessageName: ['foo_type'],
-    shipActionTraceMessageName: ['foo_name'],
contract: ['foo_contract', 'foo_contract_2'], - action: ['foo_action'], - processor: 'foo_processor' - },{ - shipTraceMessageName: ['foo_2_type'], - shipActionTraceMessageName: ['foo_2_name'], - contract: ['*'], - action: ['*'], - processor: 'foo_2_processor' - }, { - matcher: 'external', - processor: 'external_foo_processor' - }], - deltas: [{ - shipDeltaMessageName: ['bar_type'], - name: ['bar_name'], - code: ['bar_code'], - scope: ['bar_scope'], - table: ['bar_table'], - processor: 'bar_processor' - }], -} - -describe('Featured Unit tests', () => { - it('"getProcessor" should return processor path assigned to given label', async () => { - const featured = new FeaturedContractContent(config as any, matchers); - - expect(await (featured).traces.getProcessor('foo_type:foo_name:foo_contract:foo_action')).toEqual('foo_processor') - expect(await (featured).traces.getProcessor('foo_2_type:foo_2_name:foo_2_contract:*')).toEqual('foo_2_processor') - expect(await (featured).deltas.getProcessor('bar_type:bar_name:bar_code:bar_scope:bar_table')).toEqual('bar_processor') - expect(await (featured).deltas.getProcessor('bar_2_type:*')).toEqual('') - }); - - it('"has" should return a bool value depending on whether the given pattern matches or not', async () => { - const featured = new FeaturedContractContent(config as any, matchers); - - expect(await (featured).traces.has({ shipTraceMessageName: ['foo_type'], shipActionTraceMessageName: ['foo_name'], contract: ['foo_contract'], action: ['foo_action'] })).toEqual(true) - expect(await (featured).traces.has({ shipTraceMessageName: ['foo_2_type'], shipActionTraceMessageName: ['foo_2_name'] })).toEqual(true) - expect(await (featured).traces.has({ shipTraceMessageName: ['foo_type_3'], shipActionTraceMessageName: ['*'], contract: ['*'], action: ['foo_action'] })).toEqual(false) - - expect(await (featured).deltas.has({ shipDeltaMessageName: ['bar_type'], name: ['bar_name'], code: ['bar_code'], scope: ['bar_scope'], table: ['bar_table'] 
})).toEqual(true) - expect(await (featured).deltas.has({ shipDeltaMessageName: ['bar_type'], name: ['bar_name'] })).toEqual(true) - expect(await (featured).deltas.has({ shipDeltaMessageName: ['bar_type_3'], name: ['*'] })).toEqual(false) - - expect(await (featured).traces.has({ - shipTraceMessageName: ['external_foo_type'], - shipActionTraceMessageName: ['external_foo_name'], - contract: ['external_foo_contract'], - action: ['external_foo_action'] - })).toEqual(true) - - expect(await (featured).traces.has({ - shipTraceMessageName: ['external_foo_type_UNKNOWN'], - shipActionTraceMessageName: ['external_foo_name_UNKNOWN'], - contract: ['external_foo_contract_UNKNOWN'], - action: ['external_foo_action_UNKNOWN'] - })).toEqual(false) - }); - - it('"get" should return an allocation object when given pattern matches', async () => { - const featured = new FeaturedContractContent(config as any, matchers); - - expect(await (featured).traces.get({ shipTraceMessageName: ['foo_type'], shipActionTraceMessageName: ['foo_name'], contract: ['foo_contract'], action: ['foo_action'] })).toEqual([config.traces[0]]) - expect(await (featured).traces.get({ shipTraceMessageName: ['foo_type_3'], shipActionTraceMessageName: ['*'], contract: ['*'], action: ['foo_action'] })).toEqual([]) - - expect(await (featured).deltas.get({ shipDeltaMessageName: ['bar_type'], name: ['bar_name'], code: ['bar_code'], scope: ['bar_scope'], table: ['bar_table'] })).toEqual(config.deltas) - expect(await (featured).deltas.get({ shipDeltaMessageName: ['bar_type_3'], name: ['*'] })).toEqual([]) - - expect(await (featured).traces.get({ - shipTraceMessageName: ['external_foo_type'], - shipActionTraceMessageName: ['external_foo_name'], - contract: ['external_foo_contract'], - action: ['external_foo_action'] - })).toEqual([externalTraceData]); - - expect(await (featured).traces.get({ - shipTraceMessageName: ['external_foo_type_UNKNOWN'], - shipActionTraceMessageName: ['external_foo_name_UNKNOWN'], - contract: 
['external_foo_contract_UNKNOWN'], - action: ['external_foo_action_UNKNOWN'] - })).toEqual([]) - }); - -}); diff --git a/src/common/featured/__tests__/featured.utils.unit.test.ts b/src/common/featured/__tests__/featured.utils.unit.test.ts index 57320e5..67da7a2 100644 --- a/src/common/featured/__tests__/featured.utils.unit.test.ts +++ b/src/common/featured/__tests__/featured.utils.unit.test.ts @@ -1,15 +1,82 @@ -/* eslint-disable @typescript-eslint/no-empty-function */ -/* eslint-disable @typescript-eslint/no-explicit-any */ +import { FeaturedUtils } from '../featured.utils'; -import { contentOrAll } from "../featured.utils"; +describe('FeaturedUtils', () => { + describe('readFeaturedContracts', () => { + it('should return an empty array for empty data', () => { + const data = {}; + const result = FeaturedUtils.readFeaturedContracts(data); + expect(result).toEqual([]); + }); + it('should return an empty array for non-object data', () => { + const data = null; + const result = FeaturedUtils.readFeaturedContracts(data); + expect(result).toEqual([]); + }); -describe('Featured utils Unit tests', () => { - it('"contentOrAll" should return content when given array is not empty', async () => { - expect(contentOrAll(['foo', 'bar'])).toEqual(['foo', 'bar']); - }); + it('should return an array of unique contracts from the data object', () => { + const data = { + contract: ['ContractA', 'ContractB'], + otherProp: 'some value', + }; + const result = FeaturedUtils.readFeaturedContracts(data); + expect(result).toEqual(['ContractA', 'ContractB']); + }); + + it('should return an array of unique contracts from nested data', () => { + let data = { + prop1: 'value1', + prop2: { + contract: 'ContractC', + prop3: { + contract: ['ContractD', 'ContractE'], + }, + }, + }; + let result = FeaturedUtils.readFeaturedContracts(data); + expect(result).toEqual(['ContractC', 'ContractD', 'ContractE']); + + const nested = { + traces: [ + { + prop1: 'value1', + prop2: { + contract: 'ContractC', + 
prop3: {
+            contract: ['ContractD', 'ContractE'],
+          },
+        },
+      },
+    ],
+    deltas: [
+      {
+        prop1: 'value1',
+        prop2: {
+          contract: 'ContractF',
+          prop3: {
+            contract: ['ContractA', 'ContractB'],
+          },
+        },
+      },
+    ],
+  };
+  result = FeaturedUtils.readFeaturedContracts(nested);
+  expect(result).toEqual([
+    'ContractC',
+    'ContractD',
+    'ContractE',
+    'ContractF',
+    'ContractA',
+    'ContractB',
+  ]);
+});
-  it('"contentOrAll" should return ALL wildcard (*) when given array is empty', async () => {
-    expect(contentOrAll([])).toEqual(['*']);
+    it('should ignore non-string values in the contract property', () => {
+      const data = {
+        contract: [123, 'ContractF', true, null],
+      };
+      const result = FeaturedUtils.readFeaturedContracts(data);
+      expect(result).toEqual(['ContractF']);
+    });
+  });
 });
diff --git a/src/common/featured/featured-contract.ts b/src/common/featured/featured-contract.ts
new file mode 100644
index 0000000..69e97ca
--- /dev/null
+++ b/src/common/featured/featured-contract.ts
@@ -0,0 +1,34 @@
+import { parseToBigInt } from '@alien-worlds/aw-core';
+
+/**
+ * Class representing a FeaturedContract
+ * @class
+ * @public
+ */
+export class FeaturedContract {
+  /**
+   * Creates a new instance of the FeaturedContract
+   * @constructor
+   * @param {string} id - The ID of the contract
+   * @param {bigint} initialBlockNumber - The initial block number of the contract
+   * @param {string} account - The account associated with the contract
+   */
+  constructor(
+    public id: string,
+    public initialBlockNumber: bigint,
+    public account: string
+  ) {}
+
+  /**
+   * Creates a new instance of FeaturedContract with a specified account and initial block number;
+   * the ID will be set to an empty string by default.
+   * @static
+   * @public
+   * @param {string} account - The account associated with the contract
+   * @param {string | number} initialBlockNumber - The initial block number of the contract
+   * @returns {FeaturedContract} A new instance of FeaturedContract
+   */
+  public static create(account: string, initialBlockNumber: string | number) {
+    return new FeaturedContract('', parseToBigInt(initialBlockNumber), account);
+  }
+}
diff --git a/src/common/featured/featured-contracts.ts b/src/common/featured/featured-contracts.ts
new file mode 100644
index 0000000..e5fcd7f
--- /dev/null
+++ b/src/common/featured/featured-contracts.ts
@@ -0,0 +1,81 @@
+import {
+  FindParams,
+  Repository,
+  Result,
+  SmartContractService,
+  UnknownObject,
+  Where,
+} from '@alien-worlds/aw-core';
+import { FeaturedContract } from './featured-contract';
+import { FeaturedUtils } from './featured.utils';
+
+export class FeaturedContracts {
+  protected cache: Map<string, FeaturedContract> = new Map();
+  protected featuredContracts: string[];
+
+  constructor(
+    private repository: Repository<FeaturedContract>,
+    private smartContractService: SmartContractService,
+    criteria: UnknownObject
+  ) {
+    this.featuredContracts = FeaturedUtils.readFeaturedContracts(criteria);
+  }
+
+  /**
+   * Reads multiple contracts and returns the results as an array of FeaturedContract objects.
+   *
+   * @param {string[] | UnknownObject} data - An array of contract accounts, or an object from which they are extracted.
+   * @returns {Promise<Result<FeaturedContract[]>>} A Promise that resolves to an array of FeaturedContract objects.
+   */
+  public async readContracts(
+    data: string[] | UnknownObject
+  ): Promise<Result<FeaturedContract[]>> {
+    const list: FeaturedContract[] = [];
+
+    const contracts = Array.isArray(data)
+      ? data
+      : FeaturedUtils.readFeaturedContracts(data);
+
+    for (const contract of contracts) {
+      if (this.cache.has(contract)) {
+        list.push(this.cache.get(contract));
+      } else {
+        const { content: contracts, failure } = await this.repository.find(
+          FindParams.create({ where: new Where().valueOf('account').isEq(contract) })
+        );
+
+        if (failure) {
+          return Result.withFailure(failure);
+        }
+
+        if (contracts.length > 0) {
+          const featuredContract = contracts[0];
+          this.cache.set(featuredContract.account, featuredContract);
+          list.push(featuredContract);
+        } else {
+          const fetchResult = await this.smartContractService.getStats(contract);
+
+          if (fetchResult.isFailure) {
+            return Result.withFailure(fetchResult.failure);
+          }
+          if (fetchResult.content) {
+            const featuredContract = FeaturedContract.create(
+              fetchResult.content.account_name,
+              fetchResult.content.first_block_num
+            );
+            this.cache.set(featuredContract.account, featuredContract);
+            this.repository.add([featuredContract]);
+            list.push(featuredContract);
+          }
+        }
+      }
+    }
+
+    return Result.withContent(list);
+  }
+
+  public isFeatured(contract: string): boolean {
+    return this.featuredContracts.includes(contract);
+  }
+}
diff --git a/src/common/featured/featured.config.ts b/src/common/featured/featured.config.ts
new file mode 100644
index 0000000..68b0ab2
--- /dev/null
+++ b/src/common/featured/featured.config.ts
@@ -0,0 +1,4 @@
+export type FeaturedConfig = {
+  serviceUrl: string;
+  rpcUrl: string;
+};
diff --git a/src/common/featured/featured.enums.ts b/src/common/featured/featured.enums.ts
deleted file mode 100644
index 4665f63..0000000
--- a/src/common/featured/featured.enums.ts
+++ /dev/null
@@ -1,4 +0,0 @@
-export enum FeaturedContentType {
-  Action = 'action',
-  Delta = 'delta',
-}
diff --git a/src/common/featured/featured.errors.ts b/src/common/featured/featured.errors.ts
index 640374e..965a593 100644
--- a/src/common/featured/featured.errors.ts
+++ b/src/common/featured/featured.errors.ts
@@ -10,8 +10,34
@@ export class MatcherNotFoundError extends Error { } } +export class UndefinedPatternError extends Error { + constructor() { + super(`No pattern assigned to the criteria`); + } +} + +export class PatternMismatchError extends Error { + constructor() { + super( + `The length of the keys on the label does not match the number of keys in the pattern` + ); + } +} + export class UnknownContentTypeError extends Error { constructor(type: string) { super(`Unknown type: ${type}`); } } + +export class MissingCriteriaError extends Error { + constructor(path: string) { + super(`No criteria found at: ${path}`); + } +} + +export class DefaultsMismatchError extends Error { + constructor() { + super(`Defaults keys do not match pattern keys.`); + } +} diff --git a/src/common/featured/featured.ts b/src/common/featured/featured.ts index 239705f..da0b655 100644 --- a/src/common/featured/featured.ts +++ b/src/common/featured/featured.ts @@ -1,58 +1,108 @@ -/* eslint-disable @typescript-eslint/no-unsafe-assignment */ - -/* eslint-disable @typescript-eslint/no-unused-vars */ -import { FeaturedContentType } from './featured.enums'; import { + DefaultsMismatchError, MatcherNotFoundError, PatternMatchError, - UnknownContentTypeError, + PatternMismatchError, } from './featured.errors'; import { - AllocationType, - FeaturedAllocationType, - FeaturedConfig, - FeaturedDelta, - FeaturedDeltaAllocation, - FeaturedMatcher, - FeaturedMatchers, - FeaturedTrace, - FeaturedTraceAllocation, - FeaturedType, + MatchCriteria, + ProcessorMatchCriteria, + ProcessorMatcher, } from './featured.types'; -import { buildFeaturedAllocation } from './featured.utils'; -export abstract class FeaturedContent { - public abstract listContracts(): string[]; - public abstract getProcessor(label: string): Promise; +/** + * A mapper class for processing and matching contracts based on given criteria. + * This class can be extended by other processors that require specific match criteria. 
+ */ +export class Featured<MatchCriteriaType extends MatchCriteria = MatchCriteria> { + /** + * Map of processors matched by a matching function. + */ + protected processorByMatchers: ProcessorMatcher<MatchCriteriaType> = new Map(); + /** + * Array of match criteria. + */ + protected matchCriteria: ProcessorMatchCriteria<MatchCriteriaType>[] = []; + /** + * Set of contracts. + */ + protected contracts: Set<string> = new Set(); + + /** + * Creates a new instance of the contract processor mapper. + * @param {ProcessorMatchCriteria<MatchCriteriaType>[]} criteria - An array of match criteria for the processor. + * @param {MatchCriteriaType} pattern - The criteria pattern. + * @param {Partial<MatchCriteriaType>} defaults - Default criteria key:value pairs. + * @param {ProcessorMatcher<MatchCriteriaType>} matchers - Optional map of matchers. + */ + constructor( + criteria: ProcessorMatchCriteria<MatchCriteriaType>[], + protected pattern: MatchCriteriaType, + protected defaults?: Partial<MatchCriteriaType>, + matchers?: ProcessorMatcher<MatchCriteriaType> + ) { + if (defaults) { + const patternKeys = Object.keys(pattern); + const hasAllDefaults = Object.keys(defaults).every(elem => + patternKeys.includes(elem) + ); + + if (hasAllDefaults === false) { + throw new DefaultsMismatchError(); + } + } + + criteria.forEach(current => { + const { processor, matcher, ...rest } = current; + const { contract, code } = rest as unknown as MatchCriteria; - protected processorsByMatchers: FeaturedMatcher = new Map(); - protected allocations: T[] = []; + if (defaults) { + const defaultsKeys = Object.keys(defaults); - constructor(allocations: T[], matchers?: FeaturedMatcher) { - allocations.forEach(allocation => { - const { processor, matcher, ...rest } = allocation as FeaturedType; + defaultsKeys.forEach(defaultKey => { + if (!current[defaultKey]) { + current[defaultKey] = defaults[defaultKey]; + } + }); + } + + if (Array.isArray(contract)) { + contract.forEach(contract => this.contracts.add(contract)); + } else if (typeof contract === 'string') { + this.contracts.add(contract); + } + + if (Array.isArray(code)) { + code.forEach(contract => this.contracts.add(contract)); + } else if (typeof code === 
'string') { + this.contracts.add(code); + } if (matcher && !matchers?.has(matcher)) { throw new MatcherNotFoundError(matcher); } if (matcher && matchers.has(matcher)) { - this.processorsByMatchers.set(processor, matchers.get(matcher)); + this.processorByMatchers.set(processor, matchers.get(matcher)); } else { - this.validateAllocation(rest); - // - if (this.allocations.indexOf(allocation) === -1) { - this.allocations.push(allocation); + this.validateCriteria(rest as MatchCriteriaType); + + if (this.matchCriteria.indexOf(current) === -1) { + this.matchCriteria.push(current); } } }); } - protected validateAllocation(allocation: FeaturedAllocationType): void { - const keys = Object.keys(allocation); + /** + * Validates the given match criteria. + * @param criteria - The criteria to validate. + */ + protected validateCriteria(criteria: MatchCriteriaType): void { + const keys = Object.keys(criteria); for (const key of keys) { - const values = allocation[key]; + const values = criteria[key]; for (const value of values) { if (/^(\*|[A-Za-z0-9_.]*)$/g.test(value) === false) { throw new PatternMatchError(value, '^(*|[A-Za-z0-9_.]*)$'); @@ -61,16 +111,22 @@ export abstract class FeaturedContent { } } - protected isMatch( - ref: T, - candidate: K + /** + * Determines if a candidate match criteria meets a reference match criteria. + * @param ref - The reference match criteria. + * @param candidate - The candidate match criteria. + * @returns True if a match is found, false otherwise. + */ + protected isMatch( + ref: ProcessorMatchCriteria<MatchCriteriaType>, + candidate: MatchCriteriaType ): boolean { let matchFound = false; const keys = Object.keys(candidate); for (const key of keys) { - const candidateValues: string | string[] = candidate[key]; - const refValues: string[] = ref[key]; + const candidateValues = candidate[key]; + const refValues = ref[key]; if (Array.isArray(refValues)) { const values: string[] = Array.isArray(candidateValues) ? 
candidateValues @@ -88,41 +144,72 @@ export abstract class FeaturedContent { return matchFound; } - protected async testMatchers( - allocation: FeaturedAllocationType | AllocationType - ): Promise<T> { - const { processorsByMatchers } = this; - const entries = Array.from(processorsByMatchers.entries()); + /** + * Finds a processor match criteria for a given match criteria. + * @param criteria - The criteria to find a match for. + * @returns The matching processor match criteria if found, null otherwise. + */ + protected async findProcessorMatchCriteria( + criteria: MatchCriteriaType + ): Promise<ProcessorMatchCriteria<MatchCriteriaType>> { + const { processorByMatchers } = this; + const entries = Array.from(processorByMatchers.entries()); for (const entry of entries) { const [processor, matcher] = entry; - if (await matcher(allocation)) { + if (await matcher(criteria)) { + const keys = Object.keys(criteria); + const matchCriteria = {} as MatchCriteriaType; + + for (const key of keys) { + matchCriteria[key] = ['*']; + } + return { - ...buildFeaturedAllocation(allocation), + ...matchCriteria, processor, - } as T; + }; } } return null; } - public async has( - allocation: FeaturedAllocationType | AllocationType - ): Promise<boolean> { - const { allocations, processorsByMatchers } = this; + /** + * Checks if the criteria already exist in the array + * + * @param {ProcessorMatchCriteria<MatchCriteriaType>} criteria + * @param {ProcessorMatchCriteria<MatchCriteriaType>[]} array + * @returns + */ + protected criteriaExistsInArray( + criteria: ProcessorMatchCriteria<MatchCriteriaType>, + array: ProcessorMatchCriteria<MatchCriteriaType>[] + ): boolean { + return array.some(item => + Object.keys(item).every(key => item[key] === criteria[key]) + ); + } - for (const item of allocations) { - if (this.isMatch(item, allocation)) { + /** + * Determines if the given match criteria exists in the processor. + * @param criteria - The criteria to check. + * @returns True if the criteria exists, false otherwise. 
+ */ + public async hasCriteria(criteria: MatchCriteriaType): Promise<boolean> { + const { matchCriteria, processorByMatchers } = this; + + for (const item of matchCriteria) { + if (this.isMatch(item, criteria)) { return true; } } - if (processorsByMatchers.size > 0) { - const featured = await this.testMatchers(allocation); + if (processorByMatchers.size > 0) { + const featured = await this.findProcessorMatchCriteria(criteria); if (featured) { - if (allocations.indexOf(featured) === -1) { - allocations.push(featured); + if (this.criteriaExistsInArray(featured, matchCriteria) === false) { + matchCriteria.push(featured); } return true; } @@ -131,22 +218,26 @@ export abstract class FeaturedContent { return false; } - public async get(allocation: FeaturedAllocationType | AllocationType): Promise<T[]> { - const { allocations, processorsByMatchers } = this; - const result: T[] = []; - - for (const item of allocations) { - if (this.isMatch(item, allocation)) { + /** + * Gets all match criteria that match the given criteria. + * @param criteria - The criteria to match. + * @returns An array of matching criteria. 
+ */ + public async getCriteria(criteria: MatchCriteriaType): Promise<MatchCriteriaType[]> { + const { matchCriteria, processorByMatchers } = this; + const result: MatchCriteriaType[] = []; + + for (const item of matchCriteria) { + if (this.isMatch(item, criteria)) { result.push(item); } } - if (result.length === 0 && processorsByMatchers.size > 0) { - const featured = await this.testMatchers(allocation); - + if (result.length === 0 && processorByMatchers.size > 0) { + const featured = await this.findProcessorMatchCriteria(criteria); if (featured) { - if (allocations.indexOf(featured) === -1) { - allocations.push(featured); + if (this.criteriaExistsInArray(featured, matchCriteria) === false) { + matchCriteria.push(featured); } result.push(featured); } @@ -155,129 +246,50 @@ export abstract class FeaturedContent { return result; } - public toJson(): T[] { - return this.allocations; - } + /** + * Gets the processor for the given label. + * @param label - The label to find a processor for. + * @returns The processor if found, empty string otherwise. 
+ */ + public async getProcessor(label: string): Promise<string> { + const { matchCriteria, pattern } = this; - protected async getProcessorBySchema<SchemaType>( - label: string, - allocationSchema: SchemaType - ): Promise<string> { - const { allocations } = this; - const keys = Object.keys(allocationSchema); + const keys = Object.keys(pattern); const parts = label.split(':').map(part => part.split(',')); - const allocation = parts.reduce((result, part, i) => { + + if (parts.length !== keys.length) { + throw new PatternMismatchError(); + } + + const candidate = parts.reduce((result, part, i) => { result[keys[i]] = part; return result; - }, allocationSchema); + }, pattern); - for (const featured of allocations) { - if (this.isMatch(featured, allocation)) { - return (<FeaturedType>featured).processor; + for (const criteriaRef of matchCriteria) { + if (this.isMatch(criteriaRef, candidate)) { + return criteriaRef.processor; } } - const featured = await this.testMatchers(allocation as FeaturedAllocationType); + const featured = await this.findProcessorMatchCriteria(candidate); if (featured) { - if (allocations.indexOf(featured) === -1) { - allocations.push(featured); + if (matchCriteria.indexOf(featured) === -1) { + matchCriteria.push(featured); } - return (<FeaturedType>featured).processor; + return featured.processor; } return ''; } -} - -export class FeaturedTraces extends FeaturedContent<FeaturedTrace> { - private contracts: Set<string> = new Set(); - constructor(traces: FeaturedTrace[], matchers?: FeaturedMatcher) { - super(traces, matchers); - traces.forEach(trace => { - if (trace.contract) { - trace.contract.forEach(contract => this.contracts.add(contract)); - } - }); - } - - public listContracts(): string[] { - return Array.from(this.contracts); - } - - public async getProcessor(label: string): Promise<string> { - return this.getProcessorBySchema(label, { - shipTraceMessageName: [], - shipActionTraceMessageName: [], - contract: [], - action: [], - }); - } -} -export class FeaturedDeltas extends FeaturedContent<FeaturedDelta> { - private contracts: Set<string> = new 
Set(); - constructor(deltas: FeaturedDelta[], matchers?: FeaturedMatcher) { - super(deltas, matchers); - deltas.forEach(delta => { - if (delta.code) { - delta.code.forEach(contract => this.contracts.add(contract)); - } - }); - } - - public listContracts(): string[] { + /** + * Lists all contracts in the processor. + * @returns An array of contracts. + */ + public getContracts(): string[] { return Array.from(this.contracts); } - - public async getProcessor(label: string): Promise { - return this.getProcessorBySchema(label, { - shipDeltaMessageName: [], - name: [], - code: [], - scope: [], - table: [], - }); - } -} - -export class FeaturedContractContent { - private traces: FeaturedTraces; - private deltas: FeaturedDeltas; - - constructor(config: FeaturedConfig, matchers?: FeaturedMatchers) { - const { traces, deltas } = matchers || {}; - this.traces = new FeaturedTraces(config.traces, traces); - this.deltas = new FeaturedDeltas(config.deltas, deltas); - } - - public getProcessor(type: string, label: string) { - if (type === FeaturedContentType.Action) { - return this.traces.getProcessor(label); - } else if (type === FeaturedContentType.Delta) { - return this.deltas.getProcessor(label); - } else { - throw new UnknownContentTypeError(type); - } - } - - public listContracts(): string[] { - const { traces, deltas } = this; - const list: string[] = []; - [...traces.listContracts(), ...deltas.listContracts()].forEach(contract => { - if (list.includes(contract) === false) { - list.push(contract); - } - }); - - return list; - } - - public toJson() { - const { deltas, traces } = this; - return { - traces: traces.toJson(), - deltas: deltas.toJson(), - }; - } } diff --git a/src/common/featured/featured.types.ts b/src/common/featured/featured.types.ts index a30f3aa..ce965ca 100644 --- a/src/common/featured/featured.types.ts +++ b/src/common/featured/featured.types.ts @@ -1,59 +1,41 @@ -export type AllocationType = { - [key: string]: string; +export type FeaturedContractModel = 
{ + account: string; + initialBlockNumber: bigint; }; -export type FeaturedAllocationType = { - [key: string]: string[]; +export type FetchContractResponse = { + account: string; + block_num: string | number; }; -export type FeaturedType = FeaturedAllocationType & { +export type CriteriaValue = string | string[]; + +export type MatchCriteria = { + [key: string]: CriteriaValue; +}; + +export type ProcessorMatchCriteria<MatchCriteriaType = MatchCriteria> = { matcher?: string; processor: string; -}; +} & MatchCriteriaType; -export type OptionalTraceAllocation = { - shipTraceMessageName?: string; - shipActionTraceMessageName?: string; - contract?: string; - action?: string; -}; +export type ProcessorMatcher<MatchCriteriaType = MatchCriteria> = Map< + string, + MatchFunction<MatchCriteriaType> + >; -export type TraceAllocation = { - shipTraceMessageName: string; - shipActionTraceMessageName: string; - contract: string; - action: string; -}; +export type MatchFunction<MatchCriteriaType = MatchCriteria> = ( + criteria: MatchCriteriaType +) => Promise<boolean>; -export type FeaturedTraceAllocation = { +export type ContractTraceMatchCriteria = MatchCriteria & { shipTraceMessageName: string[]; shipActionTraceMessageName: string[]; contract: string[]; action: string[]; }; -export type FeaturedTrace = FeaturedTraceAllocation & { - matcher?: string; - processor: string; -}; - -export type OptionalDeltaAllocation = { - shipDeltaMessageName?: string; - name?: string; - code?: string; - scope?: string; - table?: string; -}; - -export type DeltaAllocation = { - shipDeltaMessageName: string; - name: string; - code: string; - scope: string; - table: string; -}; - -export type FeaturedDeltaAllocation = { +export type ContractDeltaMatchCriteria = MatchCriteria & { shipDeltaMessageName: string[]; name: string[]; code: string[]; @@ -61,28 +43,7 @@ export type FeaturedDeltaAllocation = { table: string[]; }; -export type FeaturedDelta = FeaturedDeltaAllocation & { - matcher?: string; - processor: string; -}; - -export type FeaturedConfig = { - traces: FeaturedTrace[]; - deltas: FeaturedDelta[]; +export type 
FeaturedContractDataCriteria = { + traces: ProcessorMatchCriteria<ContractTraceMatchCriteria>[]; + deltas: ProcessorMatchCriteria<ContractDeltaMatchCriteria>[]; }; - -export type FeaturedMatchers = { - traces?: FeaturedMatcher; - deltas?: FeaturedMatcher; -}; - -export type PathLink = { - link: string[][]; - path: string; -}; - -export type FeaturedMatcher = Map<string, MatchFunction>; - -export type MatchFunction = ( - data: FeaturedAllocationType | AllocationType -) => Promise<boolean>; diff --git a/src/common/featured/featured.utils.ts b/src/common/featured/featured.utils.ts index 3a332b1..1018326 100644 --- a/src/common/featured/featured.utils.ts +++ b/src/common/featured/featured.utils.ts @@ -1,19 +1,58 @@ -import { AllocationType, FeaturedAllocationType } from './featured.types'; +import fetch from 'node-fetch'; +import { UnknownObject } from '@alien-worlds/aw-core'; +import { FeaturedContractDataCriteria } from './featured.types'; +import { existsSync, readFileSync } from 'fs'; -export const contentOrAll = (content: string[]) => - content?.length > 0 ? content : ['*']; +export class FeaturedUtils { + public static readFeaturedContracts(data: UnknownObject | unknown[]): string[] { + const contracts = new Set<string>(); + if (!data) { + return []; + } + Object.keys(data).forEach(key => { + const value = data[key]; -export const buildFeaturedAllocation = ( - allocation: FeaturedAllocationType | AllocationType -): FeaturedAllocationType => { - const keys = Object.keys(allocation); - const result = {}; + if ((key === 'contract' || key === 'code') && Array.isArray(value)) { + value.forEach(contract => { + if (typeof contract === 'string') { + contracts.add(contract); + } + }); + } else if ((key === 'contract' || key === 'code') && typeof value === 'string') { + contracts.add(value); + } else if (Array.isArray(value) || typeof value === 'object') { + const result = this.readFeaturedContracts(value); + result.forEach(contract => { + contracts.add(contract); + }); + } + }); + return Array.from(contracts); + } - for (const 
key of keys) { - const value = allocation[key]; + public static async fetchCriteria( + filePath: string + ): Promise<FeaturedContractDataCriteria | null> { + const urlRegex = + /^((ftp|http|https):\/\/)?(www\.)?((([a-z\d]([a-z\d-]*[a-z\d])*)\.)+[a-z]{2,}|((\d{1,3}\.){3}\d{1,3}))(:\d+)?(\/[-a-z\d%_.~+]*)*(\?[;&a-z\d%_.~+=-]*)?(#[-a-z\d_]*)?$/i; + const isHttpPath = urlRegex.test(filePath); - result[key] = Array.isArray(value) ? value : [value]; - } + if (isHttpPath) { + const response = await fetch(filePath); + if (!response.ok) { + return null; + } - return result; -}; + return await response.json(); + } else { + if (existsSync(filePath)) { + const fileContent = readFileSync(filePath, 'utf-8'); + return JSON.parse(fileContent); + } + + return null; + } + } +} diff --git a/src/common/featured/index.ts b/src/common/featured/index.ts index a3d3e08..ba6610c 100644 --- a/src/common/featured/index.ts +++ b/src/common/featured/index.ts @@ -1,5 +1,7 @@ +export * from './featured-contract'; +export * from './featured-contracts'; +export * from './featured.config'; +export * from './featured.errors'; export * from './featured'; export * from './featured.types'; export * from './featured.utils'; -export * from './featured.errors'; -export * from './featured.enums'; diff --git a/src/common/index.ts b/src/common/index.ts index 69fd9fd..f21936a 100644 --- a/src/common/index.ts +++ b/src/common/index.ts @@ -1,10 +1,12 @@ export * from './abis'; -export * from '../reader/block-range-scanner'; +export * from './block-range-scanner'; export * from './block-state'; -export * from './blockchain'; +export * from './featured'; +export * from './processor-task-queue'; +export * from './types'; +export * from './unprocessed-block-queue'; export * from './common.enums'; export * from './common.errors'; export * from './common.utils'; -export * from './featured'; -export * from '../processor/processor-task-queue'; -export * from './workers'; +export * from './common.types'; +export * from './dependencies'; diff --git 
a/src/common/processor-task-queue/index.ts b/src/common/processor-task-queue/index.ts new file mode 100644 index 0000000..60a5071 --- /dev/null +++ b/src/common/processor-task-queue/index.ts @@ -0,0 +1,7 @@ +export * from './processor-task-queue'; +export * from './processor-task-queue.config'; +export * from './processor-task.enums'; +export * from './processor-task.errors'; +export * from './processor-task.source'; +export * from './processor-task'; +export * from './processor-task.types'; diff --git a/src/processor/processor-task-queue/processor-task-queue.config.ts b/src/common/processor-task-queue/processor-task-queue.config.ts similarity index 100% rename from src/processor/processor-task-queue/processor-task-queue.config.ts rename to src/common/processor-task-queue/processor-task-queue.config.ts diff --git a/src/common/processor-task-queue/processor-task-queue.ts b/src/common/processor-task-queue/processor-task-queue.ts new file mode 100644 index 0000000..2c39011 --- /dev/null +++ b/src/common/processor-task-queue/processor-task-queue.ts @@ -0,0 +1,61 @@ +import { DataSource, DataSourceError, Mapper, log } from '@alien-worlds/aw-core'; +import { ProcessorTaskSource } from './processor-task.source'; +import { ProcessorTask } from './processor-task'; +import { ProcessorTaskModel } from './processor-task.types'; + +export class ProcessorTaskQueue { + constructor( + protected source: ProcessorTaskSource<ProcessorTaskModel>, + protected mapper: Mapper<ProcessorTask, ProcessorTaskModel>, + protected unsuccessfulSource: DataSource<ProcessorTaskModel>, + protected onlyAdd = false + ) {} + + public async nextTask(mode?: string): Promise<ProcessorTask> { + // TODO: temporary solution - testing session options + if (this.onlyAdd) { + log(`Operation not allowed, queue created with option onlyAdd`); + return null; + } + + try { + const dto = await this.source.nextTask(mode); + if (dto) { + return this.mapper.toEntity(dto); + } + return null; + } catch (error) { + log(`Could not get next task due to: ${error.message}`); + return null; + } + } + + public async 
addTasks(tasks: ProcessorTask[], unsuccessful?: boolean): Promise<void> { + const source = unsuccessful ? this.unsuccessfulSource : this.source; + try { + const dtos = tasks.map(task => this.mapper.fromEntity(task)); + await source.insert(dtos); + } catch (error) { + const { error: concernError } = <DataSourceError>error; + const concernErrorMessage = (<Error>concernError)?.message || ''; + log(`Could not add tasks due to: ${error.message}. ${concernErrorMessage}`); + } + } + + public async stashUnsuccessfulTask( + task: ProcessorTask, + error: { message: string; stack: string } | Error + ): Promise<void> { + try { + const { message, stack } = error; + const document: ProcessorTaskModel = this.mapper.fromEntity( + task + ) as ProcessorTaskModel; + document.error = { message, stack }; + + await this.unsuccessfulSource.insert([document]); + } catch (sourceError) { + log(`Could not stash failed task due to: ${error.message}`); + } + } +} diff --git a/src/common/processor-task-queue/processor-task.enums.ts b/src/common/processor-task-queue/processor-task.enums.ts new file mode 100644 index 0000000..13de22b --- /dev/null +++ b/src/common/processor-task-queue/processor-task.enums.ts @@ -0,0 +1,4 @@ +export enum ProcessorTaskType { + Trace = 'trace', + Delta = 'delta', +} diff --git a/src/common/processor-task-queue/processor-task.errors.ts b/src/common/processor-task-queue/processor-task.errors.ts new file mode 100644 index 0000000..c115c9f --- /dev/null +++ b/src/common/processor-task-queue/processor-task.errors.ts @@ -0,0 +1,5 @@ +export class UnknownProcessorTypeError extends Error { + constructor(type: string) { + super(`Unknown processor type: ${type}`); + } +} diff --git a/src/common/processor-task-queue/processor-task.source.ts b/src/common/processor-task-queue/processor-task.source.ts new file mode 100644 index 0000000..2c7fdda --- /dev/null +++ b/src/common/processor-task-queue/processor-task.source.ts @@ -0,0 +1,5 @@ +import { DataSource } from '@alien-worlds/aw-core'; + +export abstract class 
ProcessorTaskSource<T> extends DataSource<T> { + public abstract nextTask(mode?: string): Promise<T>; +} diff --git a/src/processor/processor-task-queue/processor-task.ts b/src/common/processor-task-queue/processor-task.ts similarity index 52% rename from src/processor/processor-task-queue/processor-task.ts rename to src/common/processor-task-queue/processor-task.ts index 1b21329..1afc3b2 100644 --- a/src/processor/processor-task-queue/processor-task.ts +++ b/src/common/processor-task-queue/processor-task.ts @@ -1,27 +1,16 @@ import crypto from 'crypto'; import { serialize } from 'v8'; -import { - MongoDB, - parseToBigInt, - removeUndefinedProperties, -} from '@alien-worlds/api-core'; -import { ActionTrace, DeltaRow } from '../../common/blockchain/contract'; -import { - DeltaProcessorContentModel, - ProcessorTaskDocument, - ProcessorTaskError, -} from './processor-task.types'; - -export enum ProcessorTaskType { - Action = 'action', - Delta = 'delta', -} +import { DeltaProcessorContentModel, ProcessorTaskError } from './processor-task.types'; +import { ActionTrace } from '../types'; +import { ProcessorTaskType } from './processor-task.enums'; +import { Row } from '@alien-worlds/aw-core'; export class ProcessorTask { public static createActionProcessorTask( abi: string, mode: string, shipTraceMessageName: string, + shipMessageName: string, transactionId: string, actionTrace: ActionTrace, blockNumber: bigint, @@ -29,16 +18,15 @@ export class ProcessorTask { isFork: boolean ) { const { - shipMessageName, act: { account, name, data }, receipt, } = actionTrace; const buffer = serialize({ - transactionId, - actionTrace, - blockNumber, - blockTimestamp, + transaction_id: transactionId, + action_trace: actionTrace, + block_num: blockNumber.toString(), + block_timestamp: blockTimestamp, }); const hashBuffer = serialize({ account, @@ -60,7 +48,7 @@ export class ProcessorTask { shortId, label, null, - ProcessorTaskType.Action, + ProcessorTaskType.Trace, mode, buffer, hash, @@ -73,27 
+61,29 @@ export class ProcessorTask { public static createDeltaProcessorTask( abi: string, mode: string, - shipDeltaMessageName: string, + type: string, name: string, code: string, scope: string, table: string, blockNumber: bigint, blockTimestamp: Date, - row: DeltaRow, + row: Row, isFork: boolean ) { + const { present, data } = row; const content: DeltaProcessorContentModel = { - shipDeltaMessageName, + ship_delta_message_name: type, name, - row, - blockNumber, - blockTimestamp, + present, + data, + block_num: blockNumber, + block_timestamp: blockTimestamp, }; const buffer = serialize(content); const hash = crypto.createHash('sha1').update(buffer).digest('hex'); const shortId = `${code}:${scope}:${table}`; - const label = `${shipDeltaMessageName}:${name}:${shortId}`; + const label = `${type}:${name}:${shortId}`; return new ProcessorTask( null, @@ -111,41 +101,7 @@ export class ProcessorTask { ); } - public static fromDocument(document: ProcessorTaskDocument) { - const { - abi, - short_id, - label, - content, - timestamp, - hash, - type, - mode, - _id, - block_number, - block_timestamp, - error, - is_fork, - } = document; - - return new ProcessorTask( - _id ? 
_id.toString() : '', - abi, - short_id, - label, - timestamp, - type, - mode, - content.buffer, - hash, - parseToBigInt(block_number), - block_timestamp, - is_fork, - error - ); - } - - private constructor( + constructor( public readonly id: string, public readonly abi: string, public readonly shortId: string, @@ -160,43 +116,4 @@ export class ProcessorTask { public readonly isFork: boolean, public readonly error?: ProcessorTaskError ) {} - - public toDocument(): ProcessorTaskDocument { - const { - id, - abi, - shortId, - label, - timestamp, - type, - mode, - content, - hash, - blockNumber, - isFork, - blockTimestamp, - error, - } = this; - - const document: ProcessorTaskDocument = { - abi, - short_id: shortId, - label, - timestamp, - type, - mode, - content: new MongoDB.Binary(content), - hash, - block_number: MongoDB.Long.fromBigInt(blockNumber), - block_timestamp: blockTimestamp, - is_fork: isFork, - error, - }; - - if (id) { - document._id = new MongoDB.ObjectId(id); - } - - return removeUndefinedProperties(document); - } } diff --git a/src/common/processor-task-queue/processor-task.types.ts b/src/common/processor-task-queue/processor-task.types.ts new file mode 100644 index 0000000..b78da2d --- /dev/null +++ b/src/common/processor-task-queue/processor-task.types.ts @@ -0,0 +1,37 @@ +import { ActionTrace } from '../types'; + +export type ProcessorTaskError = { + message: string; + stack: string; +}; + +export type ProcessorTaskModel = { + id: string; + isFork: string; + abi: string; + path: string; + label: string; + timestamp: Date; + type: string; + mode: string; + content: Buffer; + hash: string; + error?: ProcessorTaskError; +}; + +export type DeltaProcessorContentModel = { + ship_delta_message_name: string; + name: string; + block_num: bigint; + block_timestamp: Date; + present: boolean; + data: Uint8Array; +}; + +export type ActionProcessorContentModel = { + ship_trace_message_name: string; + transaction_id: string; + block_num: bigint; + block_timestamp: 
Date; + action_trace: ActionTrace; +}; diff --git a/src/common/ship/index.ts b/src/common/ship/index.ts deleted file mode 100644 index 2f49e19..0000000 --- a/src/common/ship/index.ts +++ /dev/null @@ -1,2 +0,0 @@ -export * from './ship-abis'; -export * from './ship-abi.source'; diff --git a/src/common/ship/ship-abi.source.ts b/src/common/ship/ship-abi.source.ts deleted file mode 100644 index 2d2cbc0..0000000 --- a/src/common/ship/ship-abi.source.ts +++ /dev/null @@ -1,55 +0,0 @@ -/* eslint-disable @typescript-eslint/no-unsafe-assignment */ -/* eslint-disable @typescript-eslint/no-unsafe-call */ -/* eslint-disable @typescript-eslint/no-unsafe-member-access */ -/* eslint-disable @typescript-eslint/no-unsafe-return */ -import { - CollectionMongoSource, - Failure, - MongoDB, - MongoSource, - Result, -} from '@alien-worlds/api-core'; -import { AbiNotFoundError } from '../blockchain/block-reader/block-reader.errors'; -import { Abi } from '../blockchain/abi'; - -export type ShipAbiDocument = { - _id?: MongoDB.ObjectId; - last_modified_timestamp: Date; - version: string; - abi: string; -}; - -export class ShipAbiSource { - private collection: CollectionMongoSource; - - constructor(mongoSource: MongoSource) { - this.collection = new CollectionMongoSource(mongoSource, 'history_tools.ship_abis'); - } - - public async updateAbi(abi: Abi): Promise> { - try { - await this.collection.update({ - version: abi.version, - last_modified_timestamp: new Date(), - abi: abi.toHex(), - }); - - return Result.withoutContent(); - } catch (error) { - return Result.withFailure(Failure.fromError(error)); - } - } - - public async getAbi(version: string): Promise> { - try { - const document = await this.collection.findOne({ filter: { version } }); - - if (document) { - return Result.withContent(Abi.fromHex(document.abi)); - } - return Result.withFailure(Failure.fromError(new AbiNotFoundError())); - } catch (error) { - return Result.withFailure(Failure.fromError(error)); - } - } -} diff --git 
a/src/common/ship/ship-abis.ts b/src/common/ship/ship-abis.ts deleted file mode 100644 index a9f62f3..0000000 --- a/src/common/ship/ship-abis.ts +++ /dev/null @@ -1,37 +0,0 @@ -import { MongoConfig, MongoSource, Result, isMongoConfig } from '@alien-worlds/api-core'; -import { ShipAbiSource } from './ship-abi.source'; -import { Abi } from '../blockchain/abi'; - -export class ShipAbis { - public static async create(mongo: MongoConfig | MongoSource) { - let mongoSource; - - if (isMongoConfig(mongo)) { - mongoSource = await MongoSource.create(mongo); - } else { - mongoSource = mongo; - } - - return new ShipAbis(new ShipAbiSource(mongoSource)); - } - - private cache: Map = new Map(); - - private constructor(private mongo: ShipAbiSource) {} - - public async getAbi(version: string): Promise> { - if (this.cache.has(version)) { - return Result.withContent(this.cache.get(version)); - } - - const result = await this.mongo.getAbi(version); - - if (result.isFailure) { - return result; - } - - this.cache.set(version, result.content); - - return result; - } -} diff --git a/src/common/types/action-trace.types.ts b/src/common/types/action-trace.types.ts new file mode 100644 index 0000000..b4a80be --- /dev/null +++ b/src/common/types/action-trace.types.ts @@ -0,0 +1,45 @@ +export type AuthSequence = { + account: string; + sequence: string; +}; + +export type Receipt = { + receiver: string; + act_digest: string; + global_sequence: string; + recv_sequence: string; + auth_sequence: AuthSequence[]; + code_sequence: number; + abi_sequence: number; +}; + +export type ReceiptByName = [string, Receipt]; + +export type ActAuth = { + actor: string; + permission: string; +}; + +export type Act = { + account: string; + name: string; + authorization: ActAuth; + data: Uint8Array; +}; + +export type ActionTrace = { + ship_message_name?: string; + action_ordinal?: number; + creator_action_ordinal?: number; + receipt?: ReceiptByName; + receiver?: string; + act?: Act; + context_free?: boolean; + 
elapsed?: string; + console?: string; + account_ram_deltas?: unknown[]; + except?: unknown; + error_code?: string | number; +}; + +export type ActionTraceByName = [string, ActionTrace]; diff --git a/src/common/types/block.types.ts b/src/common/types/block.types.ts new file mode 100644 index 0000000..21f8adb --- /dev/null +++ b/src/common/types/block.types.ts @@ -0,0 +1,20 @@ +export type BlockNumberWithIdModel = { + block_num?: unknown; + block_id?: string; +}; + +export type BlockModel< + BlockType = Uint8Array, + TracesType = Uint8Array, + DeltasType = Uint8Array +> = { + head?: BlockNumberWithIdModel; + this_block?: BlockNumberWithIdModel; + last_irreversible?: BlockNumberWithIdModel; + prev_block?: BlockNumberWithIdModel; + block?: BlockType; + traces?: TracesType; + deltas?: DeltasType; + abi_version?: string; + [key: string]: unknown; +}; diff --git a/src/common/types/delta.types.ts b/src/common/types/delta.types.ts new file mode 100644 index 0000000..4c06f9b --- /dev/null +++ b/src/common/types/delta.types.ts @@ -0,0 +1,8 @@ +import { Row } from '@alien-worlds/aw-core'; + +export type Delta = { + name?: string; + rows?: Row[]; +}; + +export type DeltaByName = [string, Delta]; diff --git a/src/common/types/index.ts b/src/common/types/index.ts new file mode 100644 index 0000000..d179343 --- /dev/null +++ b/src/common/types/index.ts @@ -0,0 +1,6 @@ +export * from './action-trace.types'; +export * from './signed-block.types'; +export * from './delta.types'; +export * from './trace.types'; +export * from './block.types'; +export * from './transaction.types'; diff --git a/src/common/blockchain/contract/signed-block/signed-block.dtos.ts b/src/common/types/signed-block.types.ts similarity index 65% rename from src/common/blockchain/contract/signed-block/signed-block.dtos.ts rename to src/common/types/signed-block.types.ts index 1d736f3..acfe49e 100644 --- a/src/common/blockchain/contract/signed-block/signed-block.dtos.ts +++ b/src/common/types/signed-block.types.ts 
@@ -1,6 +1,6 @@ -import { TransactionDto } from '../transaction/transaction.dtos'; +import { Transaction } from './transaction.types'; -export type SignedBlockJson = { +export type SignedBlock = { timestamp: string; producer: string; confirmed: number; @@ -11,5 +11,5 @@ export type SignedBlockJson = { new_producers: unknown; header_extensions: unknown[]; producer_signature: string; - transactions: TransactionDto[]; + transactions: Transaction[]; }; diff --git a/src/common/types/trace.types.ts b/src/common/types/trace.types.ts new file mode 100644 index 0000000..5bb8ddd --- /dev/null +++ b/src/common/types/trace.types.ts @@ -0,0 +1,33 @@ +import { ActionTraceByName } from './action-trace.types'; + +export type Partial = { + expiration: string; + ref_block_num: number; + ref_block_prefix: number; + max_net_usage_words: number; + max_cpu_usage_ms: number; + delay_sec: number; + transaction_extensions: unknown[]; + signatures: unknown[]; + context_free_data: unknown[]; +}; + +export type PartialByType = [string, Partial]; + +export type Trace = { + id?: string; + status?: number; + cpu_usage_us?: number; + net_usage_words?: number; + elapsed?: string; + net_usage?: string; + scheduled?: boolean; + action_traces?: ActionTraceByName[]; + account_ram_delta?: unknown; + except?: unknown; + error_code?: number | string; + failed_dtrx_trace?: unknown; + partial?: PartialByType; +}; + +export type TraceByName = [string, Trace]; diff --git a/src/common/blockchain/contract/transaction/transaction.dtos.ts b/src/common/types/transaction.types.ts similarity index 56% rename from src/common/blockchain/contract/transaction/transaction.dtos.ts rename to src/common/types/transaction.types.ts index adcab21..f3022ef 100644 --- a/src/common/blockchain/contract/transaction/transaction.dtos.ts +++ b/src/common/types/transaction.types.ts @@ -1,14 +1,14 @@ -export type PackedTrxDto = { +export type PackedTrx = { signatures: string[]; compression: number; packed_context_free_data: unknown; 
packed_trx: Uint8Array; }; -export type TrxByNameDto = [string, PackedTrxDto | string]; +export type TrxByName = [string, PackedTrx | string]; -export type TransactionDto = { +export type Transaction = { status: number; cpu_usage_us: number; net_usage_words: number; - trx: TrxByNameDto; + trx: TrxByName; }; diff --git a/src/common/unprocessed-block-queue/index.ts b/src/common/unprocessed-block-queue/index.ts new file mode 100644 index 0000000..3cc98c1 --- /dev/null +++ b/src/common/unprocessed-block-queue/index.ts @@ -0,0 +1,4 @@ +export * from './unprocessed-block-queue'; +export * from './unprocessed-block-queue.source'; +export * from './unprocessed-block-queue.errors'; +export * from './unprocessed-block-queue.types'; diff --git a/src/reader/unprocessed-block-queue/unprocessed-block-queue.errors.ts b/src/common/unprocessed-block-queue/unprocessed-block-queue.errors.ts similarity index 100% rename from src/reader/unprocessed-block-queue/unprocessed-block-queue.errors.ts rename to src/common/unprocessed-block-queue/unprocessed-block-queue.errors.ts diff --git a/src/common/unprocessed-block-queue/unprocessed-block-queue.source.ts b/src/common/unprocessed-block-queue/unprocessed-block-queue.source.ts new file mode 100644 index 0000000..d0bfebe --- /dev/null +++ b/src/common/unprocessed-block-queue/unprocessed-block-queue.source.ts @@ -0,0 +1,6 @@ +import { DataSource } from '@alien-worlds/aw-core'; + +export abstract class UnprocessedBlockSource extends DataSource { + public abstract next(): Promise; + public abstract bytesSize(): Promise; +} diff --git a/src/common/unprocessed-block-queue/unprocessed-block-queue.ts b/src/common/unprocessed-block-queue/unprocessed-block-queue.ts new file mode 100644 index 0000000..e74390e --- /dev/null +++ b/src/common/unprocessed-block-queue/unprocessed-block-queue.ts @@ -0,0 +1,155 @@ +import { + Block, + DataSourceError, + Failure, + log, + Mapper, + parseToBigInt, + Result, +} from '@alien-worlds/aw-core'; +import { + 
BlockNotFoundError, + DuplicateBlocksError, + UnprocessedBlocksOverloadError, +} from './unprocessed-block-queue.errors'; +import { UnprocessedBlockSource } from './unprocessed-block-queue.source'; +import { BlockModel } from '../types/block.types'; + +export abstract class UnprocessedBlockQueueReader { + public abstract next(): Promise<Result<Block>>; +} + +export class UnprocessedBlockQueue + implements UnprocessedBlockQueueReader +{ + protected cache: Block[] = []; + protected overloadHandler: (size: number) => void; + protected beforeSendBatchHandler: () => void; + protected afterSendBatchHandler: () => void; + + constructor( + protected collection: UnprocessedBlockSource, + protected mapper: Mapper<Block, BlockModel>, + protected maxBytesSize: number, + protected batchSize: number, + protected fastLaneBatchSize: number + ) {} + + private async sendBatch() { + const addedBlockNumbers: bigint[] = []; + this.beforeSendBatchHandler(); + const documents = this.cache.map(block => this.mapper.fromEntity(block)); + const result = await this.collection.insert(documents); + result.forEach(model => { + addedBlockNumbers.push(parseToBigInt((model as BlockModel).this_block.block_num)); + }); + this.cache = []; + + if (this.maxBytesSize > 0 && this.overloadHandler) { + // the default Array.sort compares elements as strings; use a numeric comparator for bigints + const sorted = addedBlockNumbers.sort((a, b) => (a < b ? -1 : a > b ? 1 : 0)); + const min = sorted[0]; + const max = sorted[sorted.length - 1]; + + const currentSize = await this.collection.bytesSize(); + if (currentSize >= this.maxBytesSize) { + this.overloadHandler(currentSize); + throw new UnprocessedBlocksOverloadError(min, max); + } + } + + this.afterSendBatchHandler(); + + return addedBlockNumbers; + } + + public async getBytesSize(): Promise<Result<number>> { + try { + const currentSize = await this.collection.bytesSize(); + return Result.withContent(currentSize); + } catch (error) { + return Result.withFailure(Failure.fromError(error)); + } + } + + public async add( + block: Block, + options?: { isFastLane?: boolean; isLast?: boolean; predictedRangeSize?: number } + ): Promise<Result<bigint[]>> { + const {
isFastLane, isLast, predictedRangeSize } = options || {}; + try { + let addedBlockNumbers: bigint[] = []; + const currentBatchSize = isFastLane + ? predictedRangeSize < this.fastLaneBatchSize + ? predictedRangeSize + : this.fastLaneBatchSize + : predictedRangeSize < this.batchSize + ? predictedRangeSize + : this.batchSize; + + if (this.cache.length < currentBatchSize) { + this.cache.push(block); + } + + if (this.cache.length === currentBatchSize || isLast) { + addedBlockNumbers = await this.sendBatch(); + } + + return Result.withContent(addedBlockNumbers); + } catch (error) { + // it is important to clear the cache in case of errors + this.cache = []; + + if (error instanceof DataSourceError && error.isDuplicateError) { + this.afterSendBatchHandler(); + return Result.withFailure(Failure.fromError(new DuplicateBlocksError())); + } + return Result.withFailure(Failure.fromError(error)); + } + } + + public async next(): Promise<Result<Block>> { + try { + const document = await this.collection.next(); + if (document) { + if (this.maxBytesSize > -1 && this.afterSendBatchHandler) { + if ((await this.collection.count()) === 0) { + this.afterSendBatchHandler(); + } + } + + return Result.withContent(this.mapper.toEntity(document)); + } + return Result.withFailure(Failure.fromError(new BlockNotFoundError())); + } catch (error) { + log(`Could not get next task due to: ${error.message}`); + return Result.withFailure(Failure.fromError(error)); + } + } + + public async getMax(): Promise<Result<Block>> { + try { + const documents = await this.collection.aggregate({ + pipeline: [{ $sort: { 'this_block.block_num': -1 } }, { $limit: 1 }], + }); + if (documents.length > 0) { + return Result.withContent(this.mapper.toEntity(documents[0])); + } + return Result.withFailure(Failure.fromError(new BlockNotFoundError())); + } catch (error) { + log(`Could not get block with highest block number due to: ${error.message}`); + return Result.withFailure(Failure.fromError(error)); + } + } + +
public afterSendBatch(handler: () => void): void { + this.afterSendBatchHandler = handler; + } + + public beforeSendBatch(handler: () => void): void { + this.beforeSendBatchHandler = handler; + } + + public onOverload(handler: (size: number) => void): void { + this.overloadHandler = handler; + } +} diff --git a/src/common/unprocessed-block-queue/unprocessed-block-queue.types.ts b/src/common/unprocessed-block-queue/unprocessed-block-queue.types.ts new file mode 100644 index 0000000..4d6c6e8 --- /dev/null +++ b/src/common/unprocessed-block-queue/unprocessed-block-queue.types.ts @@ -0,0 +1,7 @@ +export type UnprocessedBlockQueueConfig = { + maxBytesSize: number; + batchSize: number; + fastLaneBatchSize: number; + sizeCheckInterval?: number; + [key: string]: unknown; +}; diff --git a/src/common/workers/index.ts b/src/common/workers/index.ts deleted file mode 100644 index fee6f93..0000000 --- a/src/common/workers/index.ts +++ /dev/null @@ -1,9 +0,0 @@ -export * from './worker-message'; -export * from './worker-pool'; -export * from './worker'; -export * from './worker.enums'; -export * from './worker.errors'; -export * from './worker.utils'; -export * from './worker.types'; -export * from './worker-container'; -export * from './worker-loader'; diff --git a/src/common/workers/worker-container.ts b/src/common/workers/worker-container.ts deleted file mode 100644 index e6c8148..0000000 --- a/src/common/workers/worker-container.ts +++ /dev/null @@ -1,13 +0,0 @@ -import { WorkerClass } from "./worker.types"; - -export class WorkerContainer { - private bindings: Map = new Map(); - - bind(label: string, workerClass: WorkerClass): void { - this.bindings.set(label, workerClass); - } - - get(label: string): WorkerClass { - return this.bindings.get(label) as WorkerClass; - } -} diff --git a/src/common/workers/worker-loader/index.ts b/src/common/workers/worker-loader/index.ts deleted file mode 100644 index 010f2df..0000000 --- a/src/common/workers/worker-loader/index.ts +++ 
/dev/null @@ -1,3 +0,0 @@ -export * from './worker-loader'; -export * from './worker-loader.types'; -export * from './worker-loader.utils'; diff --git a/src/common/workers/worker-loader/worker-loader-script.ts b/src/common/workers/worker-loader/worker-loader-script.ts deleted file mode 100644 index 910e6ef..0000000 --- a/src/common/workers/worker-loader/worker-loader-script.ts +++ /dev/null @@ -1,50 +0,0 @@ -import async from 'async'; -import { workerData, parentPort } from 'worker_threads'; -import { WorkerMessage, WorkerMessageName } from '../worker-message'; -import { WorkerData } from '../worker.types'; -import { Worker } from '../worker'; -import { getWorkerLoader } from './worker-loader.utils'; -import { WorkerLoader } from './worker-loader'; - -let worker: Worker; -let workerLoader: WorkerLoader; - -export const messageHandler = async (message: WorkerMessage) => { - const { pointer, sharedData, options } = workerData as WorkerData; - if (message.name === WorkerMessageName.Setup) { - // - try { - workerLoader = getWorkerLoader(options?.workerLoaderPath); - await workerLoader.setup(sharedData); - parentPort.postMessage(WorkerMessage.setupComplete(message.workerId)); - } catch (error) { - parentPort.postMessage(WorkerMessage.setupFailure(message.workerId, error)); - } - } else if (message.name === WorkerMessageName.Load) { - // - try { - const { data } = >message; - worker = await workerLoader.load(data || pointer); - parentPort.postMessage(WorkerMessage.loadComplete(message.workerId)); - } catch (error) { - parentPort.postMessage(WorkerMessage.loadFailure(message.workerId, error)); - } - } else if (message.name === WorkerMessageName.Dispose) { - // - try { - worker = null; - parentPort.postMessage(WorkerMessage.disposeComplete(message.workerId)); - } catch (error) { - parentPort.postMessage(WorkerMessage.disposeFailure(message.workerId, error)); - } - } else if (message.name === WorkerMessageName.RunTask) { - // - worker.run(message.data); - } -}; - -const 
queue = async.queue(messageHandler); - -parentPort.on('message', (message: WorkerMessage) => { - queue.push(message); -}); diff --git a/src/common/workers/worker-loader/worker-loader.errors.ts b/src/common/workers/worker-loader/worker-loader.errors.ts deleted file mode 100644 index 253855c..0000000 --- a/src/common/workers/worker-loader/worker-loader.errors.ts +++ /dev/null @@ -1,7 +0,0 @@ -export class UndefinedPointerError extends Error { - constructor() { - super( - `Undefined pointer. The worker loader does not know which Worker class to initialize. If you use only one class, override your worker loader's "load" method so that it returns instance of your worker instead of calling super.load()` - ); - } -} diff --git a/src/common/workers/worker-loader/worker-loader.ts b/src/common/workers/worker-loader/worker-loader.ts deleted file mode 100644 index 35e0133..0000000 --- a/src/common/workers/worker-loader/worker-loader.ts +++ /dev/null @@ -1,53 +0,0 @@ -/* eslint-disable @typescript-eslint/no-var-requires */ -/* eslint-disable @typescript-eslint/no-unused-vars */ -import { existsSync } from 'fs'; -import { Worker } from '../worker'; -import { buildPath } from './worker-loader.utils'; -import { WorkerClass } from '../worker.types'; -import { WorkerConstructorArgs } from './worker-loader.types'; -import { UndefinedPointerError } from './worker-loader.errors'; - -export abstract class WorkerLoader { - public abstract bindings: Map; - - public abstract setup(sharedData: SharedDataType, ...args: unknown[]): Promise; - public abstract load( - pointer: string, - workerConstructorArgs?: WorkerConstructorArgs - ): Promise; -} - -export class DefaultWorkerLoader - implements WorkerLoader -{ - public bindings: Map = new Map(); - protected sharedData: SharedDataType; - - public async setup(sharedData: SharedDataType, ...args: unknown[]): Promise { - this.sharedData = sharedData; - } - public async load( - pointer: string, - workerConstructorArgs?: WorkerConstructorArgs - ): 
Promise { - let WorkerClass; - - if (!pointer) { - throw new UndefinedPointerError(); - } - - const filePath = buildPath(pointer); - if (existsSync(filePath)) { - WorkerClass = require(filePath).default; - } else if (this.bindings.has(pointer)) { - WorkerClass = this.bindings.get(pointer); - } else { - throw new Error( - `A valid path to a worker was not specified or a worker was not assigned to the given name ${pointer}` - ); - } - - const worker = new WorkerClass(workerConstructorArgs) as Worker; - return worker; - } -} diff --git a/src/common/workers/worker-loader/worker-loader.types.ts b/src/common/workers/worker-loader/worker-loader.types.ts deleted file mode 100644 index dbdf3c1..0000000 --- a/src/common/workers/worker-loader/worker-loader.types.ts +++ /dev/null @@ -1,2 +0,0 @@ -export type WorkerLoaderClass = new (...args: never[]) => void; -export type WorkerConstructorArgs = { [key: string]: unknown }; diff --git a/src/common/workers/worker-loader/worker-loader.utils.ts b/src/common/workers/worker-loader/worker-loader.utils.ts deleted file mode 100644 index 5fb79f2..0000000 --- a/src/common/workers/worker-loader/worker-loader.utils.ts +++ /dev/null @@ -1,32 +0,0 @@ -/* eslint-disable @typescript-eslint/no-var-requires */ - -import { existsSync } from 'fs'; -import path from 'path'; -import { InvalidPathError } from '../worker.errors'; -import { DefaultWorkerLoader, WorkerLoader } from './worker-loader'; - -export const buildPath = (filePath: string): string => { - if (filePath.endsWith('.ts')) { - require('ts-node').register(); - return path.resolve(process.cwd(), 'src', `${filePath}`); - } else { - return path.resolve( - process.cwd(), - 'build', - `${filePath}${filePath.endsWith('.js') ? 
'' : '.js'}` - ); - } -}; - -export const getWorkerLoader = (path: string): WorkerLoader => { - if (path) { - const loaderPath = buildPath(path); - if (existsSync(loaderPath) === false) { - throw new InvalidPathError(loaderPath); - } - const WorkerLoaderClass = require(loaderPath).default; - return new WorkerLoaderClass() as WorkerLoader; - } - - return new DefaultWorkerLoader(); -}; diff --git a/src/common/workers/worker-message.ts b/src/common/workers/worker-message.ts deleted file mode 100644 index d00e582..0000000 --- a/src/common/workers/worker-message.ts +++ /dev/null @@ -1,227 +0,0 @@ -export type ErrorJson = { - name?: string; - message?: string; - stack?: string; - [key: string]: unknown; -}; - -export type WorkerMessageContent = { - workerId: number; - type: string; - name: string; - data?: DataType; - error?: ErrorJson; -}; - -export type WorkerMessageHandler = (message: WorkerMessage) => void; - -export class WorkerMessage { - public static create({ - workerId, - type, - name, - data, - error, - }: WorkerMessageContent) { - let errorJson: ErrorJson; - if (error) { - const { message, stack, name: errorName, ...rest } = error; - errorJson = { - message, - stack, - name: errorName, - ...rest, - }; - } - - return new WorkerMessage(workerId, type, name, data, errorJson); - } - - public static setup(workerId: number) { - return new WorkerMessage(workerId, WorkerMessageType.System, WorkerMessageName.Setup); - } - - public static setupComplete(workerId: number) { - return new WorkerMessage( - workerId, - WorkerMessageType.System, - WorkerMessageName.SetupComplete - ); - } - - public static setupFailure(workerId: number, error: Error) { - return new WorkerMessage( - workerId, - WorkerMessageType.System, - WorkerMessageName.SetupFailure, - error, - error - ); - } - - public static load(workerId: number, pointer: string) { - return new WorkerMessage( - workerId, - WorkerMessageType.System, - WorkerMessageName.Load, - pointer - ); - } - - public static 
loadComplete(workerId: number) { - return new WorkerMessage( - workerId, - WorkerMessageType.System, - WorkerMessageName.LoadComplete - ); - } - - public static loadFailure(workerId: number, error: Error) { - return new WorkerMessage( - workerId, - WorkerMessageType.System, - WorkerMessageName.LoadFailure, - error, - error - ); - } - - public static dispose(workerId: number) { - return new WorkerMessage( - workerId, - WorkerMessageType.System, - WorkerMessageName.Dispose - ); - } - - public static disposeComplete(workerId: number) { - return new WorkerMessage( - workerId, - WorkerMessageType.System, - WorkerMessageName.DisposeComplete - ); - } - - public static disposeFailure(workerId: number, error: Error) { - return new WorkerMessage( - workerId, - WorkerMessageType.System, - WorkerMessageName.DisposeComplete, - error, - error - ); - } - - public static runTask(workerId: number, value: DataType) { - return new WorkerMessage( - workerId, - WorkerMessageType.Info, - WorkerMessageName.RunTask, - value - ); - } - - public static use(workerId: number, value: DataType) { - return new WorkerMessage( - workerId, - WorkerMessageType.Info, - WorkerMessageName.PassData, - value - ); - } - - public static taskResolved(workerId: number, value: DataType) { - return new WorkerMessage( - workerId, - WorkerMessageType.Info, - WorkerMessageName.TaskResolved, - value - ); - } - - public static taskRejected(workerId: number, error: Error) { - return new WorkerMessage( - workerId, - WorkerMessageType.Error, - WorkerMessageName.TaskRejected, - null, - error - ); - } - - public static taskProgress(workerId: number, value: DataType) { - return new WorkerMessage( - workerId, - WorkerMessageType.Info, - WorkerMessageName.TaskProgress, - value - ); - } - - private constructor( - public readonly workerId: number, - public readonly type: string, - public readonly name: string, - public readonly data?: DataType, - public readonly error?: ErrorJson - ) {} - - public isTaskResolved(): boolean { 
- return this.name === WorkerMessageName.TaskResolved; - } - - public isTaskRejected(): boolean { - return this.name === WorkerMessageName.TaskRejected; - } - - public isTaskProgress(): boolean { - return this.name === WorkerMessageName.TaskProgress; - } - - public toJson(): object { - const { workerId, type, name, data, error } = this; - let errorJson = {}; - if (error) { - const { message, stack, name: errorName, ...rest } = error; - errorJson = { - message, - stack, - name: errorName, - ...rest, - }; - } - return { - workerId, - type, - name, - data, - error: errorJson, - }; - } -} - -export enum WorkerMessageType { - Error = 'error', - Info = 'info', - Warning = 'warning', - Task = 'task', - System = 'system', -} - -export enum WorkerMessageName { - Setup = 'setup', - SetupComplete = 'setup_complete', - SetupFailure = 'setup_failure', - Load = 'load', - LoadComplete = 'load_complete', - LoadFailure = 'load_failure', - Dispose = 'dispose', - DisposeComplete = 'dispose_complete', - DisposeFailure = 'dispose_failure', - RunTask = 'run_task', - PassData = 'pass_data', - DataPassed = 'data_passed', - TaskResolved = 'task_resolved', - TaskRejected = 'task_rejected', - TaskProgress = 'task_progress', -} diff --git a/src/common/workers/worker-pool.ts b/src/common/workers/worker-pool.ts deleted file mode 100644 index 53cd41b..0000000 --- a/src/common/workers/worker-pool.ts +++ /dev/null @@ -1,114 +0,0 @@ -/* eslint-disable @typescript-eslint/no-unsafe-assignment */ -import { log } from '@alien-worlds/api-core'; -import { WorkerProxy } from './worker-proxy'; -import { WorkerPoolOptions } from './worker.types'; -import { getWorkersCount } from './worker.utils'; - -type WorkerReleaseHandler = (id: number, data?: unknown) => Promise | void; - -export class WorkerPool { - public static async create(options: WorkerPoolOptions) { - const pool = new WorkerPool(); - await pool.setup(options); - return pool; - } - - public workerMaxCount: number; - - private workerLoaderPath: 
string; - private availableWorkers: WorkerProxy[] = []; - private activeWorkersByPid = new Map(); - private sharedData: unknown; - private workerReleaseHandler: WorkerReleaseHandler; - - public async setup(options: WorkerPoolOptions) { - const { - threadsCount, - inviolableThreadsCount, - sharedData, - workerLoaderPath, - } = options; - this.workerLoaderPath = workerLoaderPath; - this.sharedData = sharedData; - this.workerMaxCount = - threadsCount > inviolableThreadsCount - ? getWorkersCount(threadsCount, inviolableThreadsCount) - : threadsCount; - - for (let i = 0; i < this.workerMaxCount; i++) { - const worker = await this.createWorker(); - this.availableWorkers.push(worker); - } - } - - public get workerCount() { - return this.availableWorkers.length + this.activeWorkersByPid.size; - } - - private async createWorker(): Promise { - const { sharedData, workerLoaderPath } = this; - const proxy = new WorkerProxy(sharedData, { workerLoaderPath }); - await proxy.setup(); - return proxy; - } - - public async getWorker(pointer?: string): Promise { - const { activeWorkersByPid, workerMaxCount, availableWorkers } = - this; - - if (activeWorkersByPid.size < workerMaxCount) { - // When workers are to run common or concrete process, - // we use instance from the list (if there is any available) - const worker = availableWorkers.shift(); - activeWorkersByPid.set(worker.id, worker); - await worker.load(pointer); - return worker as WorkerType & WorkerProxy; - } else { - return null; - } - } - - public async releaseWorker(id: number, data?: unknown): Promise { - const { activeWorkersByPid, availableWorkers, workerMaxCount, workerReleaseHandler } = - this; - const worker = activeWorkersByPid.get(id); - - if (worker) { - await worker.dispose(); - this.activeWorkersByPid.delete(id); - if (availableWorkers.length < workerMaxCount) { - availableWorkers.push(worker); - } - if (workerReleaseHandler) { - await workerReleaseHandler(id, data); - } - } else { - log(`No worker with the 
specified ID #${id} was found`); - } - } - - public removeWorkers() { - this.activeWorkersByPid.forEach(worker => worker.remove()); - this.availableWorkers.forEach(worker => worker.remove()); - } - - public hasAvailableWorker(): boolean { - return this.workerMaxCount - this.activeWorkersByPid.size > 0; - } - - public hasActiveWorkers(): boolean { - return this.activeWorkersByPid.size > 0; - } - - public countAvailableWorkers(): number { - return this.workerMaxCount - this.activeWorkersByPid.size; - } - - public countActiveWorkers(): number { - return this.activeWorkersByPid.size; - } - - public onWorkerRelease(handler: WorkerReleaseHandler): void { - this.workerReleaseHandler = handler; - } -} diff --git a/src/common/workers/worker-proxy.ts b/src/common/workers/worker-proxy.ts deleted file mode 100644 index 499905c..0000000 --- a/src/common/workers/worker-proxy.ts +++ /dev/null @@ -1,127 +0,0 @@ -import { log } from '@alien-worlds/api-core'; -import { Worker } from 'worker_threads'; -import { - WorkerMessage, - WorkerMessageContent, - WorkerMessageName, - WorkerMessageType, -} from './worker-message'; -import { WorkerProxyOptions } from './worker.types'; - -export class WorkerProxy { - private _pointer: string; - private worker: Worker; - - constructor(sharedData: unknown, options: WorkerProxyOptions) { - this.worker = new Worker(`${__dirname}/worker-loader/worker-loader-script`, { - workerData: { sharedData, options }, - }); - } - - public get id(): number { - return this.worker.threadId; - } - - public get pointer(): string { - return this._pointer; - } - - public async setup(): Promise { - const { worker } = this; - worker.removeAllListeners(); - return new Promise((resolveSetup, rejectSetup) => { - worker.on('message', (content: WorkerMessageContent) => { - const { type, name, data } = content; - if ( - type === WorkerMessageType.System && - name === WorkerMessageName.SetupComplete - ) { - worker.removeAllListeners(); - resolveSetup(); - } else if ( - type === 
WorkerMessageType.System && - name === WorkerMessageName.SetupFailure - ) { - worker.removeAllListeners(); - rejectSetup(data); - } - }); - worker.postMessage(WorkerMessage.setup(worker.threadId).toJson()); - }); - } - - public async load(pointer: string): Promise { - this._pointer = pointer; - const { worker } = this; - worker.removeAllListeners(); - return new Promise((resolveLoad, rejectLoad) => { - worker.on('message', (content: WorkerMessageContent) => { - const { type, name, data } = content; - if ( - type === WorkerMessageType.System && - name === WorkerMessageName.LoadComplete - ) { - worker.removeAllListeners(); - resolveLoad(); - } else if ( - type === WorkerMessageType.System && - name === WorkerMessageName.LoadFailure - ) { - worker.removeAllListeners(); - rejectLoad(data); - } - }); - worker.postMessage(WorkerMessage.load(worker.threadId, pointer).toJson()); - }); - } - - public async dispose(): Promise { - const { worker } = this; - worker.removeAllListeners(); - return new Promise((resolveDispose, rejectDispose) => { - worker.on('message', (content: WorkerMessageContent) => { - const { type, name, data } = content; - if ( - type === WorkerMessageType.System && - name === WorkerMessageName.DisposeComplete - ) { - worker.removeAllListeners(); - resolveDispose(); - } else if ( - type === WorkerMessageType.System && - name === WorkerMessageName.DisposeFailure - ) { - worker.removeAllListeners(); - rejectDispose(data); - } - }); - worker.postMessage(WorkerMessage.dispose(worker.threadId).toJson()); - }); - } - - public run(data?: DataType): void { - const { worker } = this; - worker.postMessage(WorkerMessage.runTask(worker.threadId, data).toJson()); - } - - public onMessage(handler: (message: WorkerMessage) => Promise) { - this.worker.on('message', (content: WorkerMessageContent) => { - if (content.type !== WorkerMessageType.System) { - handler(WorkerMessage.create(content)).catch(log); - } - }); - } - - public onError(handler: (workerId: number, error: 
Error) => void) { - this.worker.on('error', error => handler(this.worker.threadId, error)); - } - - public onExit(handler: (workerId: number, code: number) => void) { - this.worker.on('exit', code => handler(this.worker.threadId, code)); - } - - public async remove(): Promise { - const code = await this.worker.terminate(); - return code; - } -} diff --git a/src/common/workers/worker.enums.ts b/src/common/workers/worker.enums.ts deleted file mode 100644 index 4e87f41..0000000 --- a/src/common/workers/worker.enums.ts +++ /dev/null @@ -1,4 +0,0 @@ -export enum WorkerStatus { - complete = 'complete', - error = 'complete', -} \ No newline at end of file diff --git a/src/common/workers/worker.errors.ts b/src/common/workers/worker.errors.ts deleted file mode 100644 index b43dcd6..0000000 --- a/src/common/workers/worker.errors.ts +++ /dev/null @@ -1,27 +0,0 @@ -export class MissingWorkerPathError extends Error {} - -export class InvalidPathError extends Error { - constructor(path: string) { - super(`The given path is invalid: ${path}`); - } -} - -export class WorkerPathMismatchError extends Error { - constructor(path: string, globalPath: string) { - super(`You cannot use path (${path}) when global is specified (${globalPath})`); - } -} - -export class WorkerNotFoundError extends Error { - constructor(id: number) { - super(`No worker with the specified ID (${id}) was found`); - } -} - -export class WorkerPoolPathsConflictError extends Error { - constructor() { - super( - `default worker path and "containerPath" cannot be specified at the same time, both options are mutually exclusive.` - ); - } -} diff --git a/src/common/workers/worker.ts b/src/common/workers/worker.ts deleted file mode 100644 index 81a2948..0000000 --- a/src/common/workers/worker.ts +++ /dev/null @@ -1,47 +0,0 @@ -/* eslint-disable @typescript-eslint/no-unsafe-call */ -/* eslint-disable @typescript-eslint/no-unused-vars */ -/* eslint-disable @typescript-eslint/require-await */ -import { parentPort, 
threadId } from 'worker_threads'; -import { WorkerMessage } from './worker-message'; - -export type TaskResolved = 'task_resolved'; -export type TaskRejected = 'task_rejected'; -export type TaskProgress = 'task_progress'; -export type TaskStatus = TaskResolved | TaskRejected | TaskProgress; - -export class Worker { - public get id(): number { - return threadId; - } - - protected sharedData: SharedDataType; - private isRejected = false; - - public run(...args: unknown[]): void { - throw new Error('Method not implemented'); - } - - public deserialize(data: unknown): unknown { - throw new Error('Method not implemented'); - } - - public resolve(data?: DataType): TaskResolved { - if (this.isRejected === false) { - parentPort.postMessage( - WorkerMessage.taskResolved(threadId, data).toJson() - ); - return 'task_resolved'; - } - } - - public reject(error?: Error): TaskRejected { - parentPort.postMessage(WorkerMessage.taskRejected(threadId, error).toJson()); - this.isRejected = true; - return 'task_rejected'; - } - - public progress(data?: DataType): TaskProgress { - parentPort.postMessage(WorkerMessage.taskProgress(threadId, data).toJson()); - return 'task_progress'; - } -} diff --git a/src/common/workers/worker.types.ts b/src/common/workers/worker.types.ts deleted file mode 100644 index adb88a1..0000000 --- a/src/common/workers/worker.types.ts +++ /dev/null @@ -1,25 +0,0 @@ -export type PathsByNames = { - default?: string; - [key: string]: string; -}; - -export type WorkersConfig = { - threadsCount?: number; - inviolableThreadsCount?: number; - sharedData?: SharedDataType; - [key: string]: unknown; -}; - -export type WorkerProxyOptions = { - workerLoaderPath?: string; -}; - -export type WorkerPoolOptions = WorkersConfig & WorkerProxyOptions; - -export type WorkerData = { - pointer: string; - sharedData?: unknown; - options?: WorkerProxyOptions; -}; - -export type WorkerClass = new (...args: unknown[]) => T; diff --git a/src/common/workers/worker.utils.ts 
b/src/common/workers/worker.utils.ts deleted file mode 100644 index 7e1f612..0000000 --- a/src/common/workers/worker.utils.ts +++ /dev/null @@ -1,25 +0,0 @@ -/* eslint-disable @typescript-eslint/no-unsafe-member-access */ -import * as os from 'os'; -import { workerData } from 'worker_threads'; -/** - * Get the number of workers from configuration - * or based on the number of available CPU cores. - * The number of CPU cores is reduced by - * a constant specified in the configuration. - * - * @param {Config} config - * @returns {number} - */ -export const getWorkersCount = ( - threadsCount: number, - inviolableThreadsCount = 0 -): number => { - if (threadsCount === 0 || isNaN(threadsCount)) { - const cpus = os.cpus().length; - return cpus - inviolableThreadsCount; - } - - return threadsCount; -}; - -export const getSharedData = () => workerData.sharedData as T; diff --git a/src/config/config.types.ts b/src/config/config.types.ts index f05bf20..8814ec2 100644 --- a/src/config/config.types.ts +++ b/src/config/config.types.ts @@ -1,8 +1,8 @@ import { ApiConfig } from '../api'; import { BootstrapConfig } from '../bootstrap'; -import { ReaderConfig } from '../reader'; import { FilterConfig } from '../filter'; import { ProcessorConfig } from '../processor'; +import { ReaderConfig } from '../reader'; export type HistoryToolsConfig = { api: ApiConfig; @@ -11,4 +11,4 @@ export type HistoryToolsConfig = { filter: FilterConfig; processor: ProcessorConfig; [key: string]: unknown; -}; \ No newline at end of file +}; diff --git a/src/config/index.ts b/src/config/index.ts index cd6f6d7..5bfce4c 100644 --- a/src/config/index.ts +++ b/src/config/index.ts @@ -1,143 +1,145 @@ -import { ProcessorTaskQueueConfig } from '../processor/processor-task-queue/processor-task-queue.config'; -import { - ConfigVars, - buildBroadcastConfig, - buildMongoConfig, - parseToBigInt, -} from '@alien-worlds/api-core'; -import { ApiConfig } from '../api'; -import { BootstrapConfig, BootstrapCommandOptions 
} from '../bootstrap'; -import { ReaderConfig, ReaderCommandOptions } from '../reader'; -import { FilterConfig, FilterCommandOptions } from '../filter'; -import { ProcessorConfig, ProcessorCommandOptions } from '../processor'; +import { buildBroadcastConfig } from '@alien-worlds/aw-broadcast'; +import { BlockReaderConfig, ConfigVars, parseToBigInt, UnknownObject } from '@alien-worlds/aw-core'; +import { WorkersConfig } from '@alien-worlds/aw-workers'; + +import { ApiCommandOptions, ApiConfig } from '../api'; +import { BlockchainConfig, BootstrapCommandOptions, BootstrapConfig } from '../bootstrap'; +import { AbisConfig, AbisServiceConfig, BlockRangeScanConfig, FeaturedConfig, FeaturedContractDataCriteria, ProcessorTaskQueueConfig, UnprocessedBlockQueueConfig } from '../common'; +import { FilterCommandOptions, FilterConfig } from '../filter'; +import { ProcessorCommandOptions, ProcessorConfig } from '../processor'; +import { ReaderCommandOptions, ReaderConfig } from '../reader'; +import { Api } from './../api/api'; import { HistoryToolsConfig } from './config.types'; -import { - AbisConfig, - AbisServiceConfig, - BlockRangeScanConfig, - BlockReaderConfig, - ContractReaderConfig, - FeaturedConfig, - WorkersConfig, -} from '../common'; export * from './config.types'; -export const buildBlockchainConfig = ( - vars: ConfigVars -): { endpoint: string; chainId: string } => ({ +export const buildBlockchainConfig = (vars: ConfigVars): BlockchainConfig => ({ endpoint: vars.getStringEnv('BLOCKCHAIN_ENDPOINT'), chainId: vars.getStringEnv('BLOCKCHAIN_CHAIN_ID'), }); -export const buildContractReaderConfig = (vars: ConfigVars): ContractReaderConfig => ({ - url: vars.getStringEnv('HYPERION_URL'), +export const buildFeaturedConfig = (vars: ConfigVars): FeaturedConfig => ({ + rpcUrl: vars.getStringEnv('BLOCKCHAIN_ENDPOINT'), + serviceUrl: vars.getStringEnv('HYPERION_URL'), }); export const buildBlockRangeScanConfig = ( vars: ConfigVars, scanKey?: string ): BlockRangeScanConfig => 
({
-  maxChunkSize: vars.getNumberEnv('SCANNER_NODES_MAX_CHUNK_SIZE'),
+  maxChunkSize: vars.getNumberEnv('SCANNER_NODES_MAX_CHUNK_SIZE') || 100,
   scanKey: scanKey || vars.getStringEnv('SCANNER_SCAN_KEY'),
 });

 export const buildAbisServiceConfig = (vars: ConfigVars): AbisServiceConfig => ({
   url: vars.getStringEnv('HYPERION_URL'),
-  limit: vars.getNumberEnv('ABIS_SERVICE_LIMIT'),
-  filter: vars.getStringEnv('ABIS_SERVICE_FILTER'),
+  limit: vars.getNumberEnv('ABIS_SERVICE_LIMIT') || 100,
+  filter: vars.getStringEnv('ABIS_SERVICE_FILTER') || 'eosio:setabi',
 });

 export const buildAbisConfig = (
   vars: ConfigVars,
-  featured: FeaturedConfig
+  databaseConfigBuilder: (vars: ConfigVars, ...args: unknown[]) => UnknownObject
 ): AbisConfig => ({
   service: buildAbisServiceConfig(vars),
-  mongo: buildMongoConfig(vars),
-  featured,
+  database: databaseConfigBuilder(vars),
 });

 export const buildBlockReaderConfig = (vars: ConfigVars): BlockReaderConfig => ({
-  mongo: buildMongoConfig(vars),
   endpoints: vars.getArrayEnv('BLOCK_READER_ENDPOINTS'),
-  shouldFetchDeltas: vars.getBooleanEnv('BLOCK_READER_FETCH_DELTAS'),
-  shouldFetchTraces: vars.getBooleanEnv('BLOCK_READER_FETCH_TRACES'),
+  shouldFetchBlock: vars.getBooleanEnv('BLOCK_READER_FETCH_BLOCK') ?? true,
+  shouldFetchDeltas: vars.getBooleanEnv('BLOCK_READER_FETCH_DELTAS') ?? true,
+  shouldFetchTraces: vars.getBooleanEnv('BLOCK_READER_FETCH_TRACES') ?? true,
 });

 export const buildReaderWorkersConfig = (
   vars: ConfigVars,
   threadsCount?: number
 ): WorkersConfig => ({
-  threadsCount: threadsCount || vars.getNumberEnv('READER_MAX_THREADS'),
-  inviolableThreadsCount: vars.getNumberEnv('READER_INVIOLABLE_THREADS_COUNT'),
+  threadsCount: threadsCount || vars.getNumberEnv('READER_MAX_THREADS') || 1,
+  inviolableThreadsCount: vars.getNumberEnv('READER_INVIOLABLE_THREADS_COUNT') || 0,
 });

 export const buildProcessorWorkersConfig = (
   vars: ConfigVars,
   threadsCount?: number
 ): WorkersConfig => ({
-  threadsCount: threadsCount || vars.getNumberEnv('PROCESSOR_MAX_THREADS'),
-  inviolableThreadsCount: vars.getNumberEnv('PROCESSOR_INVIOLABLE_THREADS_COUNT'),
+  threadsCount: threadsCount || vars.getNumberEnv('PROCESSOR_MAX_THREADS') || 1,
+  inviolableThreadsCount: vars.getNumberEnv('PROCESSOR_INVIOLABLE_THREADS_COUNT') || 0,
 });

 export const buildFilterWorkersConfig = (
   vars: ConfigVars,
   options?: FilterCommandOptions
 ): WorkersConfig => ({
-  threadsCount: options?.threads || vars.getNumberEnv('FILTER_MAX_THREADS'),
-  inviolableThreadsCount: vars.getNumberEnv('FILTER_INVIOLABLE_THREADS_COUNT'),
+  threadsCount: options?.threads || vars.getNumberEnv('FILTER_MAX_THREADS') || 1,
+  inviolableThreadsCount: vars.getNumberEnv('FILTER_INVIOLABLE_THREADS_COUNT') || 0,
 });

 export const buildProcessorTaskQueueConfig = (
   vars: ConfigVars
 ): ProcessorTaskQueueConfig => ({
-  interval: vars.getNumberEnv('PROCESSOR_TASK_QUEUE_CHECK_INTERVAL'),
+  interval: vars.getNumberEnv('PROCESSOR_TASK_QUEUE_CHECK_INTERVAL') || 5000,
 });

-export const buildApiConfig = (vars: ConfigVars): ApiConfig => ({
-  port: vars.getNumberEnv('API_PORT'),
-  mongo: buildMongoConfig(vars),
+export const buildUnprocessedBlockQueueConfig = (
+  vars: ConfigVars
+): UnprocessedBlockQueueConfig => ({
+  maxBytesSize: vars.getNumberEnv('UNPROCESSED_BLOCK_QUEUE_MAX_BYTES_SIZE') || 256000000,
+  sizeCheckInterval:
+    vars.getNumberEnv('UNPROCESSED_BLOCK_QUEUE_SIZE_CHECK_INTERVAL') || 2000,
+  batchSize: vars.getNumberEnv('UNPROCESSED_BLOCK_QUEUE_BATCH_SIZE') || 100,
+  fastLaneBatchSize:
+    vars.getNumberEnv('UNPROCESSED_BLOCK_QUEUE_FAST_LANE_BATCH_SIZE') || 1,
+});
+
+export const buildApiConfig = (
+  vars: ConfigVars,
+  databaseConfigBuilder: (vars: ConfigVars, ...args: unknown[]) => UnknownObject,
+  options?: ApiCommandOptions
+): ApiConfig => ({
+  host: vars.getStringEnv('API_HOST') || options?.host || 'localhost',
+  port: vars.getNumberEnv('API_PORT') || options?.port || 8080,
+  database: databaseConfigBuilder(vars),
 });

 export const
buildBootstrapConfig = ( vars: ConfigVars, - featured: FeaturedConfig, + databaseConfigBuilder: (vars: ConfigVars, ...args: unknown[]) => UnknownObject, options?: BootstrapCommandOptions ): BootstrapConfig => ({ - mongo: buildMongoConfig(vars), + database: databaseConfigBuilder(vars), broadcast: buildBroadcastConfig(vars), blockchain: buildBlockchainConfig(vars), - contractReader: buildContractReaderConfig(vars), + featured: buildFeaturedConfig(vars), scanner: buildBlockRangeScanConfig(vars, options?.scanKey), startBlock: options?.startBlock ? parseToBigInt(options?.startBlock) : vars.getStringEnv('START_BLOCK') - ? parseToBigInt(vars.getStringEnv('START_BLOCK')) - : null, + ? parseToBigInt(vars.getStringEnv('START_BLOCK')) + : null, endBlock: options?.endBlock ? parseToBigInt(options?.endBlock) : vars.getStringEnv('END_BLOCK') - ? parseToBigInt(vars.getStringEnv('END_BLOCK')) - : null, - startFromHead: vars.getBooleanEnv('START_FROM_HEAD'), - mode: options?.mode || vars.getStringEnv('MODE'), - featured, + ? 
parseToBigInt(vars.getStringEnv('END_BLOCK')) + : null, + startFromHead: vars.getBooleanEnv('START_FROM_HEAD') || false, + mode: options?.mode || vars.getStringEnv('MODE') || 'default', abis: buildAbisServiceConfig(vars), - maxBlockNumber: vars.getNumberEnv('MAX_BLOCK_NUMBER'), + maxBlockNumber: vars.getNumberEnv('MAX_BLOCK_NUMBER') || 0xffffffff, }); export const buildReaderConfig = ( vars: ConfigVars, + databaseConfigBuilder: (vars: ConfigVars, ...args: unknown[]) => UnknownObject, options?: ReaderCommandOptions ): ReaderConfig => ({ - mongo: buildMongoConfig(vars), + database: databaseConfigBuilder(vars), broadcast: buildBroadcastConfig(vars), scanner: buildBlockRangeScanConfig(vars, options?.scanKey), - mode: options?.mode || vars.getStringEnv('MODE'), - maxBlockNumber: vars.getNumberEnv('MAX_BLOCK_NUMBER'), - blockQueueMaxBytesSize: vars.getNumberEnv('UNPROCESSED_BLOCK_QUEUE_MAX_BYTES_SIZE'), - blockQueueSizeCheckInterval: vars.getNumberEnv('UNPROCESSED_BLOCK_QUEUE_SIZE_CHECK_INTERVAL'), - blockQueueBatchSize: vars.getNumberEnv('UNPROCESSED_BLOCK_QUEUE_BATCH_SIZE'), + mode: options?.mode || vars.getStringEnv('MODE') || 'default', + maxBlockNumber: vars.getNumberEnv('MAX_BLOCK_NUMBER') || 0xffffffff, + unprocessedBlockQueue: buildUnprocessedBlockQueueConfig(vars), workers: buildReaderWorkersConfig(vars, options?.threads), blockReader: buildBlockReaderConfig(vars), startBlock: options?.startBlock ? 
parseToBigInt(options?.startBlock) : null, @@ -146,38 +148,43 @@ export const buildReaderConfig = ( export const buildFilterConfig = ( vars: ConfigVars, - featured: FeaturedConfig, + databaseConfigBuilder: (vars: ConfigVars, ...args: unknown[]) => UnknownObject, options?: FilterCommandOptions ): FilterConfig => ({ - mode: options?.mode || vars.getStringEnv('MODE'), + mode: options?.mode || vars.getStringEnv('MODE') || 'default', broadcast: buildBroadcastConfig(vars), workers: buildFilterWorkersConfig(vars, options), - featured, - abis: buildAbisConfig(vars, featured), - contractReader: buildContractReaderConfig(vars), - mongo: buildMongoConfig(vars), - queue: buildProcessorTaskQueueConfig(vars), + abis: buildAbisConfig(vars, databaseConfigBuilder), + featured: buildFeaturedConfig(vars), + database: databaseConfigBuilder(vars), + processorTaskQueue: buildProcessorTaskQueueConfig(vars), + unprocessedBlockQueue: buildUnprocessedBlockQueueConfig(vars), }); export const buildProcessorConfig = ( vars: ConfigVars, - featured: FeaturedConfig, + databaseConfigBuilder: (vars: ConfigVars, ...args: unknown[]) => UnknownObject, options?: ProcessorCommandOptions ): ProcessorConfig => ({ broadcast: buildBroadcastConfig(vars), workers: buildProcessorWorkersConfig(vars, options?.threads), - featured, - mongo: buildMongoConfig(vars), + featured: buildFeaturedConfig(vars), + database: databaseConfigBuilder(vars), queue: buildProcessorTaskQueueConfig(vars), }); export const buildHistoryToolsConfig = ( vars: ConfigVars, - featured: FeaturedConfig + databaseConfigBuilder: (vars: ConfigVars, ...args: unknown[]) => UnknownObject, + featuredCriteria?: FeaturedContractDataCriteria, + bootstrapOptions?: BootstrapCommandOptions, + readerOptions?: ReaderCommandOptions, + filterOptions?: FilterCommandOptions, + processorOptions?: ProcessorCommandOptions ): HistoryToolsConfig => ({ - api: buildApiConfig(vars), - bootstrap: buildBootstrapConfig(vars, featured), - reader: buildReaderConfig(vars), 
- filter: buildFilterConfig(vars, featured), - processor: buildProcessorConfig(vars, featured), + api: buildApiConfig(vars, databaseConfigBuilder), + bootstrap: buildBootstrapConfig(vars, databaseConfigBuilder, bootstrapOptions), + reader: buildReaderConfig(vars, databaseConfigBuilder, readerOptions), + filter: buildFilterConfig(vars, databaseConfigBuilder, filterOptions), + processor: buildProcessorConfig(vars, databaseConfigBuilder, processorOptions), }); diff --git a/src/filter/deserialized-block.ts b/src/filter/deserialized-block.ts deleted file mode 100644 index b148783..0000000 --- a/src/filter/deserialized-block.ts +++ /dev/null @@ -1,72 +0,0 @@ -import { - Abi, - Delta, - SignedBlock, - SignedBlockJson, - Trace, -} from '../common/blockchain'; -import { TraceJson } from '../common/blockchain/contract/trace'; -import { DeltaJson } from '../common/blockchain/contract/delta'; -import { deserializeMessage } from '../reader'; -import { Block, BlockNumberWithId } from '../common/blockchain/block-reader/block'; - -export class DeserializedBlock { - public static create(block: Block, abi: Abi): DeserializedBlock { - const { head, lastIrreversible, prevBlock, thisBlock } = block; - const types = abi.getTypesMap(); - let traces: Trace[] = []; - let deltas: Delta[] = []; - let signedBlock: SignedBlock; - - if (block.block && block.block.length > 0) { - const deserializedBlock = deserializeMessage( - 'signed_block', - block.block, - types - ); - signedBlock = SignedBlock.create(deserializedBlock); - } - - if (block.traces && block.traces.length > 0) { - const tracesByType = deserializeMessage<[[string, TraceJson]]>( - 'transaction_trace[]', - block.traces, - types - ); - traces = tracesByType.map(([shipMessageName, traceJson]) => - Trace.create(shipMessageName, traceJson) - ); - } - - if (block.deltas && block.deltas.length > 0) { - const deltasByType = deserializeMessage<[[string, DeltaJson]]>( - 'table_delta[]', - block.deltas, - types - ); - deltas = 
deltasByType.map(([shipMessageName, deltaJson]) =>
-        Delta.create(shipMessageName, deltaJson)
-      );
-    }
-
-    return new DeserializedBlock(
-      head,
-      thisBlock,
-      prevBlock,
-      lastIrreversible,
-      signedBlock,
-      traces,
-      deltas
-    );
-  }
-
-  private constructor(
-    public readonly head: BlockNumberWithId,
-    public readonly thisBlock: BlockNumberWithId,
-    public readonly prevBlock: BlockNumberWithId,
-    public readonly lastIrreversible: BlockNumberWithId,
-    public readonly block: SignedBlock,
-    public readonly traces: Trace[],
-    public readonly deltas: Delta[]
-  ) {}
-}
diff --git a/src/filter/filter.command.ts b/src/filter/filter.command.ts
new file mode 100644
index 0000000..c28bbb1
--- /dev/null
+++ b/src/filter/filter.command.ts
@@ -0,0 +1,10 @@
+import { Command } from 'commander';
+
+export const filterCommand = new Command();
+
+filterCommand
+  .version('1.0', '-v, --version')
+  .option('-k, --scan-key <scan-key>', 'Scan key')
+  .option('-m, --mode <mode>', 'Mode (default/replay/test)')
+  .option('-t, --threads <threads>', 'Number of threads')
+  .parse(process.argv);
diff --git a/src/filter/filter.config.ts b/src/filter/filter.config.ts
new file mode 100644
index 0000000..7c346ab
--- /dev/null
+++ b/src/filter/filter.config.ts
@@ -0,0 +1,23 @@
+import { UnknownObject } from '@alien-worlds/aw-core';
+import { BroadcastConfig } from '@alien-worlds/aw-broadcast';
+import { WorkersConfig } from '@alien-worlds/aw-workers';
+import {
+  AbisConfig,
+  FeaturedConfig,
+  ProcessorTaskQueueConfig,
+  UnprocessedBlockQueueConfig,
+} from '../common';
+
+export type FilterConfig<DatabaseConfig = UnknownObject> = {
+  mode: string;
+  broadcast: BroadcastConfig;
+  workers: WorkersConfig;
+  featured: FeaturedConfig;
+  abis: AbisConfig;
+  database: DatabaseConfig;
+  processorTaskQueue: ProcessorTaskQueueConfig;
+  unprocessedBlockQueue: UnprocessedBlockQueueConfig;
+  maxBytesSize?: number;
+  batchSize?: number;
+  [key: string]: unknown;
+};
diff --git a/src/filter/filter.dependencies.ts b/src/filter/filter.dependencies.ts
new file mode
100644
index 0000000..83a983a
--- /dev/null
+++ b/src/filter/filter.dependencies.ts
@@ -0,0 +1,27 @@
+import { Result } from '@alien-worlds/aw-core';
+import { BroadcastClient } from '@alien-worlds/aw-broadcast';
+import { UnprocessedBlockQueue } from '../common/unprocessed-block-queue';
+import { Dependencies } from '../common/dependencies';
+import { FilterConfig } from './filter.config';
+import { DatabaseConfigBuilder } from '../common';
+import { FilterAddons } from './filter.types';
+
+/**
+ * An abstract class representing Filter dependencies.
+ * @class FilterDependencies
+ */
+export abstract class FilterDependencies<
+  UnprocessedBlockModel = unknown
+> extends Dependencies {
+  public broadcastClient: BroadcastClient;
+  public unprocessedBlockQueue: UnprocessedBlockQueue<UnprocessedBlockModel>;
+  public workerLoaderPath?: string;
+  public workerLoaderDependenciesPath: string;
+
+  public databaseConfigBuilder: DatabaseConfigBuilder;
+
+  public abstract initialize(
+    config: FilterConfig,
+    addons?: FilterAddons
+  ): Promise<Result>;
+}
diff --git a/src/filter/filter.runner.ts b/src/filter/filter.runner.ts
index 330d2c6..26eed52 100644
--- a/src/filter/filter.runner.ts
+++ b/src/filter/filter.runner.ts
@@ -1,33 +1,8 @@
-import { log } from '@alien-worlds/api-core';
-import { WorkerMessage, WorkerPool } from '../common/workers';
-import { FilterAddons, FilterConfig } from './filter.types';
-import { filterWorkerLoaderPath } from './filter.consts';
-import { BlockNotFoundError } from '../reader/unprocessed-block-queue/unprocessed-block-queue.errors';
-import { UnprocessedBlockQueue, UnprocessedBlockQueueReader } from '../reader';
-import { BlockJson } from '../common/blockchain/block-reader/block';
+import { BlockJsonModel, log } from '@alien-worlds/aw-core';
+import { WorkerMessage, WorkerPool } from '@alien-worlds/aw-workers';
+import { BlockNotFoundError, UnprocessedBlockQueueReader } from '../common';

 export class FilterRunner {
-  public static async create(config: FilterConfig,
addons: FilterAddons) {
-    const { workers } = config;
-    const { matchers } = addons || {};
-    const blocks = await UnprocessedBlockQueue.create(
-      config.mongo
-    );
-
-    const workerPool = await WorkerPool.create({
-      ...workers,
-      sharedData: { config, matchers },
-      workerLoaderPath: filterWorkerLoaderPath,
-    });
-    const runner = new FilterRunner(workerPool, blocks);
-
-    workerPool.onWorkerRelease(() => runner.next());
-
-    log(` * Worker Pool (max ${workerPool.workerMaxCount} workers) ... [ready]`);
-
-    return runner;
-  }
-
   private interval: NodeJS.Timeout;
   private loop: boolean;
   private transitionHandler: (...args: unknown[]) => void | Promise<void>;
@@ -70,7 +45,7 @@ export class FilterRunner {
         this.loop = false;
       } else {
         const worker = await workerPool.getWorker();
-        worker.onMessage(async (message: WorkerMessage) => {
+        worker.onMessage(async (message: WorkerMessage<BlockJsonModel>) => {
           if (message.isTaskRejected()) {
             log(message.error);
           } else if (message.isTaskResolved() && this.transitionHandler) {
diff --git a/src/filter/filter.types.ts b/src/filter/filter.types.ts
index e49c281..3e99f31 100644
--- a/src/filter/filter.types.ts
+++ b/src/filter/filter.types.ts
@@ -1,12 +1,8 @@
-import { MongoConfig, BroadcastConfig } from '@alien-worlds/api-core';
-import { FeaturedConfig, FeaturedMatchers } from '../common/featured';
-import { ProcessorTaskQueueConfig } from '../processor/processor-task-queue/processor-task-queue.config';
-import { WorkersConfig } from '../common/workers';
-import { AbisConfig } from '../common/abis';
-import { ContractReaderConfig } from '../common/blockchain';
+import { FilterConfig } from './filter.config';

 export type FilterSharedData = {
   config: FilterConfig;
+  featuredCriteriaPath: string;
 };

 export type FilterCommandOptions = {
@@ -14,19 +10,7 @@ export type FilterCommandOptions = {
   mode: string;
 };

-export type FilterConfig = {
-  mode: string;
-  broadcast: BroadcastConfig;
-  workers: WorkersConfig;
-  featured: FeaturedConfig;
-  abis: AbisConfig;
-
contractReader: ContractReaderConfig; - mongo: MongoConfig; - queue: ProcessorTaskQueueConfig; - [key: string]: unknown; -}; - export type FilterAddons = { - matchers?: FeaturedMatchers; + matchers?: unknown; [key: string]: unknown; }; diff --git a/src/filter/filter.utils.ts b/src/filter/filter.utils.ts deleted file mode 100644 index f1a7665..0000000 --- a/src/filter/filter.utils.ts +++ /dev/null @@ -1,44 +0,0 @@ -import { log } from '@alien-worlds/api-core'; -import { Serialize } from 'eosjs'; - -type DeltaAllocation = { - code: string; - scope: string; - table: string; -}; - -export const extractAllocationFromDeltaRow = (value: Uint8Array): DeltaAllocation => { - const sb = new Serialize.SerialBuffer({ - textEncoder: new TextEncoder(), - textDecoder: new TextDecoder(), - array: value, - }); - - try { - sb.get(); // ? - const code = sb.getName(); - const scope = sb.getName(); - const table = sb.getName(); - - return { code, scope, table }; - } catch (error) { - log( - `Unable to extract data, most likely data cannot be deserialized using eosjs.Serialize or the contract does not contain tables. 
${error.message}`
-    );
-    return null;
-  }
-};
-
-export const extractValues = (value: string): Set<string> => {
-  const result = new Set<string>();
-  if (!value || value.includes('*')) {
-    result.add('*');
-  } else {
-    value.split(',').forEach(entry => {
-      if (/^(\*|[A-Za-z0-9_.]*)$/g.test(entry)) {
-        result.add(entry);
-      }
-    });
-  }
-  return result;
-};
diff --git a/src/filter/filter.worker-loader.dependencies.ts b/src/filter/filter.worker-loader.dependencies.ts
new file mode 100644
index 0000000..758ec21
--- /dev/null
+++ b/src/filter/filter.worker-loader.dependencies.ts
@@ -0,0 +1,22 @@
+import { Serializer } from '@alien-worlds/aw-core';
+import { ProcessorTaskQueue } from '../common/processor-task-queue';
+import { WorkerLoaderDependencies } from '@alien-worlds/aw-workers';
+import { FeaturedContracts } from '../common/featured';
+import { Abis } from '../common';
+import { FilterConfig } from './filter.config';
+
+/**
+ * An abstract class representing FilterWorkerLoader dependencies.
+ * @class FilterWorkerLoaderDependencies
+ */
+export abstract class FilterWorkerLoaderDependencies extends WorkerLoaderDependencies {
+  public processorTaskQueue: ProcessorTaskQueue;
+  public abis: Abis;
+  public featuredContracts: FeaturedContracts;
+  public serializer: Serializer;
+
+  public abstract initialize(
+    config: FilterConfig,
+    featuredCriteriaPath: string
+  ): Promise<void>;
+}
diff --git a/src/filter/filter.worker-loader.ts b/src/filter/filter.worker-loader.ts
index cac25ab..091c86c 100644
--- a/src/filter/filter.worker-loader.ts
+++ b/src/filter/filter.worker-loader.ts
@@ -1,62 +1,20 @@
-import { MongoSource } from '@alien-worlds/api-core';
-import { Worker } from '../common/workers';
-import { DefaultWorkerLoader } from '../common/workers/worker-loader';
+import { Worker, DefaultWorkerLoader } from '@alien-worlds/aw-workers';
 import { FilterSharedData } from './filter.types';
-import {
-  FeaturedContractContent,
-  FeaturedDelta,
-  FeaturedTrace,
-} from '../common/featured';
-import { ContractReader } from '../common/blockchain'; -import { Abis } from '../common/abis'; -import { ProcessorTaskQueue } from '../processor/processor-task-queue'; import FilterWorker from './filter.worker'; -import { ShipAbis } from '../common/ship/ship-abis'; - -export default class FilterWorkerLoader extends DefaultWorkerLoader { - private featuredTraces: FeaturedTrace[]; - private featuredDeltas: FeaturedDelta[]; - private contractReader: ContractReader; - private processorTaskQueue: ProcessorTaskQueue; - private abis: Abis; - private shipAbis: ShipAbis; +import { FilterWorkerLoaderDependencies } from './filter.worker-loader.dependencies'; +export default class FilterWorkerLoader extends DefaultWorkerLoader< + FilterSharedData, + FilterWorkerLoaderDependencies +> { public async setup(sharedData: FilterSharedData): Promise { - super.setup(sharedData); - const { - config: { mongo, featured, abis, contractReader, queue }, - } = sharedData; - const { traces, deltas } = new FeaturedContractContent(featured).toJson(); - - const mongoSource = await MongoSource.create(mongo); - this.abis = await Abis.create(mongoSource, abis.service, featured); - this.contractReader = await ContractReader.create(contractReader, mongoSource); - this.processorTaskQueue = await ProcessorTaskQueue.create(mongoSource, true, queue); - this.shipAbis = await ShipAbis.create(mongoSource); - this.featuredDeltas = deltas; - this.featuredTraces = traces; + const { config, featuredCriteriaPath } = sharedData; + await super.setup(sharedData, config, featuredCriteriaPath); } public async load(): Promise { - const { - abis, - shipAbis, - contractReader, - featuredTraces, - featuredDeltas, - processorTaskQueue, - sharedData, - } = this; - return new FilterWorker( - { - shipAbis, - abis, - contractReader, - featuredTraces, - featuredDeltas, - processorTaskQueue, - }, - sharedData - ); + const { dependencies, sharedData } = this; + + return new FilterWorker(dependencies, sharedData); } } diff --git 
a/src/filter/filter.worker.ts b/src/filter/filter.worker.ts index baa8925..bf3beff 100644 --- a/src/filter/filter.worker.ts +++ b/src/filter/filter.worker.ts @@ -1,117 +1,105 @@ -import { log } from '@alien-worlds/api-core'; -import { - FeaturedDelta, - FeaturedDeltas, - FeaturedTrace, - FeaturedTraces, -} from '../common/featured'; -import { Worker } from '../common/workers'; -import { DeserializedBlock } from './deserialized-block'; -import { Abis } from '../common/abis'; -import { ContractReader } from '../common/blockchain'; -import { isSetAbiAction } from '../common/common.utils'; -import { ProcessorTask, ProcessorTaskQueue } from '../processor/processor-task-queue'; +import { Worker } from '@alien-worlds/aw-workers'; import { FilterSharedData } from './filter.types'; -import { extractAllocationFromDeltaRow } from './filter.utils'; -import { Block, BlockJson } from '../common/blockchain/block-reader/block'; -import { ShipAbis } from '../common/ship/ship-abis'; +import { + AbiNotFoundError, + Abis, + BlockModel, + DeltaByName, + FeaturedContracts, + ProcessorTask, + ProcessorTaskQueue, + SignedBlock, + TraceByName, + isSetAbiAction, +} from '../common'; +import { Serializer, log, parseToBigInt } from '@alien-worlds/aw-core'; export default class FilterWorker extends Worker { - protected shipAbis: ShipAbis; - protected abis: Abis; - protected contractReader: ContractReader; - protected processorTaskQueue: ProcessorTaskQueue; - protected featuredTraces: FeaturedTrace[]; - protected featuredDeltas: FeaturedDelta[]; - constructor( - components: { - shipAbis: ShipAbis; + protected dependencies: { abis: Abis; - contractReader: ContractReader; + featuredContracts: FeaturedContracts; processorTaskQueue: ProcessorTaskQueue; - featuredTraces: FeaturedTrace[]; - featuredDeltas: FeaturedDelta[]; + serializer: Serializer; }, - sharedData: FilterSharedData + protected sharedData: FilterSharedData ) { super(); - const { - abis, - contractReader, - featuredTraces, - 
featuredDeltas, - processorTaskQueue, - shipAbis, - } = components; - this.shipAbis = shipAbis; - this.abis = abis; - this.contractReader = contractReader; - this.processorTaskQueue = processorTaskQueue; - this.featuredTraces = featuredTraces; - this.featuredDeltas = featuredDeltas; - this.sharedData = sharedData; } public async createActionProcessorTasks( - deserializedBlock: DeserializedBlock + deserializedBlock: BlockModel ): Promise { const { - featuredTraces, - abis, - contractReader, + dependencies: { abis, featuredContracts }, sharedData: { config }, } = this; const { traces, - thisBlock, + this_block, block: { timestamp }, - prevBlock, + prev_block, } = deserializedBlock; - const featured = new FeaturedTraces(featuredTraces); const list: ProcessorTask[] = []; - for (const trace of traces) { - const { id, actionTraces, shipTraceMessageName } = trace; + for (const [traceType, trace] of traces) { + const { id, action_traces } = trace; - for (const actionTrace of actionTraces) { + for (const [actionType, actionTrace] of action_traces) { const { act: { account, name }, } = actionTrace; - const matchedTraces = await featured.get({ - shipTraceMessageName, - action: name, - contract: account, - }); - - if (matchedTraces.length > 0) { + if (featuredContracts.isFeatured(account)) { try { // If the block in which the contract was created cannot be found or // its index is higher than the current block number, skip it, // the contract did not exist at that time - const initBlockNumber = await contractReader.getInitialBlockNumber(account); - if (initBlockNumber === -1n || initBlockNumber > thisBlock.blockNumber) { + const { content: contracts, failure } = await featuredContracts.readContracts( + [account] + ); + + if (failure) { + log(failure.error); + continue; + } + const contract = contracts[0]; + if ( + contract.initialBlockNumber === -1n || + contract.initialBlockNumber > parseToBigInt(this_block.block_num) + ) { continue; } // get ABI from the database and if it 
does not exist, try to fetch it - const abi = await abis.getAbi(thisBlock.blockNumber, account, true); - if (!abi && isSetAbiAction(account, name) === false) { - log( - `Action-trace {block_number: ${thisBlock.blockNumber}, account: ${account}, name: ${name}}: no ABI was found. This can be a problem in reading the content.` - ); + const { content: abi, failure: getAbiFailure } = await abis.getAbi( + parseToBigInt(this_block.block_num), + account, + true + ); + + if (getAbiFailure) { + if ( + getAbiFailure.error instanceof AbiNotFoundError && + isSetAbiAction(account, name) === false + ) { + log( + `Action-trace {block_number: ${this_block.block_num}, account: ${account}, name: ${name}}: no ABI was found. This can be a problem in reading the content.` + ); + } } + list.push( ProcessorTask.createActionProcessorTask( abi ? abi.hex : '', config.mode, - shipTraceMessageName, + traceType, + actionType, id, actionTrace, - thisBlock.blockNumber, - timestamp, - thisBlock.blockNumber <= prevBlock.blockNumber + parseToBigInt(this_block.block_num), + new Date(timestamp), + parseToBigInt(this_block.block_num) <= parseToBigInt(prev_block.block_num) ) ); } catch (error) { @@ -125,80 +113,84 @@ export default class FilterWorker extends Worker { } public async createDeltaProcessorTasks( - deserializedBlock: DeserializedBlock + deserializedBlock: BlockModel ): Promise { const { - featuredDeltas, - abis, - contractReader, + dependencies: { abis, featuredContracts, serializer }, sharedData: { config }, } = this; const { deltas, - thisBlock, + this_block, block: { timestamp }, - prevBlock, + prev_block, } = deserializedBlock; const list: ProcessorTask[] = []; - const featured = new FeaturedDeltas(featuredDeltas); - for (const delta of deltas) { - const { name, shipDeltaMessageName } = delta; - const allocations = delta.rows.map(row => extractAllocationFromDeltaRow(row.data)); + for (const [type, delta] of deltas) { + const { name, rows } = delta; - for (let i = 0; i < 
delta.rows.length; i++) { - const row = delta.rows[i]; - const allocation = allocations[i]; + for (const row of rows) { + const info = await serializer.deserializeTableRow(row); - if (!allocation) { - // contract allocation cannot be extracted + if (!info) { // The contract may not contain tables or may be corrupted continue; } - - const { code, scope, table } = allocation; - const matchedDeltas = await featured.get({ - shipDeltaMessageName, - name, - code, - scope, - table, - }); - - if (matchedDeltas.length > 0) { + const { table, code, scope } = info; + if (featuredContracts.isFeatured(code)) { try { // If the block in which the contract was created cannot be found or // its index is higher than the current block number, skip it, // the contract did not exist at that time - const initBlockNumber = await contractReader.getInitialBlockNumber(code); - if (initBlockNumber === -1n || initBlockNumber > thisBlock.blockNumber) { + const { content: contracts, failure } = await featuredContracts.readContracts( + [code] + ); + + if (failure) { + log(failure.error); + continue; + } + const contract = contracts[0]; + if ( + contract.initialBlockNumber === -1n || + contract.initialBlockNumber > parseToBigInt(this_block.block_num) + ) { continue; } // get ABI from the database and if it does not exist, try to fetch it - const abi = await abis.getAbi(thisBlock.blockNumber, code, true); - if (!abi) { - log( - `Delta {block_number: ${thisBlock.blockNumber}, code: ${code}, scope: ${scope}, table: ${table}}: no ABI was found. This can be a problem in reading the content.` - ); + const { content: abi, failure: getAbiFailure } = await abis.getAbi( + parseToBigInt(this_block.block_num), + code, + true + ); + + if (getAbiFailure) { + if (getAbiFailure.error instanceof AbiNotFoundError) { + log( + `Delta {block_number: ${this_block.block_num}, code: ${code}, scope: ${scope}, table: ${table}}: no ABI was found. 
This can be a problem in reading the content.` + ); + } } + list.push( ProcessorTask.createDeltaProcessorTask( abi ? abi.hex : '', config.mode, - shipDeltaMessageName, + type, name, code, scope, table, - thisBlock.blockNumber, - timestamp, + parseToBigInt(this_block.block_num), + new Date(timestamp), row, - thisBlock.blockNumber <= prevBlock.blockNumber + parseToBigInt(this_block.block_num) <= parseToBigInt(prev_block.block_num) ) ); } catch (error) { - log(`Delta (table row) not handled`, error); + log(error); } } } @@ -207,34 +199,35 @@ export default class FilterWorker extends Worker { return list; } - public async run(json: BlockJson): Promise { + public async run(json: BlockModel): Promise { try { - const { processorTaskQueue, shipAbis } = this; - const { content: abi, failure } = await shipAbis.getAbi(json.abi_version); - - if (failure) { - log('SHiP Abi not found.'); - this.reject(failure.error); - } + const { + dependencies: { serializer, processorTaskQueue }, + } = this; - const deserializedBlock = DeserializedBlock.create(Block.fromJson(json), abi); + const deserializedBlock = await serializer.deserializeBlock< + BlockModel, + BlockModel + >(json); const { - thisBlock: { blockNumber }, + this_block: { block_num }, } = deserializedBlock; + const [actionProcessorTasks, deltaProcessorTasks] = await Promise.all([ this.createActionProcessorTasks(deserializedBlock), this.createDeltaProcessorTasks(deserializedBlock), ]); + const tasks = [...actionProcessorTasks, ...deltaProcessorTasks]; if (tasks.length > 0) { log( - `Block #${blockNumber} contains ${actionProcessorTasks.length} actions and ${deltaProcessorTasks.length} deltas to process (${tasks.length} tasks in total).` + `Block #${block_num} contains ${actionProcessorTasks.length} actions and ${deltaProcessorTasks.length} deltas to process (${tasks.length} tasks in total).` ); processorTaskQueue.addTasks(tasks); } else { log( - `The block (${blockNumber}) does not contain actions and deltas that could be 
processed.` + `The block (${block_num}) does not contain actions and deltas that could be processed.` ); } diff --git a/src/filter/index.ts b/src/filter/index.ts index d4633ce..e7019e0 100644 --- a/src/filter/index.ts +++ b/src/filter/index.ts @@ -1,7 +1,10 @@ -export * from './deserialized-block'; +export * from './filter.config'; +export * from './filter.command'; +export * from './filter.dependencies'; export * from './filter.consts'; +export * from './filter.runner'; export * from './filter.types'; -export * from './filter.worker-loader'; export * from './filter.worker'; -export * from './filter.runner'; +export * from './filter.worker-loader'; +export * from './filter.worker-loader.dependencies'; export * from './start-filter'; diff --git a/src/filter/start-filter.ts b/src/filter/start-filter.ts index 35d109a..062cafc 100644 --- a/src/filter/start-filter.ts +++ b/src/filter/start-filter.ts @@ -1,46 +1,81 @@ -import { Broadcast, log } from '@alien-worlds/api-core'; -import { FilterAddons, FilterConfig } from './filter.types'; +import { FilterAddons, FilterCommandOptions } from './filter.types'; import { InternalBroadcastChannel, - InternalBroadcastClientName, InternalBroadcastMessageName, ProcessorBroadcastMessage, } from '../broadcast'; -import { InternalBroadcastMessage } from '../broadcast/internal-broadcast.message'; import { FilterRunner } from './filter.runner'; import { FilterBroadcastMessage } from '../broadcast/messages/filter-broadcast.message'; +import { buildFilterConfig } from '../config'; +import { filterCommand } from './filter.command'; +import { filterWorkerLoaderPath } from './filter.consts'; +import { log, ConfigVars } from '@alien-worlds/aw-core'; +import { BroadcastMessage } from '@alien-worlds/aw-broadcast'; +import { WorkerPool } from '@alien-worlds/aw-workers'; +import { FilterConfig } from './filter.config'; +import { FilterDependencies } from './filter.dependencies'; -/** - * - * @param featuredContent - * @param broadcastMessageMapper 
- * @param config - */ -export const startFilter = async (config: FilterConfig, addons?: FilterAddons) => { +export const filter = async ( + config: FilterConfig, + dependencies: FilterDependencies, + featuredCriteriaPath: string, + addons?: FilterAddons +) => { log(`Filter ... [starting]`); - const broadcast = await Broadcast.createClient({ - ...config.broadcast, - clientName: InternalBroadcastClientName.Filter, + const { matchers } = addons || {}; + const initResult = await dependencies.initialize(config, addons); + + if (initResult.isFailure) { + throw initResult.failure.error; + } + + const { + broadcastClient, + unprocessedBlockQueue, + workerLoaderPath, + workerLoaderDependenciesPath, + } = dependencies; + + const workerPool = await WorkerPool.create({ + ...config.workers, + sharedData: { config, matchers, featuredCriteriaPath }, + workerLoaderPath: workerLoaderPath || filterWorkerLoaderPath, + workerLoaderDependenciesPath, }); - const runner = await FilterRunner.create(config, addons); + + const runner = new FilterRunner(workerPool, unprocessedBlockQueue); + runner.onTransition(() => { - broadcast.sendMessage(ProcessorBroadcastMessage.refresh()); + broadcastClient.sendMessage(ProcessorBroadcastMessage.refresh()); }); - - broadcast.onMessage( + workerPool.onWorkerRelease(() => runner.next()); + broadcastClient.onMessage( InternalBroadcastChannel.Filter, - async (message: InternalBroadcastMessage) => { - if (message.content.name === InternalBroadcastMessageName.FilterRefresh) { + async (message: BroadcastMessage) => { + if (message.name === InternalBroadcastMessageName.FilterRefresh) { runner.next(); } } ); - await broadcast.connect(); + await broadcastClient.connect(); // Everything is ready, notify bootstrap that the process is ready to work - broadcast.sendMessage(FilterBroadcastMessage.ready()); + broadcastClient.sendMessage(FilterBroadcastMessage.ready()); // start filter in case the queue already contains blocks runner.next(); log(`Filter ... 
[ready]`); }; + +export const startFilter = ( + args: string[], + dependencies: FilterDependencies, + featuredCriteriaPath: string, + addons?: FilterAddons +) => { + const vars = new ConfigVars(); + const options = filterCommand.parse(args).opts(); + const config = buildFilterConfig(vars, dependencies.databaseConfigBuilder, options); + + filter(config, dependencies, featuredCriteriaPath, addons).catch(log); +}; diff --git a/src/index.ts b/src/index.ts index ad0dabb..5eeaddb 100644 --- a/src/index.ts +++ b/src/index.ts @@ -1,8 +1,12 @@ export * from './api'; export * from './bootstrap'; -export * from './common'; export * from './config'; +export * from './common'; export * from './filter'; export * from './broadcast'; export * from './processor'; export * from './reader'; + +export * from '@alien-worlds/aw-core'; +export * from '@alien-worlds/aw-broadcast'; +export * from '@alien-worlds/aw-workers'; diff --git a/src/processor/index.ts b/src/processor/index.ts index d7bb2a4..36c4f4d 100644 --- a/src/processor/index.ts +++ b/src/processor/index.ts @@ -1,11 +1,11 @@ -export * from './processors/action-trace.processor'; -export * from './processors/action-trace.processor.input'; -export * from './processors/delta.processor'; -export * from './processors/delta.processor.input'; -export * from './processors/processor'; - -export * from './processor.types'; +export * from './processors'; +export * from './processor.config'; +export * from './processor.command'; +export * from './processor.dependencies'; +export * from './processor.consts'; export * from './processor.enum'; -export * from './processor.errors'; +export * from './processor.runner'; +export * from './processor.types'; export * from './processor.worker-loader'; +export * from './processor.worker-loader.dependencies'; export * from './start-processor'; diff --git a/src/processor/processor-task-queue/data-sources/processor-task.source.ts b/src/processor/processor-task-queue/data-sources/processor-task.source.ts 
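The `filter` startup above wires `workerPool.onWorkerRelease(() => runner.next())` so the block queue keeps draining whenever a worker frees up, and the same pattern reappears in `ProcessorRunner` below. A self-contained sketch of that coordination loop — all classes here are illustrative stubs, not the real `@alien-worlds/aw-workers` API:

```typescript
// Sketch of the queue-draining pattern: a bounded worker pool pulls tasks
// from a queue, and every worker release re-triggers `next()` so the queue
// keeps draining until empty. Stub types, not the aw-workers API.
type Task = { id: number };

class StubQueue {
  constructor(private tasks: Task[]) {}
  next(): Task | null {
    return this.tasks.shift() ?? null;
  }
}

class StubWorkerPool {
  private releaseHandler: () => void = () => {};
  private busy = 0;
  constructor(private max: number) {}
  hasAvailableWorker(): boolean {
    return this.busy < this.max;
  }
  onWorkerRelease(handler: () => void): void {
    this.releaseHandler = handler;
  }
  run(task: Task, done: (id: number) => void): void {
    this.busy += 1;
    // Simulate asynchronous work, then release the worker,
    // which notifies the runner to pull the next task.
    setTimeout(() => {
      done(task.id);
      this.busy -= 1;
      this.releaseHandler();
    }, 0);
  }
}

class StubRunner {
  public processed: number[] = [];
  constructor(private pool: StubWorkerPool, private queue: StubQueue) {
    // The key wiring: a released worker means capacity for the next task.
    pool.onWorkerRelease(() => this.next());
  }
  next(): void {
    while (this.pool.hasAvailableWorker()) {
      const task = this.queue.next();
      if (!task) return; // queue drained, wait for new tasks
      this.pool.run(task, id => this.processed.push(id));
    }
  }
}

const runner = new StubRunner(
  new StubWorkerPool(2),
  new StubQueue([{ id: 1 }, { id: 2 }, { id: 3 }])
);
runner.next(); // kick off in case the queue already contains items
```

With two workers and three tasks, the third task only starts once a worker is released — which is exactly why the runner must also be kicked once at startup ("start filter in case the queue already contains blocks").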
deleted file mode 100644 index 51d67fc..0000000 --- a/src/processor/processor-task-queue/data-sources/processor-task.source.ts +++ /dev/null @@ -1,51 +0,0 @@ -import { - CollectionMongoSource, - DataSourceOperationError, - MongoDB, - MongoSource, -} from '@alien-worlds/api-core'; -import { ProcessorTaskQueueConfig } from '../processor-task-queue.config'; -import { ProcessorTaskDocument } from '../processor-task.types'; - -export class ProcessorTaskSource extends CollectionMongoSource { - private transactionOptions: MongoDB.TransactionOptions; - - constructor(mongoSource: MongoSource, private config: ProcessorTaskQueueConfig) { - super(mongoSource, 'history_tools.processor_tasks', { - indexes: [ - { - key: { block_number: 1 }, - background: true, - }, - { - key: { timestamp: 1, block_number: 1 }, - background: true, - }, - { - key: { mode: 1, type: 1 }, - background: true, - }, - { - key: { short_id: 1, mode: 1, type: 1 }, - background: true, - }, - { - key: { short_id: 1, mode: 1, block_number: 1, hash: 1 }, - unique: true, - background: true, - }, - ], - }); - } - - public async nextTask(mode?: string): Promise { - try { - const filter = mode ? 
{ mode } : {}; - const result = await this.collection.findOneAndDelete(filter); - - return result.value; - } catch (error) { - throw DataSourceOperationError.fromError(error); - } - } -} diff --git a/src/processor/processor-task-queue/data-sources/unsuccessful-processor-task.source.ts b/src/processor/processor-task-queue/data-sources/unsuccessful-processor-task.source.ts deleted file mode 100644 index 1497648..0000000 --- a/src/processor/processor-task-queue/data-sources/unsuccessful-processor-task.source.ts +++ /dev/null @@ -1,35 +0,0 @@ -import { - CollectionMongoSource, - MongoSource, -} from '@alien-worlds/api-core'; -import { ProcessorTaskDocument } from '../processor-task.types'; - -export class UnsuccessfulProcessorTaskSource extends CollectionMongoSource { - constructor(mongoSource: MongoSource) { - super(mongoSource, 'history_tools.unsuccessful_processor_tasks', { - indexes: [ - { - key: { block_number: 1 }, - background: true, - }, - { - key: { timestamp: 1, block_number: 1 }, - background: true, - }, - { - key: { mode: 1, type: 1 }, - background: true, - }, - { - key: { short_id: 1, mode: 1, type: 1 }, - background: true, - }, - { - key: { short_id: 1, mode: 1, block_number: 1, hash: 1 }, - unique: true, - background: true, - }, - ], - }); - } -} diff --git a/src/processor/processor-task-queue/index.ts b/src/processor/processor-task-queue/index.ts deleted file mode 100644 index 964e5f6..0000000 --- a/src/processor/processor-task-queue/index.ts +++ /dev/null @@ -1,5 +0,0 @@ -export * from './processor-task-queue'; -export * from './data-sources/processor-task.source'; -export * from './data-sources/unsuccessful-processor-task.source'; -export * from './processor-task'; -export * from './processor-task.types'; diff --git a/src/processor/processor-task-queue/processor-task-queue.ts b/src/processor/processor-task-queue/processor-task-queue.ts deleted file mode 100644 index bc63af7..0000000 --- a/src/processor/processor-task-queue/processor-task-queue.ts +++ 
/dev/null @@ -1,97 +0,0 @@ -import { - DataSourceBulkWriteError, - log, - MongoConfig, - MongoSource, -} from '@alien-worlds/api-core'; -import { ProcessorTaskSource } from './data-sources/processor-task.source'; -import { ProcessorTask } from './processor-task'; -import { UnsuccessfulProcessorTaskSource } from './data-sources/unsuccessful-processor-task.source'; -import { ProcessorTaskQueueConfig } from './processor-task-queue.config'; -import { ErrorJson } from '../../common/workers/worker-message'; - -export class ProcessorTaskQueue { - public static async create( - mongo: MongoSource | MongoConfig, - onlyAdd: boolean, - queueConfig?: ProcessorTaskQueueConfig - ) { - log(` * Processor Queue ... [starting]`); - - let state: ProcessorTaskQueue; - - if (mongo instanceof MongoSource) { - if (!mongo.client) { - throw new Error( - 'ProcessorTaskQueue requires MongoSource to provide a mongo client. Create Mongo Source using "MongoSource.create()"' - ); - } - - state = new ProcessorTaskQueue(mongo, queueConfig, onlyAdd); - } else { - const mongoSource = await MongoSource.create(mongo); - state = new ProcessorTaskQueue(mongoSource, queueConfig, onlyAdd); - } - - log(` * Processor Queue ... 
[ready]`); - return state; - } - - private source: ProcessorTaskSource; - private unsuccessfulSource: UnsuccessfulProcessorTaskSource; - - private constructor( - mongo: MongoSource, - config: ProcessorTaskQueueConfig, - private onlyAdd = false - ) { - this.source = new ProcessorTaskSource(mongo, config); - this.unsuccessfulSource = new UnsuccessfulProcessorTaskSource(mongo); - } - - public async nextTask(mode?: string): Promise { - // TODO: temporary solution - testing session options - if (this.onlyAdd) { - log(`Operation not allowed, queue created with option onlyAdd`); - return; - } - - try { - const dto = await this.source.nextTask(mode); - if (dto) { - return ProcessorTask.fromDocument(dto); - } - return null; - } catch (error) { - log(`Could not get next task due to: ${error.message}`); - return null; - } - } - - public async addTasks(tasks: ProcessorTask[], unsuccessful?: boolean): Promise { - const source = unsuccessful ? this.unsuccessfulSource : this.source; - try { - const dtos = tasks.map(task => task.toDocument()); - await source.insertMany(dtos); - } catch (error) { - const { concernError } = error; - const concernErrorMessage = (concernError)?.message || ''; - log(`Could not add tasks due to: ${error.message}. 
${concernErrorMessage}`); - } - } - - public async stashUnsuccessfulTask( - task: ProcessorTask, - error: ErrorJson | Error - ): Promise { - try { - const { message, stack } = error; - const dto = task.toDocument(); - dto.error = { message, stack }; - - await this.unsuccessfulSource.insert(dto); - } catch (sourceError) { - log(`Could not stash failed task due to: ${error.message}`); - } - } -} diff --git a/src/processor/processor-task-queue/processor-task.types.ts b/src/processor/processor-task-queue/processor-task.types.ts deleted file mode 100644 index c7c187d..0000000 --- a/src/processor/processor-task-queue/processor-task.types.ts +++ /dev/null @@ -1,54 +0,0 @@ -import { MongoDB } from '@alien-worlds/api-core'; -import { ActionTraceModel } from '../../common/blockchain/contract/action-trace'; -import { DeltaRowModel } from '../../common/blockchain/contract/delta'; - -export type ProcessorTaskError = { - message: string; - stack: string; -}; - -export type ProcessorTaskDocument = { - _id?: MongoDB.ObjectId; - abi?: string; - is_fork?: boolean; - short_id?: string; - label?: string; - timestamp?: Date; - type?: string; - mode?: string; - content?: MongoDB.Binary; - hash?: string; - block_number?: MongoDB.Long; - block_timestamp?: Date; - error?: ProcessorTaskError; -}; - -export type ProcessorTaskModel = { - id: string; - isFork: string; - abi: string; - path: string; - label: string; - timestamp: Date; - type: string; - mode: string; - content: Buffer; - hash: string; - error?: ProcessorTaskError; -}; - -export type DeltaProcessorContentModel = { - shipDeltaMessageName: string; - name: string; - blockNumber: bigint; - blockTimestamp: Date; - row: DeltaRowModel; -}; - -export type ActionProcessorContentModel = { - shipTraceMessageName: string; - transactionId: string; - blockNumber: bigint; - blockTimestamp: Date; - actionTrace: ActionTraceModel; -}; diff --git a/src/processor/processor.command.ts b/src/processor/processor.command.ts new file mode 100644 index 
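The `processorCommand` defined just below uses `commander` to read a `-t, --threads <value>` option from the CLI before the config is built. The same idea can be sketched dependency-free with Node's built-in `util.parseArgs` (used here only to keep the example self-contained; the project itself uses commander):

```typescript
import { parseArgs } from 'node:util';

// Stand-in for `processorCommand.parse(args).opts()`: read a
// `--threads` option and fall back to a default when absent.
const { values } = parseArgs({
  args: ['--threads', '4'], // illustrative stand-in for process.argv.slice(2)
  options: {
    threads: { type: 'string', short: 't' },
  },
});

// Commander-style options arrive as strings; coerce before use.
const threads = Number(values.threads ?? 1);
console.log(threads); // 4
```

The parsed options are then passed to `buildProcessorConfig` (analogous to `buildFilterConfig` in `start-filter.ts`), so CLI flags override environment variables for individual settings.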
0000000..55545fa --- /dev/null +++ b/src/processor/processor.command.ts @@ -0,0 +1,8 @@ +import { Command } from 'commander'; + +export const processorCommand = new Command(); + +processorCommand + .version('1.0', '-v, --version') + .option('-t, --threads ', 'Number of threads') + .parse(process.argv); diff --git a/src/processor/processor.config.ts b/src/processor/processor.config.ts new file mode 100644 index 0000000..b2b7567 --- /dev/null +++ b/src/processor/processor.config.ts @@ -0,0 +1,23 @@ +import { BroadcastConfig } from '@alien-worlds/aw-broadcast'; +import { WorkersConfig } from '@alien-worlds/aw-workers'; +import { FeaturedConfig, ProcessorMatcher, ProcessorTaskQueueConfig } from '../common'; +import { UnknownObject } from '@alien-worlds/aw-core'; + +export type ProcessorConfig = { + broadcast: BroadcastConfig; + workers: WorkersConfig; + featured: FeaturedConfig; + database: DatabaseConfig; + queue: ProcessorTaskQueueConfig; + processorLoaderPath?: string; + [key: string]: unknown; +}; + +export type ProcessorAddons = { + matchers?: { + traces?: ProcessorMatcher; + deltas?: ProcessorMatcher; + [key: string]: ProcessorMatcher; + }; + [key: string]: unknown; +}; diff --git a/src/processor/processor.consts.ts b/src/processor/processor.consts.ts index 6eef0c7..1261bfe 100644 --- a/src/processor/processor.consts.ts +++ b/src/processor/processor.consts.ts @@ -1 +1,2 @@ -export const processorWorkerLoaderPath = `${__dirname}/processor.worker-loader`; \ No newline at end of file +export const processorWorkerLoaderPath = `${__dirname}/processor.worker-loader`; +export const processorWorkerLoaderDependenciesPath = `${__dirname}/processor.worker-loader.dependencies`; diff --git a/src/processor/processor.dependencies.ts b/src/processor/processor.dependencies.ts new file mode 100644 index 0000000..a148df6 --- /dev/null +++ b/src/processor/processor.dependencies.ts @@ -0,0 +1,35 @@ +import { Result, Serializer } from '@alien-worlds/aw-core'; +import { Dependencies } 
from '../common/dependencies'; +import { + ContractDeltaMatchCriteria, + ContractTraceMatchCriteria, + Featured, +} from '../common/featured'; +import { ProcessorTaskQueue } from '../common/processor-task-queue'; +import { BroadcastClient } from '@alien-worlds/aw-broadcast'; +import { ProcessorAddons, ProcessorConfig } from './processor.config'; +import { DatabaseConfigBuilder } from '../common'; + +/** + * An abstract class representing a Processor dependencies. + * @class ProcessorDependencies + */ +export abstract class ProcessorDependencies extends Dependencies { + public workerLoaderPath?: string; + public workerLoaderDependenciesPath: string; + public broadcastClient: BroadcastClient; + public serializer: Serializer; + public featuredTraces: Featured; + public featuredDeltas: Featured; + public processorTaskQueue: ProcessorTaskQueue; + public processorsPath: string; + + public databaseConfigBuilder: DatabaseConfigBuilder; + + public abstract initialize( + config: ProcessorConfig, + featuredCriteriaPath: string, + processorsPath: string, + addons?: ProcessorAddons + ): Promise; +} diff --git a/src/processor/processor.errors.ts b/src/processor/processor.errors.ts deleted file mode 100644 index 45f4005..0000000 --- a/src/processor/processor.errors.ts +++ /dev/null @@ -1,5 +0,0 @@ -export class AbiNotFoundProcessorError extends Error { - constructor() { - super(''); - } -} \ No newline at end of file diff --git a/src/processor/processor.model.factory.ts b/src/processor/processor.model.factory.ts new file mode 100644 index 0000000..47f3f83 --- /dev/null +++ b/src/processor/processor.model.factory.ts @@ -0,0 +1,84 @@ +import { Serializer } from '@alien-worlds/aw-core'; +import { + ActionProcessorContentModel, + DeltaProcessorContentModel, + ProcessorTask, + ProcessorTaskType, + UnknownProcessorTypeError, +} from '../common'; +import { ActionTraceProcessorModel, DeltaProcessorModel } from './processor.types'; +import { deserialize } from 'v8'; + +export class 
ProcessorModelFactory { + constructor(protected serializer: Serializer) {} + + protected async buildActionTraceProcessorModel( + model: ProcessorTask + ): Promise> { + const { serializer } = this; + const { abi, content: buffer } = model; + const content: ActionProcessorContentModel = deserialize(buffer); + const { + action_trace: { act, receipt }, + block_num, + block_timestamp, + transaction_id, + } = content; + + const [receiptType, receiptContent] = receipt; + const { global_sequence, recv_sequence } = receiptContent; + const data = await serializer.deserializeActionData( + act.account, + act.name, + act.data, + abi + ); + + return { + block_number: block_num.toString(), + block_timestamp, + transaction_id, + account: act.account, + name: act.name, + recv_sequence, + global_sequence, + data: data as DataType, + }; + } + + protected async buildDeltaProcessorModel( + model: ProcessorTask + ): Promise> { + const { serializer } = this; + const { abi, content: buffer } = model; + const delta: DeltaProcessorContentModel = deserialize(buffer); + const { name, block_num, block_timestamp } = delta; + const row = await serializer.deserializeTableRow(delta, abi); + const { code, scope, table, primary_key, payer, data, present } = row; + + return { + name, + block_number: block_num.toString(), + block_timestamp, + code, + scope, + table, + present, + primary_key, + payer, + data: data as DataType, + }; + } + + public async create( + task: ProcessorTask + ): Promise | DeltaProcessorModel> { + if (task.type === ProcessorTaskType.Trace) { + return this.buildActionTraceProcessorModel(task); + } else if (task.type === ProcessorTaskType.Delta) { + return this.buildDeltaProcessorModel(task); + } else { + throw new UnknownProcessorTypeError(task.type); + } + } +} diff --git a/src/processor/processor.runner.ts b/src/processor/processor.runner.ts index f6ca242..41571d2 100644 --- a/src/processor/processor.runner.ts +++ b/src/processor/processor.runner.ts @@ -1,73 +1,54 @@ -import { 
log } from '@alien-worlds/api-core'; -import { FeaturedContractContent } from '../common/featured'; +import { Serializer, log } from '@alien-worlds/aw-core'; +import { WorkerPool, WorkerMessage } from '@alien-worlds/aw-workers'; import { + Featured, + ContractTraceMatchCriteria, + ContractDeltaMatchCriteria, ProcessorTaskQueue, ProcessorTask, + ProcessorTaskType, + UnknownProcessorTypeError, ProcessorTaskModel, -} from './processor-task-queue'; -import { WorkerMessage, WorkerPool } from '../common/workers'; -import { ProcessorAddons, ProcessorConfig } from './processor.types'; -import { processorWorkerLoaderPath } from './processor.consts'; +} from '../common'; +import { ProcessorModelFactory } from './processor.model.factory'; export class ProcessorRunner { - private static instance: ProcessorRunner; - private static creatorPromise; - - private static async creator(config: ProcessorConfig, addons: ProcessorAddons) { - const { workers } = config; - const { matchers } = addons; - const queue = await ProcessorTaskQueue.create(config.mongo, false, config.queue); - const featuredContent = new FeaturedContractContent(config.featured, matchers); - const workerPool = await WorkerPool.create({ - ...workers, - workerLoaderPath: config.processorLoaderPath || processorWorkerLoaderPath, - }); - const runner = new ProcessorRunner(workerPool, queue, featuredContent); - - workerPool.onWorkerRelease(() => runner.next()); - - log(` * Worker Pool (max ${workerPool.workerMaxCount} workers) ... 
[ready]`); - ProcessorRunner.creatorPromise = null; - ProcessorRunner.instance = runner; - - return runner; - } - - public static async getInstance( - config: ProcessorConfig, - addons: ProcessorAddons - ): Promise { - if (ProcessorRunner.instance) { - return ProcessorRunner.instance; - } - - if (!ProcessorRunner.creatorPromise) { - ProcessorRunner.creatorPromise = ProcessorRunner.creator(config, addons); - } - - return ProcessorRunner.creatorPromise; - } - private interval: NodeJS.Timeout; private loop: boolean; - + private modelFactory: ProcessorModelFactory; constructor( - private workerPool: WorkerPool, - private queue: ProcessorTaskQueue, - private featuredContent: FeaturedContractContent + protected featuredTraces: Featured, + protected featuredDeltas: Featured, + protected workerPool: WorkerPool, + protected queue: ProcessorTaskQueue, + serializer: Serializer ) { + this.modelFactory = new ProcessorModelFactory(serializer); + this.interval = setInterval(async () => { if (this.workerPool.hasActiveWorkers() === false) { log(`All workers are available, checking if there is something to do...`); this.next(); } }, 5000); + + workerPool.onWorkerRelease(() => this.next()); } private async assignTask(task: ProcessorTask) { - const { queue, workerPool, featuredContent } = this; + const { queue, workerPool, featuredTraces, featuredDeltas } = this; + let processorName: string; + + if (task.type === ProcessorTaskType.Delta) { + processorName = await featuredDeltas.getProcessor(task.label); + } else if (task.type === ProcessorTaskType.Trace) { + processorName = await featuredTraces.getProcessor(task.label); + } else { + this.queue.stashUnsuccessfulTask(task, new UnknownProcessorTypeError(task.type)); + log(`Unknown processor task type "${task.label}". Task has been deleted.`); + return; + } - const processorName = await featuredContent.getProcessor(task.type, task.label); // If there is a processor name, it then gets a worker from the worker pool. 
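`ProcessorModelFactory` above rebuilds task payloads with `deserialize` from Node's built-in `v8` module: task `content` is stored as a binary buffer and restored to a structured object before the ABI-aware deserialization step. A minimal round-trip showing why `v8` serialization is used here rather than JSON (the payload shape is illustrative, not the exact on-chain model):

```typescript
import { serialize, deserialize } from 'v8';

// Illustrative task-content shape; the real models carry
// action traces / delta rows with similar field types.
type DeltaContent = {
  name: string;
  block_num: bigint;
  block_timestamp: Date;
  row: Uint8Array;
};

const original: DeltaContent = {
  name: 'contract_row',
  block_num: 238580001n,
  block_timestamp: new Date('2023-01-01T00:00:00Z'),
  row: new Uint8Array([1, 2, 3]),
};

// v8.serialize preserves BigInt, Date and typed arrays,
// all of which JSON.stringify would lose or mangle.
const buffer: Buffer = serialize(original);
const restored: DeltaContent = deserialize(buffer);

console.log(restored.block_num === 238580001n); // true
console.log(restored.block_timestamp instanceof Date); // true
```

This is why the queue can persist tasks as opaque binary blobs and still recover `bigint` block numbers and `Date` timestamps intact on the processor side.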
if (processorName) { const worker = await workerPool.getWorker(processorName); @@ -83,7 +64,7 @@ export class ProcessorRunner { ); workerPool.releaseWorker(message.workerId); } else if (message.isTaskRejected()) { - queue.stashUnsuccessfulTask(task, message.error); + queue.stashUnsuccessfulTask(task, message.error as Error); log(message.error); log( `Worker #${worker.id} has completed (unsuccessfully) work on the task "${task.id}". @@ -100,8 +81,11 @@ export class ProcessorRunner { queue.stashUnsuccessfulTask(task, error); workerPool.releaseWorker(id); }); + + const model = await this.modelFactory.create(task); + // start worker - worker.run(task); + worker.run(model); log(`Worker #${worker.id} has been assigned to process task ${task.id}`); } else { await this.queue.addTasks([task]); diff --git a/src/processor/processor.types.ts b/src/processor/processor.types.ts index a834988..b59d61f 100644 --- a/src/processor/processor.types.ts +++ b/src/processor/processor.types.ts @@ -1,27 +1,34 @@ -import { MongoConfig, BroadcastConfig } from '@alien-worlds/api-core'; -import { FeaturedConfig, FeaturedMatchers } from '../common/featured'; -import { ProcessorTaskQueueConfig } from './processor-task-queue/processor-task-queue.config'; -import { WorkersConfig } from '../common/workers'; +import { ProcessorConfig } from './processor.config'; export type ProcessorCommandOptions = { threads: number; }; -export type ProcessorConfig = { - broadcast: BroadcastConfig; - workers: WorkersConfig; - featured: FeaturedConfig; - mongo: MongoConfig; - queue: ProcessorTaskQueueConfig; - processorLoaderPath?: string; - [key: string]: unknown; +export type ProcessorSharedData = { + config: ProcessorConfig; + processorsPath: string; }; -export type ProcessorAddons = { - matchers?: FeaturedMatchers; - [key: string]: unknown; +export type DeltaProcessorModel = { + name: string; + code: string; + scope: string; + table: string; + payer: string; + present: boolean; + primary_key: string; + 
block_number: string;
+  block_timestamp: Date;
+  data: DataType;
 };

-export type ProcessorSharedData = {
-  config: ProcessorConfig;
+export type ActionTraceProcessorModel<DataType = unknown> = {
+  account: string;
+  name: string;
+  block_timestamp: Date;
+  block_number: string;
+  global_sequence: string;
+  recv_sequence: string;
+  transaction_id: string;
+  data: DataType;
 };
diff --git a/src/processor/processor.worker-loader.dependencies.ts b/src/processor/processor.worker-loader.dependencies.ts
new file mode 100644
index 0000000..bc623cc
--- /dev/null
+++ b/src/processor/processor.worker-loader.dependencies.ts
@@ -0,0 +1,16 @@
+import { WorkerLoaderDependencies } from '@alien-worlds/aw-workers';
+import { ProcessorConfig } from './processor.config';
+
+/**
+ * An abstract class representing ProcessorWorkerLoader dependencies.
+ * @class ProcessorWorkerLoaderDependencies
+ */
+export abstract class ProcessorWorkerLoaderDependencies extends WorkerLoaderDependencies {
+  public dataSource: unknown;
+  public processorsPath: string;
+
+  public abstract initialize(
+    config: ProcessorConfig,
+    processorsPath: string
+  ): Promise<void>;
+}
diff --git a/src/processor/processor.worker-loader.ts b/src/processor/processor.worker-loader.ts
index fae66f1..b60e4fe 100644
--- a/src/processor/processor.worker-loader.ts
+++ b/src/processor/processor.worker-loader.ts
@@ -1,18 +1,34 @@
-import { MongoSource } from '@alien-worlds/api-core';
-import { Worker } from '../common/workers';
-import { DefaultWorkerLoader } from '../common/workers/worker-loader';
+import { Worker, DefaultWorkerLoader, WorkerContainer } from '@alien-worlds/aw-workers';
 import { ProcessorSharedData } from './processor.types';
+import { Container } from '@alien-worlds/aw-core';
+import { ProcessorWorkerLoaderDependencies } from './processor.worker-loader.dependencies';

-export default class ProcessorWorkerLoader extends DefaultWorkerLoader {
-  private mongoSource: MongoSource;
+export default class ProcessorWorkerLoader extends DefaultWorkerLoader<
+  ProcessorSharedData,
+  ProcessorWorkerLoaderDependencies
+> {
+  protected workers: WorkerContainer;
+  protected ioc: Container;

   public async setup(sharedData: ProcessorSharedData): Promise<void> {
-    super.setup(sharedData);
-    this.mongoSource = await MongoSource.create(sharedData.config.mongo);
+    const { config, processorsPath } = sharedData;
+    await super.setup(sharedData, config, processorsPath);
+    this.ioc = new Container();
   }

   public async load(pointer: string): Promise<Worker> {
-    const { mongoSource } = this;
-    return super.load(pointer, { mongoSource });
+    const {
+      dependencies: { dataSource, processorsPath },
+    } = this;
+    const { ioc, sharedData } = this;
+    const processorClasses = await import(processorsPath);
+    const worker: Worker = new processorClasses[pointer](
+      {
+        ioc,
+        dataSource,
+      },
+      sharedData
+    ) as Worker;
+    return worker;
   }
 }
diff --git a/src/processor/processors/action-trace.processor.input.ts b/src/processor/processors/action-trace.processor.input.ts
deleted file mode 100644
index 312b426..0000000
--- a/src/processor/processors/action-trace.processor.input.ts
+++ /dev/null
@@ -1,51 +0,0 @@
-import { deserialize } from 'v8';
-import { AbisSerialize } from '../../common/abis/abis.serialize';
-import {
-  ActionProcessorContentModel,
-  ProcessorTaskModel,
-} from '../processor-task-queue/processor-task.types';
-
-export class ActionTraceProcessorInput<DataType = unknown> {
-  public static create(model: ProcessorTaskModel) {
-    const { abi, content: buffer } = model;
-    const content: ActionProcessorContentModel = deserialize(buffer);
-    const {
-      actionTrace: {
-        act: { account, data, name },
-        receipt: { recvSequence, globalSequence },
-      },
-      blockNumber,
-      blockTimestamp,
-      transactionId,
-    } = content;
-
-    const deserializedData = AbisSerialize.deserializeAction(
-      account,
-      name,
-      data,
-      abi
-    );
-
-    return new ActionTraceProcessorInput(
-      blockNumber,
-      blockTimestamp,
-      transactionId,
-      account,
-      name,
-      recvSequence,
-      globalSequence,
-      deserializedData
-    );
-  }
-
-  private constructor(
-    public readonly blockNumber: bigint,
-    public readonly blockTimestamp: Date,
-    public readonly transactionId: string,
-    public readonly account: string,
-    public readonly name: string,
-    public readonly recvSequence: bigint,
-    public readonly globalSequence: bigint,
-    public readonly data: DataType
-  ) {}
-}
diff --git a/src/processor/processors/action-trace.processor.ts b/src/processor/processors/action-trace.processor.ts
index 2cd9ffb..da61556 100644
--- a/src/processor/processors/action-trace.processor.ts
+++ b/src/processor/processors/action-trace.processor.ts
@@ -1,16 +1,20 @@
 /* eslint-disable @typescript-eslint/no-unused-vars */
 import { Processor } from './processor';
-import { ActionTraceProcessorInput } from './action-trace.processor.input';
-import { ProcessorTaskModel } from '../processor-task-queue/processor-task.types';
-import { ProcessorSharedData } from '../processor.types';
+import { ActionTraceProcessorModel, ProcessorSharedData } from '../processor.types';
+import { Container, Serializer } from '@alien-worlds/aw-core';

 export class ActionTraceProcessor<
   DataType = unknown,
   SharedDataType = ProcessorSharedData
-> extends Processor {
-  protected input: ActionTraceProcessorInput;
-
-  public async run(data: ProcessorTaskModel): Promise<void> {
-    this.input = ActionTraceProcessorInput.create(data);
+> extends Processor<ActionTraceProcessorModel<DataType>, SharedDataType> {
+  constructor(
+    protected dependencies: {
+      ioc: Container;
+      dataSource: unknown;
+      serializer: Serializer;
+    },
+    protected sharedData: SharedDataType
+  ) {
+    super();
   }
 }
diff --git a/src/processor/processors/delta.processor.input.ts b/src/processor/processors/delta.processor.input.ts
deleted file mode 100644
index 249ac01..0000000
--- a/src/processor/processors/delta.processor.input.ts
+++ /dev/null
@@ -1,65 +0,0 @@
-import { Serialize } from 'eosjs';
-import { deserialize } from 'v8';
-import { AbisSerialize } from '../../common/abis/abis.serialize';
-import {
-  DeltaProcessorContentModel,
-  ProcessorTaskModel,
-} from '../processor-task-queue/processor-task.types';
-
-export class DeltaProcessorInput<DataType = unknown> {
-  public static create(model: ProcessorTaskModel) {
-    const { abi, content: buffer } = model;
-    const content: DeltaProcessorContentModel = deserialize(buffer);
-    const {
-      name,
-      row: { present, data },
-      blockNumber,
-      blockTimestamp,
-    } = content;
-
-    const sb = new Serialize.SerialBuffer({
-      textEncoder: new TextEncoder(),
-      textDecoder: new TextDecoder(),
-      array: data,
-    });
-    sb.get(); // version
-    const code = sb.getName(); // code
-    const scope = sb.getName(); // scope
-    const table = sb.getName(); // table
-    const primaryKey = Buffer.from(sb.getUint8Array(8)).readBigInt64BE(); // primary_key
-    const payer = sb.getName(); // payer
-    const bytes = sb.getBytes(); // data bytes
-
-    const deserializedData = AbisSerialize.deserializeTable(
-      code,
-      table,
-      bytes,
-      abi
-    );
-    return new DeltaProcessorInput(
-      name,
-      code,
-      scope,
-      table,
-      payer,
-      present,
-      primaryKey,
-      blockNumber,
-      blockTimestamp,
-      deserializedData
-    );
-  }
-
-  private constructor(
-    public readonly name: string,
-    public readonly code: string,
-    public readonly scope: string,
-    public readonly table: string,
-    public readonly payer: string,
-    public readonly present: number,
-    public readonly primaryKey: bigint,
-    public readonly blockNumber: bigint,
-    public readonly blockTimestamp: Date,
-    public readonly data: DataType
-  ) {}
-}
diff --git a/src/processor/processors/delta.processor.ts b/src/processor/processors/delta.processor.ts
index f877b5e..15fc313 100644
--- a/src/processor/processors/delta.processor.ts
+++ b/src/processor/processors/delta.processor.ts
@@ -1,16 +1,19 @@
-/* eslint-disable @typescript-eslint/no-unused-vars */
-import { ProcessorTaskModel } from '../processor-task-queue/processor-task.types';
-import { DeltaProcessorInput } from './delta.processor.input';
 import { Processor } from './processor';
-import { ProcessorSharedData } from '../processor.types';
+import { DeltaProcessorModel, ProcessorSharedData } from '../processor.types';
+import { Container, Serializer } from '@alien-worlds/aw-core';

 export class DeltaProcessor<
-  DataType,
+  DataType = unknown,
   SharedDataType = ProcessorSharedData
-> extends Processor {
-  protected input: DeltaProcessorInput;
-
-  public async run(data: ProcessorTaskModel): Promise<void> {
-    this.input = DeltaProcessorInput.create(data);
+> extends Processor<DeltaProcessorModel<DataType>, SharedDataType> {
+  constructor(
+    protected dependencies: {
+      ioc: Container;
+      dataSource: unknown;
+      serializer: Serializer;
+    },
+    protected sharedData: SharedDataType
+  ) {
+    super();
   }
 }
diff --git a/src/processor/processors/index.ts b/src/processor/processors/index.ts
index a09bbd6..e4e9ec1 100644
--- a/src/processor/processors/index.ts
+++ b/src/processor/processors/index.ts
@@ -1,5 +1,3 @@
 export * from './action-trace.processor';
-export * from './action-trace.processor.input';
 export * from './delta.processor';
-export * from './delta.processor.input';
 export * from './processor';
diff --git a/src/processor/processors/processor.ts b/src/processor/processors/processor.ts
index e90dd72..387fe5d 100644
--- a/src/processor/processors/processor.ts
+++ b/src/processor/processors/processor.ts
@@ -1,12 +1,12 @@
 /* eslint-disable @typescript-eslint/no-unused-vars */
-import { ProcessorTaskModel } from '../processor-task-queue/processor-task.types';
-import { Worker } from '../../common/workers/worker';
+import { Worker } from '@alien-worlds/aw-workers';
 import { ProcessorSharedData } from '../processor.types';

 export class Processor<
+  ModelType,
   SharedDataType = ProcessorSharedData
 > extends Worker {
-  public run(data: ProcessorTaskModel): Promise<void> {
+  public run(model: ModelType): Promise<void> {
     throw new Error('Method not implemented');
   }
 }
diff --git a/src/processor/start-processor.ts b/src/processor/start-processor.ts
index 5baf540..a723cb7 100644
--- a/src/processor/start-processor.ts
+++ b/src/processor/start-processor.ts
@@ -1,45 +1,88 @@
-import { Broadcast, log } from '@alien-worlds/api-core';
-import { ProcessorAddons, ProcessorConfig } from './processor.types';
+import { processorWorkerLoaderPath } from './processor.consts';
+import { ProcessorRunner } from './processor.runner';
 import {
   InternalBroadcastChannel,
-  InternalBroadcastClientName,
   InternalBroadcastMessageName,
   ProcessorBroadcastMessage,
 } from '../broadcast';
-import { InternalBroadcastMessage } from '../broadcast/internal-broadcast.message';
-import { ProcessorRunner } from './processor.runner';
+import { processorCommand } from './processor.command';
+import { buildProcessorConfig } from '../config';
+import { ProcessorCommandOptions } from './processor.types';
+import { log, ConfigVars } from '@alien-worlds/aw-core';
+import { BroadcastMessage } from '@alien-worlds/aw-broadcast';
+import { WorkerPool } from '@alien-worlds/aw-workers';
+import { ProcessorConfig, ProcessorAddons } from './processor.config';
+import { ProcessorDependencies } from './processor.dependencies';

-/**
- *
- * @param featuredContent
- * @param broadcastMessageMapper
- * @param config
- */
-export const startProcessor = async (
+export const process = async (
   config: ProcessorConfig,
+  dependencies: ProcessorDependencies,
+  processorsPath: string,
+  featuredCriteriaPath: string,
   addons: ProcessorAddons = {}
 ) => {
   log(`Processor ... [starting]`);
-  const broadcast = await Broadcast.createClient({
-    ...config.broadcast,
-    clientName: InternalBroadcastClientName.Processor,
+
+  const initResult = await dependencies.initialize(
+    config,
+    featuredCriteriaPath,
+    processorsPath,
+    addons
+  );
+
+  if (initResult.isFailure) {
+    throw initResult.failure.error;
+  }
+
+  const {
+    broadcastClient,
+    featuredTraces,
+    featuredDeltas,
+    processorTaskQueue,
+    workerLoaderDependenciesPath,
+    serializer,
+  } = dependencies;
+  const workerPool = await WorkerPool.create({
+    ...config.workers,
+    sharedData: { config, featuredCriteriaPath, processorsPath },
+    workerLoaderPath: config.processorLoaderPath || processorWorkerLoaderPath,
+    workerLoaderDependenciesPath,
   });
-  const runner = await ProcessorRunner.getInstance(config, addons);
+  const runner = new ProcessorRunner(
+    featuredTraces,
+    featuredDeltas,
+    workerPool,
+    processorTaskQueue,
+    serializer
+  );

-  broadcast.onMessage(
+  broadcastClient.onMessage(
     InternalBroadcastChannel.Processor,
-    async (message: InternalBroadcastMessage) => {
-      if (message.content.name === InternalBroadcastMessageName.ProcessorRefresh) {
+    async (message: BroadcastMessage) => {
+      if (message.name === InternalBroadcastMessageName.ProcessorRefresh) {
         runner.next();
       }
     }
   );
-  await broadcast.connect();
+  await broadcastClient.connect();
   // Everything is ready, notify the block-range that the process is ready to work
-  broadcast.sendMessage(ProcessorBroadcastMessage.ready());
+  broadcastClient.sendMessage(ProcessorBroadcastMessage.ready());
   // start processor in case the queue already contains tasks
   runner.next();

   log(`Processor ... [ready]`);
 };
+
+export const startProcessor = (
+  args: string[],
+  dependencies: ProcessorDependencies,
+  processorsPath: string,
+  featuredCriteriaPath: string,
+  addons?: ProcessorAddons
+) => {
+  const vars = new ConfigVars();
+  const options = processorCommand.parse(args).opts<ProcessorCommandOptions>();
+  const config = buildProcessorConfig(vars, dependencies.databaseConfigBuilder, options);
+  process(config, dependencies, processorsPath, featuredCriteriaPath, addons).catch(log);
+};
diff --git a/src/reader/block-range-scanner/block-range-scan.mongo.source.ts b/src/reader/block-range-scanner/block-range-scan.mongo.source.ts
deleted file mode 100644
index 027aeb2..0000000
--- a/src/reader/block-range-scanner/block-range-scan.mongo.source.ts
+++ /dev/null
@@ -1,199 +0,0 @@
-/* eslint-disable @typescript-eslint/no-unsafe-return */
-
-import {
-  CollectionMongoSource,
-  MongoSource,
-  parseToBigInt,
-} from '@alien-worlds/api-core';
-import { Long } from 'mongodb';
-import { BlockRangeScanDocument } from './block-range-scanner.dtos';
-import { BlockNumberOutOfRangeError } from './block-range-scanner.errors';
-
-/**
- * Block range scan nodes data source from the mongo database
- * @class
- */
-export class BlockRangeScanMongoSource extends CollectionMongoSource<BlockRangeScanDocument> {
-  public static Token = 'BLOCK_RANGE_SCAN_MONGO_SOURCE';
-
-  /**
-   * @constructor
-   * @param {MongoSource} mongoSource
-   */
-  constructor(mongoSource: MongoSource) {
-    super(mongoSource, 'history_tools.block_range_scans');
-  }
-
-  private async setCurrentBlockProgress(
-    document: BlockRangeScanDocument,
-    processedBlock: bigint
-  ) {
-    const { _id, is_leaf_node } = document;
-    const { start, end } = _id;
-
-    if (!is_leaf_node) {
-      throw new Error(
-        `(${start.toString()}-${end.toString()}) range has already completed scanning the blockchain.`
-      );
-    }
-
-    if (processedBlock == parseToBigInt(end) - 1n) {
-      await this.findCompletedParentNode(document);
-    } else {
-      await this.collection.updateOne(
-        { _id },
-        {
-          $set: {
-            processed_block: Long.fromBigInt(processedBlock),
-            timestamp: new Date(),
-          },
-        }
-      );
-    }
-  }
-
-  public async startNextScan(scanKey: string): Promise<BlockRangeScanDocument> {
-    const result = await this.collection.findOneAndUpdate(
-      {
-        $and: [
-          { is_leaf_node: true },
-          {
-            $or: [
-              { timestamp: { $exists: false } },
-              /*
-                The trick to not use the same block range again on another thread/worker
-                ...Probably this could be handled better.
-              */
-              { timestamp: { $lt: new Date(Date.now() - 1000) } },
-            ],
-          },
-          { '_id.scan_key': scanKey },
-        ],
-      },
-      { $set: { timestamp: new Date() } },
-      {
-        sort: { timestamp: 1 },
-        returnDocument: 'after',
-      }
-    );
-
-    return result.value as BlockRangeScanDocument;
-  }
-
-  public async countScanNodes(
-    scanKey: string,
-    startBlock: bigint,
-    endBlock: bigint
-  ): Promise<number> {
-    const options: unknown[] = [{ '_id.scan_key': scanKey }];
-
-    if (startBlock) {
-      options.push({ '_id.start': { $gte: Long.fromBigInt(startBlock) } });
-    }
-
-    if (endBlock) {
-      options.push({ '_id.end': { $lte: Long.fromBigInt(endBlock) } });
-    }
-
-    const result = await this.collection.countDocuments({
-      $and: options,
-    });
-
-    return result;
-  }
-
-  public async removeAll(scanKey: string) {
-    await this.collection.deleteMany({ '_id.scan_key': scanKey });
-  }
-
-  public async hasScanKey(
-    scanKey: string,
-    startBlock?: bigint,
-    endBlock?: bigint
-  ): Promise<boolean> {
-    const options: unknown[] = [{ '_id.scan_key': scanKey }];
-
-    if (startBlock) {
-      options.push({ '_id.start': Long.fromBigInt(startBlock) });
-    }
-
-    if (endBlock) {
-      options.push({ '_id.end': Long.fromBigInt(endBlock) });
-    }
-    const dto = await this.collection.findOne({ $and: options });
-
-    return !!dto;
-  }
-
-  public async hasUnscannedNodes(
-    scanKey: string,
-    startBlock?: bigint,
-    endBlock?: bigint
-  ): Promise<boolean> {
-    const options: unknown[] = [
-      { '_id.scan_key': scanKey },
-      { '_id.tree_depth': { $gt: 0 } },
-      { is_leaf_node: true },
-    ];
-
-    if (startBlock) {
-      options.push({ 'parent_id.start': Long.fromBigInt(startBlock) });
-    }
-
-    if (endBlock) {
-      options.push({ 'parent_id.end': Long.fromBigInt(endBlock) });
-    }
-
-    const dto = await this.collection.findOne({ $and: options });
-
-    return !!dto;
-  }
-
-  public async findRangeForBlockNumber(blockNumber: bigint, scanKey: string) {
-    const result = this.collection.find(
-      {
-        '_id.start': { $lte: Long.fromBigInt(blockNumber) },
-        '_id.end': { $gt: Long.fromBigInt(blockNumber) },
-        '_id.scan_key': scanKey,
-        '_id.tree_depth': { $gt: 0 },
-      },
-      { sort: { '_id.tree_depth': -1 } }
-    );
-    const document = await result.next();
-    return document;
-  }
-
-  public async findCompletedParentNode(document: BlockRangeScanDocument) {
-    const { _id, parent_id } = document;
-
-    if (parent_id) {
-      await this.collection.deleteOne({ _id });
-      // fetch all child nodes with parent id that matches this parent_id
-      const matchingParentCount = await this.collection.countDocuments({
-        parent_id,
-      });
-      if (matchingParentCount == 0) {
-        const parentDocument: BlockRangeScanDocument = await this.collection.findOne({
-          _id: parent_id,
-        });
-        await this.findCompletedParentNode(parentDocument);
-      }
-    } else if (_id.tree_depth === 0) {
-      await this.collection.updateOne({ _id }, { $set: { end_timestamp: new Date() } });
-    }
-  }
-
-  public async updateProcessedBlockNumber(
-    scanKey: string,
-    blockNumber: bigint
-  ): Promise<void> {
-    const range: BlockRangeScanDocument = await this.findRangeForBlockNumber(
-      blockNumber,
-      scanKey
-    );
-
-    if (range) {
-      return this.setCurrentBlockProgress(range, blockNumber);
-    }
-  }
-}
diff --git a/src/reader/block-range-scanner/block-range-scanner.dtos.ts b/src/reader/block-range-scanner/block-range-scanner.dtos.ts
deleted file mode 100644
index bbc02ef..0000000
--- a/src/reader/block-range-scanner/block-range-scanner.dtos.ts
+++ /dev/null
@@ -1,24 +0,0 @@
-import { Long } from 'mongodb';
-
-/*
-  We need to keep tree_depth in _id because if the entire scan will use only one node,
-  we will not be able to
create a child document with the same _id and tree_depth: 1.
-*/
-
-export type BlockRangeScanIdDocument = {
-  start: Long;
-  end: Long;
-  scan_key: string;
-  tree_depth: number;
-};
-
-export type BlockRangeScanDocument = {
-  _id: BlockRangeScanIdDocument;
-  hash?: string,
-  processed_block?: Long;
-  timestamp?: Date;
-  start_timestamp?: Date;
-  end_timestamp?: Date;
-  is_leaf_node?: boolean;
-  parent_id?: BlockRangeScanIdDocument;
-};
diff --git a/src/reader/index.ts b/src/reader/index.ts
index efc844c..cc4e5fd 100644
--- a/src/reader/index.ts
+++ b/src/reader/index.ts
@@ -1,7 +1,10 @@
-export * from '../common/blockchain/block-reader';
-export * from './unprocessed-block-queue';
-export * from './reader.types';
 export * from './reader';
-export * from './reader.worker-loader';
+export * from './reader.config';
+export * from './reader.command';
+export * from './reader.consts';
+export * from './reader.dependencies';
+export * from './reader.types';
 export * from './reader.worker';
+export * from './reader.worker-loader';
+export * from './reader.worker-loader.dependencies';
 export * from './start-reader';
diff --git a/src/reader/reader.command.ts b/src/reader/reader.command.ts
new file mode 100644
index 0000000..388e387
--- /dev/null
+++ b/src/reader/reader.command.ts
@@ -0,0 +1,12 @@
+import { Command } from 'commander';
+
+export const readerCommand = new Command();
+
+readerCommand
+  .version('1.0', '-v, --version')
+  .option('-k, --scan-key <scan-key>', 'Scan key')
+  .option('-s, --start-block <start-block>', 'Start at this block')
+  .option('-m, --mode <mode>', 'Mode (default/replay/test)')
+  .option('-e, --end-block <end-block>', 'End block (exclusive)')
+  .option('-t, --threads <threads>', 'Number of threads')
+  .parse(process.argv);
diff --git a/src/reader/reader.config.ts b/src/reader/reader.config.ts
new file mode 100644
index 0000000..d2ffbde
--- /dev/null
+++ b/src/reader/reader.config.ts
@@ -0,0 +1,17 @@
+import { BroadcastConfig } from '@alien-worlds/aw-broadcast';
+import { WorkersConfig } from '@alien-worlds/aw-workers';
+import { BlockRangeScanConfig, UnprocessedBlockQueueConfig } from '../common';
+import { BlockReaderConfig, UnknownObject } from '@alien-worlds/aw-core';
+
+export type ReaderConfig<DatabaseConfig = UnknownObject> = {
+  broadcast?: BroadcastConfig;
+  blockReader?: BlockReaderConfig;
+  database?: DatabaseConfig;
+  scanner?: BlockRangeScanConfig;
+  workers?: WorkersConfig;
+  unprocessedBlockQueue: UnprocessedBlockQueueConfig;
+  mode?: string;
+  maxBlockNumber?: number;
+  startBlock?: bigint;
+  endBlock?: bigint;
+};
diff --git a/src/reader/reader.consts.ts b/src/reader/reader.consts.ts
new file mode 100644
index 0000000..0517de3
--- /dev/null
+++ b/src/reader/reader.consts.ts
@@ -0,0 +1,2 @@
+export const readerWorkerLoaderPath = `${__dirname}/reader.worker-loader`;
+export const readerWorkerLoaderDependenciesPath = `${__dirname}/reader.worker-loader.dependencies`;
diff --git a/src/reader/reader.dependencies.ts b/src/reader/reader.dependencies.ts
new file mode 100644
index 0000000..2a2fec2
--- /dev/null
+++ b/src/reader/reader.dependencies.ts
@@ -0,0 +1,20 @@
+import { BroadcastClient } from '@alien-worlds/aw-broadcast';
+import { Result } from '@alien-worlds/aw-core';
+import { Dependencies } from '../common/dependencies';
+import { BlockRangeScanner, DatabaseConfigBuilder } from '../common';
+import { ReaderConfig } from './reader.config';
+
+/**
+ * An abstract class representing reader dependencies.
+ * @class ReaderDependencies
+ */
+export abstract class ReaderDependencies extends Dependencies {
+  public broadcastClient: BroadcastClient;
+  public scanner: BlockRangeScanner;
+  public workerLoaderPath?: string;
+  public workerLoaderDependenciesPath: string;
+
+  public databaseConfigBuilder: DatabaseConfigBuilder;
+
+  public abstract initialize(config: ReaderConfig): Promise<Result>;
+}
diff --git a/src/reader/reader.ts b/src/reader/reader.ts
index 2d53d8f..a98be8e 100644
--- a/src/reader/reader.ts
+++ b/src/reader/reader.ts
@@ -1,51 +1,23 @@
-import { Broadcast, BroadcastClient, log, MongoSource } from '@alien-worlds/api-core';
-import { BlockRangeScanner } from './block-range-scanner';
-import { Mode } from '../common/common.enums';
-import { WorkerMessage, WorkerPool } from '../common/workers';
-import ReaderWorker from './reader.worker';
-import { ReadCompleteData, ReaderConfig, ReadTaskData } from './reader.types';
-import { InternalBroadcastClientName } from '../broadcast';
+import { ReadCompleteData, ReadTaskData } from './reader.types';
 import { FilterBroadcastMessage } from '../broadcast/messages';
+import { log } from '@alien-worlds/aw-core';
+import { BroadcastClient } from '@alien-worlds/aw-broadcast';
+import { WorkerPool, WorkerMessage } from '@alien-worlds/aw-workers';
+import { BlockRangeScanner, Mode } from '../common';

 export class Reader {
-  public static async create(
-    config: ReaderConfig,
-    broadcastClient?: BroadcastClient
-  ): Promise<Reader> {
-    const mongoSource = await MongoSource.create(config.mongo);
-    const scanner = await BlockRangeScanner.create(mongoSource, config.scanner);
-    const workerPool = await WorkerPool.create({
-      threadsCount: config.workers?.threadsCount || 1,
-      sharedData: { config },
-      defaultWorkerPath: `${__dirname}/reader.worker`,
-      workerLoaderPath: `${__dirname}/reader.worker-loader`,
-    });
-    let broadcast: BroadcastClient;
-    if (!broadcastClient) {
-      broadcast = await Broadcast.createClient({
-        ...config.broadcast,
-        clientName: InternalBroadcastClientName.Reader,
-      });
-      broadcast.connect();
-    } else {
-      broadcast = broadcastClient;
-    }
-    return new Reader(workerPool, scanner, broadcast);
-  }
-
   private loop = false;
-  private isAlreadyStarted = false;
   private initTaskData: ReadTaskData;

-  protected constructor(
-    private workerPool: WorkerPool,
-    private scanner: BlockRangeScanner,
-    private broadcast: BroadcastClient
+  constructor(
+    protected broadcastClient: BroadcastClient,
+    protected scanner: BlockRangeScanner,
+    protected workerPool: WorkerPool
   ) {
     workerPool.onWorkerRelease(async () => {
       const { initTaskData } = this;
       if (initTaskData.mode === Mode.Replay) {
-        if (await this.scanner.hasUnscannedBlocks(initTaskData.scanKey)) {
+        if (await scanner.hasUnscannedBlocks(initTaskData.scanKey)) {
           this.read(initTaskData);
         }
       } else {
@@ -55,7 +27,7 @@ export class Reader {
   }

   private async handleWorkerMessage(message: WorkerMessage) {
-    const { workerPool, broadcast } = this;
+    const { workerPool, broadcastClient } = this;
     const { data, error, workerId } = message;

     if (message.isTaskResolved()) {
@@ -68,13 +40,14 @@
       log(`An unexpected error occurred while reading blocks...`, error);
       workerPool.releaseWorker(workerId);
     } else if (message.isTaskProgress()) {
-      broadcast.sendMessage(FilterBroadcastMessage.refresh());
+      broadcastClient.sendMessage(FilterBroadcastMessage.refresh());
     }
   }

   private async handleWorkerError(id: number, error: Error) {
+    const { workerPool } = this;
     log(`Worker error:`, error);
-    this.workerPool.releaseWorker(id);
+    workerPool.releaseWorker(id);
   }

   public async read(task: ReadTaskData) {
@@ -92,8 +65,9 @@
       );
     }

+    const { workerPool, scanner, initTaskData } = this;
+
     while (this.loop) {
-      const { initTaskData, workerPool } = this;
       const worker = await workerPool.getWorker();
       if (worker) {
         worker.onMessage(message => this.handleWorkerMessage(message));
@@ -102,7 +76,7 @@
       if (task.mode === Mode.Default || task.mode === Mode.Test) {
         worker.run({ startBlock: task.startBlock, endBlock: task.endBlock });
       } else if (task.mode === Mode.Replay) {
-        const scan = await this.scanner.getNextScanNode(task.scanKey);
+        const scan = await scanner.getNextScanNode(task.scanKey);
         if (scan) {
           worker.run({
diff --git a/src/reader/reader.types.ts b/src/reader/reader.types.ts
index 2d97aae..78387bd 100644
--- a/src/reader/reader.types.ts
+++ b/src/reader/reader.types.ts
@@ -1,23 +1,3 @@
-import { BroadcastConfig, MongoConfig } from '@alien-worlds/api-core';
-import { BlockRangeScanConfig } from './block-range-scanner';
-import { WorkersConfig } from '../common/workers';
-import { BlockReaderConfig } from '../common/blockchain/block-reader';
-
-export type ReaderConfig = {
-  broadcast?: BroadcastConfig;
-  blockReader?: BlockReaderConfig;
-  mongo?: MongoConfig;
-  scanner?: BlockRangeScanConfig;
-  workers?: WorkersConfig;
-  mode?: string;
-  maxBlockNumber?: number;
-  blockQueueMaxBytesSize?: number;
-  blockQueueSizeCheckInterval?: number;
-  blockQueueBatchSize?: number;
-  startBlock?: bigint;
-  endBlock?: bigint;
-};
-
 export type ReaderCommandOptions = {
   startBlock?: bigint;
   endBlock?: bigint;
diff --git a/src/reader/reader.worker-loader.dependencies.ts b/src/reader/reader.worker-loader.dependencies.ts
new file mode 100644
index 0000000..b6f565f
--- /dev/null
+++ b/src/reader/reader.worker-loader.dependencies.ts
@@ -0,0 +1,19 @@
+import { WorkerLoaderDependencies } from '@alien-worlds/aw-workers';
+import { UnprocessedBlockQueue } from '../common/unprocessed-block-queue';
+import { BlockRangeScanner, BlockState } from '../common';
+import { ReaderConfig } from './reader.config';
+import { BlockReader } from '@alien-worlds/aw-core';
+
+/**
+ * An abstract class representing ReaderWorkerLoader dependencies.
+ * @class ReaderWorkerLoaderDependencies
+ */
+export abstract class ReaderWorkerLoaderDependencies extends WorkerLoaderDependencies {
+  public blockReader: BlockReader;
+  public blockState: BlockState;
+  public blockQueue: UnprocessedBlockQueue;
+  public scanner: BlockRangeScanner;
+  public config: ReaderConfig;
+
+  public abstract initialize(config: ReaderConfig): Promise<void>;
+}
diff --git a/src/reader/reader.worker-loader.ts b/src/reader/reader.worker-loader.ts
index e20d8dc..a204f03 100644
--- a/src/reader/reader.worker-loader.ts
+++ b/src/reader/reader.worker-loader.ts
@@ -1,46 +1,29 @@
-import { MongoSource, log } from '@alien-worlds/api-core';
-import { BlockReader } from '../common/blockchain';
-import { Worker } from '../common/workers';
-import { DefaultWorkerLoader } from '../common/workers/worker-loader';
-import { UnprocessedBlockQueue } from './unprocessed-block-queue';
+import { Worker, DefaultWorkerLoader } from '@alien-worlds/aw-workers';
+import { ReaderWorkerLoaderDependencies } from './reader.worker-loader.dependencies';
+import { log } from '@alien-worlds/aw-core';
 import ReaderWorker, { ReaderSharedData } from './reader.worker';
-import { BlockRangeScanner, BlockState } from '../common';
-
-export default class ReaderWorkerLoader extends DefaultWorkerLoader {
-  private blockReader: BlockReader;
-  private blockState: BlockState;
-  private blocksQueue: UnprocessedBlockQueue;
-  private scanner: BlockRangeScanner;

+export default class ReaderWorkerLoader extends DefaultWorkerLoader<
+  ReaderSharedData,
+  ReaderWorkerLoaderDependencies
+> {
   public async setup(sharedData: ReaderSharedData): Promise<void> {
-    super.setup(sharedData);
-
+    const { config } = sharedData;
+    await super.setup(sharedData, config);
+    //
+    const {
+      unprocessedBlockQueue: { maxBytesSize, sizeCheckInterval },
+    } = config;
     const {
-      config: {
-        blockReader,
-        mongo,
-        blockQueueBatchSize,
-        blockQueueMaxBytesSize,
-        blockQueueSizeCheckInterval,
-        scanner,
-      },
-    } = sharedData;
-    const mongoSource = await MongoSource.create(mongo);
-    this.blockReader = await BlockReader.create(blockReader);
-    this.blockState = await BlockState.create(mongoSource);
-    this.scanner = await BlockRangeScanner.create(mongoSource, scanner);
-    this.blocksQueue = await UnprocessedBlockQueue.create(
-      mongoSource,
-      blockQueueMaxBytesSize,
-      blockQueueBatchSize
-    );
-    this.blocksQueue.onOverload(size => {
-      const overload = size - blockQueueMaxBytesSize;
+      dependencies: { blockQueue: blocksQueue, blockReader },
+    } = this;
+    blocksQueue.onOverload(size => {
+      const overload = size - maxBytesSize;
       log(`Overload: ${overload} bytes.`);
-      this.blockReader.pause();
+      blockReader.pause();

       let interval = setInterval(async () => {
-        const { content: size, failure } = await this.blocksQueue.getBytesSize();
+        const { content: size, failure } = await blocksQueue.getBytesSize();

         if (failure) {
           log(
@@ -48,24 +31,24 @@ export default class ReaderWorkerLoader extends DefaultWorkerLoader {
-      this.blockReader.pause();
+    blocksQueue.beforeSendBatch(() => {
+      blockReader.pause();
     });
-    this.blocksQueue.afterSendBatch(() => {
-      this.blockReader.resume();
+    blocksQueue.afterSendBatch(() => {
+      blockReader.resume();
     });
-    await this.blockReader.connect();
+    await blockReader.connect();
   }

   public async load(): Promise<Worker> {
-    const { blockReader, scanner, blocksQueue, blockState, sharedData } = this;
-    return new ReaderWorker(blockReader, blocksQueue, blockState, scanner, sharedData);
+    const { dependencies, sharedData } = this;
+    return new ReaderWorker(dependencies, sharedData);
   }
 }
diff --git a/src/reader/reader.worker.ts b/src/reader/reader.worker.ts
index 8f2f289..49198c9 100644
--- a/src/reader/reader.worker.ts
+++ b/src/reader/reader.worker.ts
@@ -1,9 +1,8 @@
-import { log, parseToBigInt } from '@alien-worlds/api-core';
-import { Worker } from '../common/workers/worker';
-import { BlockReader } from '../common/blockchain/block-reader';
-import { ReaderConfig } from './reader.types';
-import { UnprocessedBlockQueue } from './unprocessed-block-queue';
-import { BlockRangeScanner, BlockState, Mode } from '../common';
+/* eslint-disable @typescript-eslint/no-unused-vars */
+import { Worker } from '@alien-worlds/aw-workers';
+import { ReaderConfig } from './reader.config';
+import { BlockReader, log, parseToBigInt } from '@alien-worlds/aw-core';
+import { UnprocessedBlockQueue, BlockState, BlockRangeScanner, Mode } from '../common';

 export type ReaderSharedData = {
   config: ReaderConfig;
@@ -11,18 +10,22 @@ export default class ReaderWorker extends Worker {
   constructor(
-    protected blockReader: BlockReader,
-    protected blockQueue: UnprocessedBlockQueue,
-    protected blockState: BlockState,
-    protected scanner: BlockRangeScanner,
-    sharedData: ReaderSharedData
+    protected dependencies: {
+      blockReader: BlockReader;
+      blockQueue: UnprocessedBlockQueue;
+      blockState: BlockState;
+      scanner: BlockRangeScanner;
+    },
+    protected sharedData: ReaderSharedData
   ) {
     super();
     this.sharedData = sharedData;
   }

   private async updateBlockState(): Promise<void> {
-    const { blockQueue, blockState } = this;
+    const {
+      dependencies: { blockQueue, blockState },
+    } = this;
     const { content: maxBlock } = await blockQueue.getMax();
     if (maxBlock) {
       const { failure } = await blockState.updateBlockNumber(
@@ -48,20 +51,26 @@ export default class ReaderWorker extends Worker {
   private async readInDefaultMode(startBlock: bigint, endBlock: bigint) {
     const {
-      blockReader,
-      blockQueue,
+      dependencies: { blockReader, blockQueue },
       sharedData: {
         config: {
           maxBlockNumber,
           blockReader: { shouldFetchDeltas, shouldFetchTraces, shouldFetchBlock },
-          blockQueueMaxBytesSize,
+          unprocessedBlockQueue,
         },
       },
     } = this;
-
+    const rangeSize = endBlock - startBlock;
     blockReader.onReceivedBlock(async block => {
       const isLast = endBlock === block.thisBlock.blockNumber;
-      const { content: addedBlockNumbers, failure } = await blockQueue.add(block, isLast);
+      const isFastLane =
+        block.thisBlock.blockNumber >= block.lastIrreversible.blockNumber;
+
+      const { content: addedBlockNumbers, failure } = await blockQueue.add(block, {
+        isFastLane,
+        isLast,
+        predictedRangeSize: Number(rangeSize),
+      });

       if (Array.isArray(addedBlockNumbers) && addedBlockNumbers.length > 0) {
         this.logProgress(addedBlockNumbers);
@@ -74,7 +83,7 @@
       } else if (failure?.error.name === 'UnprocessedBlocksOverloadError') {
         log(failure.error.message);
         log(
-          `The size limit ${blockQueueMaxBytesSize} of the unprocessed blocks collection has been exceeded. Blockchain reading suspended until the collection is cleared.`
+          `The size limit ${unprocessedBlockQueue.maxBytesSize} of the unprocessed blocks collection has been exceeded. Blockchain reading suspended until the collection is cleared.`
         );
       } else if (failure) {
         this.reject(failure.error);
@@ -104,17 +113,17 @@
   private async readInReplayMode(startBlock: bigint, endBlock: bigint, scanKey: string) {
     const {
-      blockReader,
-      blockQueue,
-      scanner,
+      dependencies: { blockReader, blockQueue, scanner },
       sharedData: {
         config: {
           blockReader: { shouldFetchDeltas, shouldFetchTraces, shouldFetchBlock },
-          blockQueueMaxBytesSize,
+          unprocessedBlockQueue,
         },
       },
     } = this;
+    const rangeSize = endBlock - startBlock;
+
     blockReader.onReceivedBlock(async block => {
       const { content: addedBlockNumbers, failure } = await blockQueue.add(block);
       if (Array.isArray(addedBlockNumbers) && addedBlockNumbers.length > 0) {
@@ -131,7 +140,7 @@
       } else if (failure?.error.name === 'UnprocessedBlocksOverloadError') {
         log(failure.error.message);
         log(
-          `The size limit ${blockQueueMaxBytesSize} of the unprocessed blocks collection has been exceeded by bytes. Blockchain reading suspended until the collection is cleared.`
+          `The size limit ${unprocessedBlockQueue.maxBytesSize} of the unprocessed blocks collection has been exceeded by bytes. Blockchain reading suspended until the collection is cleared.`
         );
       } else if (failure) {
         this.reject(failure.error);
diff --git a/src/reader/start-reader.ts b/src/reader/start-reader.ts
index 36272f8..71962e3 100644
--- a/src/reader/start-reader.ts
+++ b/src/reader/start-reader.ts
@@ -2,28 +2,47 @@
 /* eslint-disable @typescript-eslint/no-unused-vars */
 import {
   InternalBroadcastChannel,
-  InternalBroadcastClientName,
   InternalBroadcastMessageName,
 } from '../broadcast/internal-broadcast.enums';
-import { Broadcast, log } from '@alien-worlds/api-core';
-import { ReadTaskData, ReaderConfig } from './reader.types';
-import { InternalBroadcastMessage } from '../broadcast/internal-broadcast.message';
+import { ReadTaskData, ReaderCommandOptions } from './reader.types';
 import { ReaderBroadcastMessage } from '../broadcast/messages/reader-broadcast.message';
 import { Reader } from './reader';
+import { readerCommand } from './reader.command';
+import { buildReaderConfig } from '../config';
+import { readerWorkerLoaderPath } from './reader.consts';
+import { log, ConfigVars } from '@alien-worlds/aw-core';
+import { BroadcastMessage } from '@alien-worlds/aw-broadcast';
+import { WorkerPool } from '@alien-worlds/aw-workers';
 import { Mode } from '../common';
+import { ReaderConfig } from './reader.config';
+import { ReaderDependencies } from './reader.dependencies';

 /**
  *
  * @param config
  * @returns
  */
-export const startReader = async (config: ReaderConfig) => {
+export const read = async (config: ReaderConfig, dependencies: ReaderDependencies) => {
   log(`Reader ... [starting]`);
-  const broadcast = await Broadcast.createClient({
-    ...config.broadcast,
-    clientName: InternalBroadcastClientName.Reader,
+
+  const initResult = await dependencies.initialize(config);
+
+  if (initResult.isFailure) {
+    throw initResult.failure.error;
+  }
+
+  const { broadcastClient, scanner, workerLoaderPath, workerLoaderDependenciesPath } =
+    dependencies;
+
+  const workerPool = await WorkerPool.create({
+    ...config.workers,
+    sharedData: { config },
+    workerLoaderPath: workerLoaderPath || readerWorkerLoaderPath,
+    workerLoaderDependenciesPath,
   });
-  const blockRangeReader = await Reader.create(config, broadcast);
+
+  const reader = new Reader(broadcastClient, scanner, workerPool);
+
   let channel: string;
   let readyMessage;
@@ -36,20 +55,22 @@ export const startReader = async (config: ReaderConfig) => {
   }

   log(`Reader started in "listening" mode`);
-  broadcast.onMessage(
-    channel,
-    async (message: InternalBroadcastMessage) => {
-      const {
-        content: { data, name },
-      } = message;
-      if (name === InternalBroadcastMessageName.ReaderTask) {
-        blockRangeReader.read(data);
-      }
+  broadcastClient.onMessage(channel, async (message: BroadcastMessage) => {
+    const { data, name } = message;
+    if (name === InternalBroadcastMessageName.ReaderTask) {
+      reader.read(data);
     }
-  );
-  broadcast.connect();
+  });
+  broadcastClient.connect();
   // Everything is ready, notify the bootstrap that the process is ready to work
-  broadcast.sendMessage(readyMessage);
+  broadcastClient.sendMessage(readyMessage);

  log(`Reader ...
[ready]`); }; + +export const startReader = (args: string[], dependencies: ReaderDependencies) => { + const vars = new ConfigVars(); + const options = readerCommand.parse(args).opts(); + const config = buildReaderConfig(vars, dependencies.databaseConfigBuilder, options); + read(config, dependencies).catch(log); +}; diff --git a/src/reader/unprocessed-block-queue/index.ts b/src/reader/unprocessed-block-queue/index.ts deleted file mode 100644 index afcb8e8..0000000 --- a/src/reader/unprocessed-block-queue/index.ts +++ /dev/null @@ -1,2 +0,0 @@ -export * from './unprocessed-block-queue.errors'; -export * from './unprocessed-block-queue'; diff --git a/src/reader/unprocessed-block-queue/unprocessed-block-queue.ts b/src/reader/unprocessed-block-queue/unprocessed-block-queue.ts deleted file mode 100644 index 8c7aec2..0000000 --- a/src/reader/unprocessed-block-queue/unprocessed-block-queue.ts +++ /dev/null @@ -1,194 +0,0 @@ -import { - CollectionMongoSource, - DataSourceBulkWriteError, - DataSourceOperationError, - Failure, - isMongoConfig, - log, - MongoConfig, - MongoSource, - parseToBigInt, - Result, -} from '@alien-worlds/api-core'; -import { - BlockNotFoundError, - DuplicateBlocksError, - UnprocessedBlocksOverloadError, -} from './unprocessed-block-queue.errors'; -import { Block, BlockDocument } from '../../common/blockchain/block-reader/block'; - -export class BlockMongoCollection extends CollectionMongoSource { - constructor(mongoSource: MongoSource) { - super(mongoSource, 'history_tools.unprocessed_blocks', { - indexes: [ - { - key: { 'this_block.block_num': 1 }, - unique: true, - background: true, - }, - ], - }); - } - - public async next(): Promise { - try { - const result = await this.collection.findOneAndDelete({}); - return result.value; - } catch (error) { - throw DataSourceOperationError.fromError(error); - } - } - - public async bytesSize(): Promise { - const stats = await this.collection.stats(); - return stats.size; - } -} - -export abstract class 
UnprocessedBlockQueueReader { - public abstract next(): Promise>; -} - -export class UnprocessedBlockQueue implements UnprocessedBlockQueueReader { - private cache: Block[]; - private mongo: BlockMongoCollection; - private overloadHandler: (size: number) => void; - private beforeSendBatchHandler: () => void; - private afterSendBatchHandler: () => void; - - public static async create< - T extends UnprocessedBlockQueue | UnprocessedBlockQueueReader - >( - mongo: MongoConfig | MongoSource, - maxBytesSize?: number, - batchSize?: number - ): Promise { - let mongoSource: MongoSource; - if (isMongoConfig(mongo)) { - mongoSource = await MongoSource.create(mongo); - } else { - mongoSource = mongo; - } - return new UnprocessedBlockQueue( - mongoSource, - maxBytesSize || 0, - batchSize | 100 - ) as T; - } - - private constructor( - mongoSource: MongoSource, - private maxBytesSize: number, - private batchSize: number - ) { - this.mongo = new BlockMongoCollection(mongoSource); - this.cache = []; - } - - private async sendBatch() { - const addedBlockNumbers = []; - this.beforeSendBatchHandler(); - const documnets = this.cache.map(block => block.toDocument()); - const result = await this.mongo.insertMany(documnets); - result.forEach(document => { - addedBlockNumbers.push(parseToBigInt(document.this_block.block_num)); - }); - this.cache = []; - - if (this.maxBytesSize > 0 && this.overloadHandler) { - const sorted = addedBlockNumbers.sort(); - const min = sorted[0]; - const max = sorted.reverse()[0]; - - const currentSize = await this.mongo.bytesSize(); - if (currentSize >= this.maxBytesSize) { - this.overloadHandler(currentSize); - throw new UnprocessedBlocksOverloadError(min, max); - } - } - - this.afterSendBatchHandler(); - - return addedBlockNumbers; - } - - public async getBytesSize(): Promise> { - try { - const currentSize = await this.mongo.bytesSize(); - return Result.withContent(currentSize); - } catch (error) { - return Result.withFailure(Failure.fromError(error)); - } - 
} - - public async add(block: Block, isLast = false): Promise> { - try { - let addedBlockNumbers: bigint[] = []; - - if (this.cache.length < this.batchSize) { - this.cache.push(block); - } - - if (this.cache.length === this.batchSize || isLast) { - addedBlockNumbers = await this.sendBatch(); - } - - return Result.withContent(addedBlockNumbers); - } catch (error) { - // it is important to clear the cache in case of errors - this.cache = []; - - if (error instanceof DataSourceBulkWriteError && error.onlyDuplicateErrors) { - this.afterSendBatchHandler(); - return Result.withFailure(Failure.fromError(new DuplicateBlocksError())); - } - return Result.withFailure(Failure.fromError(error)); - } - } - - public async next(): Promise> { - try { - const document = await this.mongo.next(); - if (document) { - if (this.maxBytesSize > -1 && this.afterSendBatchHandler) { - if ((await this.mongo.count({})) === 0 && this.afterSendBatchHandler) { - this.afterSendBatchHandler(); - } - } - - return Result.withContent(Block.fromDocument(document)); - } - return Result.withFailure(Failure.fromError(new BlockNotFoundError())); - } catch (error) { - log(`Could not get next task due to: ${error.message}`); - return Result.withFailure(Failure.fromError(error)); - } - } - - public async getMax(): Promise> { - try { - const documents = await this.mongo.aggregate({ - pipeline: [{ $sort: { 'this_block.block_num': -1 } }, { $limit: 1 }], - }); - if (documents.length > 0) { - return Result.withContent(Block.fromDocument(documents[0])); - } - return Result.withFailure(Failure.fromError(new BlockNotFoundError())); - } catch (error) { - log(`Could not get block with highest block number due to: ${error.message}`); - return Result.withFailure(Failure.fromError(error)); - } - } - - public afterSendBatch(handler: () => void): void { - this.afterSendBatchHandler = handler; - } - - public beforeSendBatch(handler: () => void): void { - this.beforeSendBatchHandler = handler; - } - - public 
onOverload(handler: (size: number) => void): void { - this.overloadHandler = handler; - } -} diff --git a/tutorials/config-vars.md b/tutorials/config-vars.md new file mode 100644 index 0000000..8e4eb0a --- /dev/null +++ b/tutorials/config-vars.md @@ -0,0 +1,52 @@ +# Description of configuration variables + +This is more of a description than a tutorial. Here you will find information about the configuration variables that must be provided for history tools to work. + +[Back to Readme](../README.md) + +## Basic Variables + +| **Name** | **Type** | **Description** | **Default** | +| :------- | :------: | --------------- | :---------: | +|`BLOCKCHAIN_ENDPOINT`| _string_ | Blockchain API address | none | +|`BLOCKCHAIN_CHAIN_ID`| _string_ | Blockchain ID string | none | +|`HYPERION_URL`| _string_ | Hyperion API URL | none | +|`BLOCK_READER_ENDPOINTS`| _string_ | Comma separated list of State History Plugin WS paths | none | +|`BLOCK_READER_FETCH_BLOCK`| _number_ | Number (0/1) value that specifies whether to fetch the signed block data | 1 | +|`BLOCK_READER_FETCH_DELTAS`| _number_ | Number (0/1) value that specifies whether to fetch deltas | 1 | +|`BLOCK_READER_FETCH_TRACES`| _number_ | Number (0/1) value that specifies whether to fetch traces | 1 | +|`READER_MAX_THREADS`| _number_ | Specifies the maximum number of threads dedicated to the reader process | 1 | +|`FILTER_MAX_THREADS`| _number_ | Specifies the maximum number of threads dedicated to the filter process | 1 | +|`PROCESSOR_MAX_THREADS`| _number_ | Specifies the maximum number of threads dedicated to the processor process | 1 | +|`API_PORT`| _number_ | History Tools API port number | none | +|`BROADCAST_PORT`| _number_ | Broadcast port number | none | +|`BROADCAST_HOST`| _string_ | Broadcast host | none | +|`MONGO_HOSTS`| _string_ | Comma separated list of database hosts | none | +|`MONGO_PORTS`| _string_ | Comma separated list of database ports | none | +|`MONGO_DB_NAME`| _string_ | Name of the database | none |
+|`MONGO_USER`| _string_ | Database user | none | +|`MONGO_PASSWORD`| _string_ | Database user password | none | +|`MODE`| _string_ | History Tools run mode label "default"/"replay" | "default" | +|`SCANNER_SCAN_KEY`| _string_ | Label for scanned blocks in replay mode. Its main purpose is to keep a separate history in the database logs of which blocks were scanned for what purpose. It may happen that blocks in the same instance have to be downloaded again; in that case, enter a new label. | none | +|`START_BLOCK`| _number_ | Beginning of block scanning in replay mode. | none | +|`END_BLOCK`| _number_ | End of block scanning in replay mode. | none | + + +## Advanced Variables + +The following settings are intended for more advanced users who want to tune the tools to make better use of the available resources. + +| **Name** | **Type** | **Description** | **Default** | +| :-------------------- | :------: | --------------------- | :---------: | +| `SCANNER_NODES_MAX_CHUNK_SIZE` | _number_ | Specifies the number of blocks in a subset of the block range. Modifying this value may affect scanning speed; more, smaller subsets speed up the process when multiple threads or multiple reader instances are used. | 100 | +| `ABIS_SERVICE_LIMIT` | _number_ | "setabi" action fetch limit | 100 | +| `ABIS_SERVICE_FILTER` | _string_ | "setabi" action filter | "eosio:setabi" | +| `READER_INVIOLABLE_THREADS_COUNT` | _number_ | The number of threads that cannot be allocated to the reader process. | 0 | +| `PROCESSOR_INVIOLABLE_THREADS_COUNT` | _number_ | The number of threads that cannot be allocated to the processor process. | 0 | +| `FILTER_INVIOLABLE_THREADS_COUNT` | _number_ | The number of threads that cannot be allocated to the filter process. | 0 | +| `START_FROM_HEAD` | _number_ | Specifies (1 = true / 0 = false) whether reading blocks should start with the head or the last irreversible block number.
| 0 | + | `UNPROCESSED_BLOCK_QUEUE_MAX_BYTES_SIZE` | _number_ | The maximum size of the queue in bytes. | 256000000 | + | `UNPROCESSED_BLOCK_QUEUE_SIZE_CHECK_INTERVAL` | _number_ | Specifies the waiting time in milliseconds to check that the current queue size in bytes does not exceed the maximum allowed. | 2000 | + | `UNPROCESSED_BLOCK_QUEUE_BATCH_SIZE` | _number_ | Batch size of unprocessed blocks sent to the database at one time. The batch setting can be modified to optimize the transfer consumption to the database. | 100 | + | `UNPROCESSED_BLOCK_QUEUE_FAST_LANE_BATCH_SIZE` | _number_ | Value used when the block number is greater than the last irreversible block number. Batch size of unprocessed blocks sent to the database at one time. The batch setting can be modified to optimize the transfer consumption to the database. | 1 | + | `PROCESSOR_TASK_QUEUE_CHECK_INTERVAL` | _number_ | Time to wait in milliseconds to check if there are new processor tasks available. This option is needed when the filter finishes its work and will not send information about the update to the processor. | 5000 | diff --git a/tutorials/extending-history-tools.md b/tutorials/extending-history-tools.md new file mode 100644 index 0000000..3834fa4 --- /dev/null +++ b/tutorials/extending-history-tools.md @@ -0,0 +1,315 @@ +# Extending History Tools + +In this tutorial, we'll briefly go over how to extend the history tools and, if needed, change the underlying resources. Such a situation may occur when the basic construction of the tools does not meet all expectations. In our case, this happened when creating the leaderboard API. We could have relied on the available architecture and put all the work of updating the leaderboards in processors, but we wanted to separate this process from the others and leave the standard processes (boot, filter, reader, processor) unchanged. We needed another process whose task would be to update the leaderboard database and which would be able to scale properly to increase work efficiency.
We called the new process writer; its role is to read records from the leaderboard_updates collection and create updates in the target leaderboard collections (of which we have several). Using our own example, we will describe ways to extend the capabilities of history tools. + +First, decide how you want to extend the tools. You can download the history-tools repository and add your own implementations of the components responsible for communication with the blockchain and the database. The second method is to create a new repository and import history tools along with the [Starter Kit](https://github.com/Alien-Worlds/aw-history-starter-kit). Since we will continue to use MongoDB and EOS, we will choose the second option. + + +[Back to Readme](../README.md) + +# Table of Contents + +- [Extending History Tools](#extending-history-tools) + - [1. Project Preparation](#1-project-preparation) + - [2. Creating a List of Contracts](#2-creating-a-list-of-contracts) + - [Step 3: Create additional processes](#step-3-create-additional-processes) + - [3.1 Starter](#31-starter) + - [3.2 Worker](#32-worker) + - [3.3 WorkerLoader](#33-workerloader) + - [3.4 WorkerLoaderDependencies](#34-workerloaderdependencies) + - [Step 4: Create Processes](#step-4-create-processes) + - [4.1 Bootstrap](#41-bootstrap) + - [4.2 Broadcasting](#42-broadcasting) + - [4.3 Reader](#43-reader) + - [4.4 Filter](#44-filter) + - [4.5 Processor](#45-processor) + - [Step 5: Create necessary processors](#step-5-create-necessary-processors) +- [Using other resources](#using-other-resources) + +### 1. Project Preparation + +The first step to create history tools is to set up a new project and import the necessary dependencies.
+ +Your `package.json` file should look similar to this: + +```json +{ + "name": "your-history-tools", + "version": "0.0.1", + "description": "your-description", + "packageManager": "yarn@3.2.3", + "main": "build/index.js", + "types": "build/index.d.ts", + "scripts": { + "broadcast": "node build/broadcast/index.js", + "boot": "node build/bootstrap/index.js", + "reader": "node build/reader/index.js", + "filter": "node build/filter/index.js", + "processor": "node build/processor/index.js", + "writer": "node build/writer/index.js", + ... + }, + ... + "dependencies": { + "@alien-worlds/aw-history": "^0.0.136", + ... all your contract packages and other typescript dependencies + } +} +``` + +Note that in `scripts` we have a new `writer` process, which is located in a separate directory like the others. + +### 2. Creating a List of Contracts + +Now you need to define what data you want to download from the blockchain. In our case, this will be the `logmine` action from the `notify.world` contract: + +```json +{ + "traces": [ + { + "shipTraceMessageName": ["transaction_trace_v0"], // Optional + "shipActionTraceMessageName": ["action_trace_v0", "action_trace_v1"], // Optional + "contract": ["notify.world"], + "action": ["logmine"], + "processor": "NotifyWorldTraceProcessor" + } + ], + "deltas": [] +} +``` + +## Step 3: Create additional processes + +Knowing what we will be listening to and what data we want to download from the blockchain, we need to implement the additional processes from scratch. In our case, it will be one whose task is to download data, update the leaderboard and save changes to the appropriate collection. + +#### 3.1 Starter + +Let's start with the startup file. Just like the rest of the processes, it should create the config based on the given options or environment variables, pass the necessary dependencies and run the program.
+ +```typescript +export const startWriter = async (args: string[]) => { + const vars = new ConfigVars(); + const options = writerCommand.parse(args).opts(); // you should implement your command definition + const config: LeaderboardWriterConfig = buildWriterConfig(vars, options); // you should implement your config builder + + const workerPool = await WorkerPool.create({ + ...config.workers, + sharedData: { config }, + workerLoaderPath: `${__dirname}/leaderboard.worker-loader`, + workerLoaderDependenciesPath: `${__dirname}/leaderboard.worker-loader.dependencies`, + }); + + ... + + const writer = await LeaderboardWriter.create(config, workerPool); + writer.next(); +} +``` + +If you want to use workers in your process, as we do in the leaderboard writer, you need to add three more files: the worker, the worker loader and the worker loader dependencies. + +#### 3.2 Worker + +The worker is launched from the workerPool, which you can create directly in the starter or in a separate class. In the Worker class, you need to implement all the worker logic in the body of the `run` method and finally call the `resolve` or `reject` method so as not to block the `workerPool`. Don't forget to override the standard constructor to have access to the `dependencies` provided by the loader. + +```typescript +export default class LeaderboardWorker extends Worker { + constructor( + protected dependencies: { + updatesRepository: LeaderboardUpdateRepository; + updateLeaderboardUseCase: UpdateLeaderboardUseCase; + }, + protected sharedData: LeaderboardSharedData + ) { + super(); + } + + public async run(): Promise<void> { + const { dependencies: { updatesRepository, updateLeaderboardUseCase } } = this; + try { + ... update leaderboards logic + + this.resolve(); + } catch (error) { + this.reject(error); + } + } +} +``` + +#### 3.3 WorkerLoader + +As you know, the worker loader is used to create a worker instance and to initialize and pass on the dependencies needed by the worker.
The `setup` method should be overridden as in the example below if you have dependencies and you need to pass arguments to the `initialize` method (in our case, this is the config object). Another reason may be the need to perform additional setup, e.g. adding event listeners to the created dependencies. In the `load` method, we create an instance of our worker and pass in the created `dependencies`. + +```typescript +export default class LeaderboardWorkerLoader extends DefaultWorkerLoader< +LeaderboardSharedData, +LeaderboardWorkerLoaderDependencies +> { + public async setup(sharedData: LeaderboardSharedData): Promise<void> { + const { config } = sharedData; + await super.setup(sharedData, config); // initialize dependencies + ... additional work e.g. event listeners + } + + public async load(): Promise<LeaderboardWorker> { + const { dependencies, sharedData } = this; + return new LeaderboardWorker(dependencies, sharedData); + } +} +``` + +#### 3.4 WorkerLoaderDependencies + +At this point, the matter is simple: you need to implement all the necessary dependencies. To do this, add the appropriate code in the `initialize` method, which will be called in the worker loader's `setup` method. + +```typescript +export class LeaderboardWorkerLoaderDependencies extends WorkerLoaderDependencies { + public updatesRepository: LeaderboardUpdateRepository; + public updateLeaderboardUseCase: UpdateLeaderboardUseCase; + + public async initialize(config: LeaderboardConfig): Promise<void> { + ... initialize all dependencies + } +} +``` +Remember that when creating a `workerPool` you will have to provide paths to the worker loader and its dependencies. + +## Step 4: Create Processes + +Next, create a folder for each history tools process (`broadcast`, `bootstrap`, `filter`, `reader`, `processor`). Each of these should have an `index.ts` file within.
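As a quick sketch, the folder layout described above (plus the custom `writer` process from Step 3) can be scaffolded with a few lines of plain Node — the directory names follow this tutorial and are not part of any History Tools API:

```typescript
import * as fs from 'fs';
import * as path from 'path';

// One folder per process; "writer" is the custom process added in Step 3.
const processes = ['broadcast', 'bootstrap', 'filter', 'reader', 'processor', 'writer'];

for (const name of processes) {
  const dir = path.join('src', name);
  fs.mkdirSync(dir, { recursive: true });
  // Each process gets an empty index.ts entry point, matching the "scripts" section of package.json.
  fs.writeFileSync(path.join(dir, 'index.ts'), '');
}
```

The `index.ts` files are then filled in as shown in the following subsections.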
+ +#### 4.1 Bootstrap + +In the `bootstrap` directory, create an `index.ts` file and add the following content: + +```typescript +import { + startBootstrap, + DefaultBootstrapDependencies, +} from '@alien-worlds/aw-history-starter-kit'; +import path from 'path'; + +startBootstrap( + process.argv, + new DefaultBootstrapDependencies(), + path.join(__dirname, '../../your.featured.json') +); +``` + +#### 4.2 Broadcasting + +In the `broadcast` directory, create an `index.ts` file and add the following content: + +```typescript +import { startBroadcast } from '@alien-worlds/aw-history-starter-kit'; + +startBroadcast(); +``` + +#### 4.3 Reader + +In the `reader` directory, create an `index.ts` file and add the following content: + +```typescript +import { + startReader, + DefaultReaderDependencies, +} from '@alien-worlds/aw-history-starter-kit'; + +startReader(process.argv, new DefaultReaderDependencies()); +``` + +#### 4.4 Filter + +In the `filter` directory, create an `index.ts` file and add the following content: + +```typescript +import { + startFilter, + DefaultFilterDependencies, +} from '@alien-worlds/aw-history-starter-kit'; +import path from 'path'; + +startFilter( + process.argv, + new DefaultFilterDependencies(), + path.join(__dirname, '../../your.featured.json') +); +``` + +#### 4.5 Processor + +In the `processor` directory, create an `index.ts` file and add the following content: + +```typescript +import { + startProcessor, + DefaultProcessorDependencies, +} from '@alien-worlds/aw-history-starter-kit'; +import path from 'path'; + +startProcessor( + process.argv, + new DefaultProcessorDependencies(), + path.join(__dirname, './processors'), + path.join(__dirname, '../../your.featured.json') +); +``` + +## Step 5: Create necessary processors + +Finally, create a `processors` folder where you will store all of the processor files. 
+To ensure that `ProcessorWorkerLoader` will create an instance of the processor, you have to make sure that processor classes are **exported individually** in the 'processors' folder and not as default exports. Follow these instructions: + +1. Create the 'processors' folder in your processor directory if it doesn't exist already. + +2. Inside the 'processors' folder, create a separate TypeScript file for each class you want to export. For example, Processor1.ts, Processor2.ts, etc. + +3. In each class file (e.g., Processor1.ts, Processor2.ts, etc.), define your classes using the `export` keyword. Each class should have its own file. + +Example: + +```typescript +// ./processors/index.ts +export { NotifyWorldTraceProcessor } from './notify-world.trace-processor'; + +... + +// ./processors/notify-world.trace-processor.ts +import { ActionTraceProcessor, ProcessorTaskModel } from '@alien-worlds/aw-history-starter-kit'; + +export class NotifyWorldTraceProcessor extends ActionTraceProcessor { + public async run(model: ProcessorTaskModel): Promise<void> { + try { + ... // extract name, data, blockNumber, blockTimestamp, etc. from the model + + if (name === NotifyWorldActionName.Logmine) { + const update = LeaderboardUpdate.fromLogmineJson( + blockNumber, + blockTimestamp, + data, + tlmDecimalPrecision + ); + + const json = update.toJson(); + + ... + + const updateResult = await leaderboardUpdates.add(LeaderboardUpdate.fromJson(json)); + } + + this.resolve(); + } catch (error) { + this.reject(error); + } + } +} +``` + +That's it. The examples shown are of course partial, but you should get a general idea of what to do if you want to extend History Tools by adding more processes. + +## Using other resources + +Apart from simply adding new processes or modifying existing ones, you may want to use another database (replacing the default MongoDB) or another blockchain (reading tools).
In that case, you should check the [Starter Kit](https://github.com/Alien-Worlds/aw-history-starter-kit) implementation and write your own based on the interfaces used in [Api Core](https://github.com/Alien-Worlds/api-core) and **History Tools**. Check the contents of this package and replace e.g. all mongo.\* components with your own. Theoretically, if you follow the interface guidelines, everything should work fine, including serialization and the block reader. The **kit** prepared in this way should then be imported into your history tools implementation, following the guidelines mentioned above. + +Remember, if your **kit** works and meets all requirements, it's worth thinking about sharing it with other users. More _starter-kit_ type repositories may be useful, and maybe more users will benefit from your work. Good luck! diff --git a/tutorials/what-is-featured-content.md b/tutorials/what-is-featured-content.md new file mode 100644 index 0000000..c432f07 --- /dev/null +++ b/tutorials/what-is-featured-content.md @@ -0,0 +1,128 @@ +# What is "featured" content? + +## Table of Contents + +1. [Introduction](#introduction) +2. [JSON object construction](#json-object-construction) +3. [Using featured content in the project](#using-featured-content-in-the-project) + - [Featured](#featured) + - [FeaturedContracts](#featuredcontracts) + +## Introduction + +The history tools work by reading data from the blockchain and need guidelines on what to extract and how to process it. This data extraction is driven by a JSON object containing two arrays: traces and deltas. This JSON object sets the criteria defining what action or table we are looking for and which processor to use to process the data contained in that action or delta. + +Without this JSON configuration, only block reading will occur, and no data will be extracted. Thus, the correct configuration of this JSON object is crucial.
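To make the role of this object concrete, here is a minimal sketch of consuming such a criteria object and collecting the contract names it references (plain TypeScript; the type shapes are inferred from the examples in this tutorial and are not part of the History Tools API):

```typescript
// Shapes inferred from the featured JSON examples in this tutorial; illustrative only.
type TraceCriteria = { contract: string[]; action: string[]; processor: string };
type DeltaCriteria = { code: string[]; scope: string[]; table: string[]; processor: string };
type FeaturedCriteria = { traces: TraceCriteria[]; deltas: DeltaCriteria[] };

const criteria: FeaturedCriteria = {
  traces: [{ contract: ['dao.worlds'], action: ['*'], processor: 'NotifyWorldTraceProcessor' }],
  deltas: [{ code: ['ref.worlds'], scope: ['*'], table: ['*'], processor: 'RefWorldsDeltaProcessor' }],
};

// Collect every contract name referenced by the criteria
// (traces name contracts in `contract`, deltas in `code`).
const contracts = new Set<string>([
  ...criteria.traces.flatMap(t => t.contract),
  ...criteria.deltas.flatMap(d => d.code),
]);
```

This mirrors, in spirit, what the featured utilities do when extracting contract names from the configuration.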
+ +## JSON object construction + +Below is an example of a JSON object configuration that we use in our history tools. It contains two tables: traces and deltas. + +```json +{ + "traces": [ + { + "shipTraceMessageName": ["transaction_trace_v0"], // Optional + "shipActionTraceMessageName": ["action_trace_v0", "action_trace_v1"], // Optional + "contract": ["dao.worlds"], + "action": ["*"], + "processor": "NotifyWorldTraceProcessor" + } + ], + "deltas": [ + { + "shipDeltaMessageName": ["table_delta_v0"], // Optional + "name": ["contract_row"], + "code": ["ref.worlds"], + "scope": ["*"], + "table": ["*"], + "processor": "RefWorldsDeltaProcessor" + } + ] +} +``` + +In traces, you need to provide `contract`, `action`, and `processor`. They are arrays because if you want to use the same processor for several different actions or contracts, there's no need to create a new object for each one. Instead, you can enter the name of the contract or action in the table. The symbol "\*" stands for "all possible" options. Avoid using the wildcard in the contract name tables (`contract` and `code`) as this might result in too much unnecessary data being downloaded. + +```json +// example of using one processor for all actions of the listed contracts +"contract": ["dao.worlds", "alien.worlds", "index.words"], +"action": ["*"], +"processor": "YourProcessorClassName", + +// example of using one processor for selected contract actions +"contract": ["dao.worlds"], +"action": ["appointcust", "flagcandprof", "newperiod"], +"processor": "YourProcessorClassName", +``` + +The same rules apply to `deltas` where the `name` will always be "contract_row", followed by `code`, `scope`, and `table`. + +In `traces` and `deltas`, optional keys: `shipTraceMessageName`, `shipActionTraceMessageName`, and `shipDeltaMessageName` are present. They are the names with versions of the structs containing data for transactions and deltas, respectively. 
If you use our **Starter Kit**, you don't need to provide these values in JSON because they will be added by featured scripts. However, remember about them if there's an update of one of the versions. Keep references to the previous ones (`v0` at the moment) to be able to read blocks generated before the API change. + +## Using featured content in the project + +### Featured + +An instance of the `Featured` class represents a set of criteria contained in the JSON configuration. Two instances are created, one for `traces` and the other for `deltas`. + +```typescript +const featuredTraces: Featured = new Featured( + featuredCriteria.traces, + { + shipTraceMessageName: [], + shipActionTraceMessageName: [], + contract: [], + action: [], + }, + { + shipTraceMessageName: ['transaction_trace_v0'], + shipActionTraceMessageName: ['action_trace_v0', 'action_trace_v1'], + } +); + +const featuredDeltas: Featured = new Featured( + featuredCriteria.deltas, + { + shipDeltaMessageName: [], + name: [], + code: [], + scope: [], + table: [], + }, + { shipDeltaMessageName: ['table_delta_v0'] } +); +``` + +The order of keys in the pattern is important. The processor tasks have `short_id` and `label` fields built from the values given in JSON objects. + +```json +"short_id" : "notify.world:logmine", +"label" : "transaction_trace_v0:action_trace_v1:notify.world:logmine", + +// corresponds to the values in the JSON object: +{ + "shipTraceMessageName": ["transaction_trace_v0"], + "shipActionTraceMessageName": ["action_trace_v1"], + "contract": ["notify.world"], + "action": ["logmine"], +} +``` + +The `label` string is composed of names separated by ':' and the order corresponds to that defined in the pattern for featured traces. + +_Note: The current solution may change as we are considering different alternatives._ + +### FeaturedContracts + +The `FeaturedContracts` is a repository containing the contract data you put in the JSON file. 
The `FeaturedUtils.readFeaturedContracts` function can be used to extract the names of all contracts and retrieve their data via `SmartContractService`. + +To create `FeaturedContracts`, you can use the `FeaturedContractsCreator` available in the [Starter Kit](https://github.com/Alien-Worlds/aw-history-starter-kit). If you want to use other sources and there is no suitable KIT for them, check the wizard implementation from the starter kit and implement it according to your needs. + +### FeaturedUtils + +FeaturedUtils includes some tools used inside other featured components or History Tools processes + +- `readFeaturedContracts(data)`: Extracts contract names from the provided json object +- `fetchCriteria(filePath)`: Gets the criteria json object from the given local file or URL + diff --git a/yarn.lock b/yarn.lock index cb8112f..01a9f13 100644 --- a/yarn.lock +++ b/yarn.lock @@ -2,30 +2,31 @@ # yarn lockfile v1 -"@acuminous/bitsyntax@^0.1.2": - version "0.1.2" - resolved "https://registry.npmjs.org/@acuminous/bitsyntax/-/bitsyntax-0.1.2.tgz" - integrity sha512-29lUK80d1muEQqiUsSo+3A0yP6CdspgC95EnKBMi22Xlwt79i/En4Vr67+cXhU+cZjbti3TgGGC5wy1stIywVQ== +"@alien-worlds/aw-broadcast@^0.0.6": + version "0.0.6" + resolved "https://npm.pkg.github.com/download/@alien-worlds/aw-broadcast/0.0.6/361699552df577c674cfd07d55a89173a8ec69a1#361699552df577c674cfd07d55a89173a8ec69a1" + integrity sha512-n08okhDoxCr1urpCpKXlXXUcqz2MO7ifjz4jGkgwJgjycJvgi8qm5O5t6HVRjv9MYV3OSJSjAoYls8Q6sPFdHQ== dependencies: - buffer-more-ints "~1.0.0" - debug "^4.3.4" - safe-buffer "~5.1.2" + "@alien-worlds/aw-core" "^0.0.13" + nanoid "^3.0.0" -"@alien-worlds/api-core@^0.0.101": - version "0.0.101" - resolved "https://npm.pkg.github.com/download/@alien-worlds/api-core/0.0.101/c67f2c0741a464be6d0391b98691ca9c2ac74eb9#c67f2c0741a464be6d0391b98691ca9c2ac74eb9" - integrity sha512-9hgizhXjYL+PsxsYnrwwL3KRHVvuiVaPjovM2w6U06PTbLXYDkalHQ+em15OQ29DkOd8WGoM1cWW+LTLx6Ec+w== +"@alien-worlds/aw-core@^0.0.13": + 
version "0.0.13" + resolved "https://npm.pkg.github.com/download/@alien-worlds/aw-core/0.0.13/f9d8ad2b2db4b99c52da69c075a4c34950602e73#f9d8ad2b2db4b99c52da69c075a4c34950602e73" + integrity sha512-v82R+oqik+3IaXH9KllFQGOfkP9P6C97zTt++TleWlB1V7wQdQdFtItgN/0eEo+VdHjdmUryLUEkQM4WNn1kyw== dependencies: - "@google-cloud/bigquery" "^6.0.3" - amqplib "^0.10.3" - eosjs "^22.1.0" inversify "^6.0.1" - mongodb "4.9.1" - nanoid "^3.0.0" - node-fetch "2" - redis "^4.6.5" + node-fetch "2.6.6" reflect-metadata "^0.1.13" +"@alien-worlds/aw-workers@^0.0.2": + version "0.0.2" + resolved "https://npm.pkg.github.com/download/@alien-worlds/aw-workers/0.0.2/38b0f98ca4327d71df0e5b21a59b7a13b760a654#38b0f98ca4327d71df0e5b21a59b7a13b760a654" + integrity sha512-Sdm5OI7lTJbBXXxXh3bGsc2fB/I6v55hZKFY8omPIi68M+P9xlfqJDU8AoZ+zfb4G9iH60Mn6kiLLNe0yKqeBQ== + dependencies: + async "^3.2.4" + ts-node "^10.9.1" + "@ampproject/remapping@^2.1.0": version "2.2.0" resolved "https://registry.npmjs.org/@ampproject/remapping/-/remapping-2.2.0.tgz" @@ -321,13 +322,6 @@ dependencies: "@jridgewell/trace-mapping" "0.3.9" -"@eosrio/node-abieos@^1": - version "1.0.6" - resolved "https://registry.npmjs.org/@eosrio/node-abieos/-/node-abieos-1.0.6.tgz" - integrity sha512-Y/c0Hi8ylvlHAwUiyI198orbFLfMY2eRtvPWqW4PLudF9yFUXfQwZueDqmitGL2mgTEjXt+WbRGsQc2Oe1C2tA== - dependencies: - node-addon-api "2.0.0" - "@eslint/eslintrc@^1.3.3": version "1.3.3" resolved "https://registry.npmjs.org/@eslint/eslintrc/-/eslintrc-1.3.3.tgz" @@ -343,57 +337,6 @@ minimatch "^3.1.2" strip-json-comments "^3.1.1" -"@google-cloud/bigquery@^6.0.3": - version "6.0.3" - resolved "https://registry.npmjs.org/@google-cloud/bigquery/-/bigquery-6.0.3.tgz" - integrity sha512-BP464228S9dqDCb4dR99h9D8+N498YZi/AZvoOJUaieg2H6qbiYBE1xlYuaMvyV1WEQT/2/yZTCJnCo5WiaY0Q== - dependencies: - "@google-cloud/common" "^4.0.0" - "@google-cloud/paginator" "^4.0.0" - "@google-cloud/promisify" "^3.0.0" - arrify "^2.0.1" - big.js "^6.0.0" - duplexify "^4.0.0" - extend 
"^3.0.2" - is "^3.3.0" - p-event "^4.1.0" - readable-stream "^4.0.0" - stream-events "^1.0.5" - uuid "^8.0.0" - -"@google-cloud/common@^4.0.0": - version "4.0.3" - resolved "https://registry.npmjs.org/@google-cloud/common/-/common-4.0.3.tgz" - integrity sha512-fUoMo5b8iAKbrYpneIRV3z95AlxVJPrjpevxs4SKoclngWZvTXBSGpNisF5+x5m+oNGve7jfB1e6vNBZBUs7Fw== - dependencies: - "@google-cloud/projectify" "^3.0.0" - "@google-cloud/promisify" "^3.0.0" - arrify "^2.0.1" - duplexify "^4.1.1" - ent "^2.2.0" - extend "^3.0.2" - google-auth-library "^8.0.2" - retry-request "^5.0.0" - teeny-request "^8.0.0" - -"@google-cloud/paginator@^4.0.0": - version "4.0.1" - resolved "https://registry.npmjs.org/@google-cloud/paginator/-/paginator-4.0.1.tgz" - integrity sha512-6G1ui6bWhNyHjmbYwavdN7mpVPRBtyDg/bfqBTAlwr413On2TnFNfDxc9UhTJctkgoCDgQXEKiRPLPR9USlkbQ== - dependencies: - arrify "^2.0.0" - extend "^3.0.2" - -"@google-cloud/projectify@^3.0.0": - version "3.0.0" - resolved "https://registry.npmjs.org/@google-cloud/projectify/-/projectify-3.0.0.tgz" - integrity sha512-HRkZsNmjScY6Li8/kb70wjGlDDyLkVk3KvoEo9uIoxSjYLJasGiCch9+PqRVDOCGUFvEIqyogl+BeqILL4OJHA== - -"@google-cloud/promisify@^3.0.0": - version "3.0.1" - resolved "https://registry.npmjs.org/@google-cloud/promisify/-/promisify-3.0.1.tgz" - integrity sha512-z1CjRjtQyBOYL+5Qr9DdYIfrdLBe746jRTYfaYU6MeXkqp7UfYs/jX16lFFVzZ7PGEJvqZNqYUEtb1mvDww4pA== - "@humanwhocodes/config-array@^0.11.6": version "0.11.6" resolved "https://registry.npmjs.org/@humanwhocodes/config-array/-/config-array-0.11.6.tgz" @@ -667,40 +610,6 @@ "@nodelib/fs.scandir" "2.1.5" fastq "^1.6.0" -"@redis/bloom@1.2.0": - version "1.2.0" - resolved "https://registry.yarnpkg.com/@redis/bloom/-/bloom-1.2.0.tgz#d3fd6d3c0af3ef92f26767b56414a370c7b63b71" - integrity sha512-HG2DFjYKbpNmVXsa0keLHp/3leGJz1mjh09f2RLGGLQZzSHpkmZWuwJbAvo3QcRY8p80m5+ZdXZdYOSBLlp7Cg== - -"@redis/client@1.5.6": - version "1.5.6" - resolved 
"https://registry.yarnpkg.com/@redis/client/-/client-1.5.6.tgz#869cc65718d7d5493ef655a71dc40f3bc64a1b28" - integrity sha512-dFD1S6je+A47Lj22jN/upVU2fj4huR7S9APd7/ziUXsIXDL+11GPYti4Suv5y8FuXaN+0ZG4JF+y1houEJ7ToA== - dependencies: - cluster-key-slot "1.1.2" - generic-pool "3.9.0" - yallist "4.0.0" - -"@redis/graph@1.1.0": - version "1.1.0" - resolved "https://registry.yarnpkg.com/@redis/graph/-/graph-1.1.0.tgz#cc2b82e5141a29ada2cce7d267a6b74baa6dd519" - integrity sha512-16yZWngxyXPd+MJxeSr0dqh2AIOi8j9yXKcKCwVaKDbH3HTuETpDVPcLujhFYVPtYrngSco31BUcSa9TH31Gqg== - -"@redis/json@1.0.4": - version "1.0.4" - resolved "https://registry.yarnpkg.com/@redis/json/-/json-1.0.4.tgz#f372b5f93324e6ffb7f16aadcbcb4e5c3d39bda1" - integrity sha512-LUZE2Gdrhg0Rx7AN+cZkb1e6HjoSKaeeW8rYnt89Tly13GBI5eP4CwDVr+MY8BAYfCg4/N15OUrtLoona9uSgw== - -"@redis/search@1.1.2": - version "1.1.2" - resolved "https://registry.yarnpkg.com/@redis/search/-/search-1.1.2.tgz#6a8f66ba90812d39c2457420f859ce8fbd8f3838" - integrity sha512-/cMfstG/fOh/SsE+4/BQGeuH/JJloeWuH+qJzM8dbxuWvdWibWAOAHHCZTMPhV3xIlH4/cUEIA8OV5QnYpaVoA== - -"@redis/time-series@1.0.4": - version "1.0.4" - resolved "https://registry.yarnpkg.com/@redis/time-series/-/time-series-1.0.4.tgz#af85eb080f6934580e4d3b58046026b6c2b18717" - integrity sha512-ThUIgo2U/g7cCuZavucQTQzA9g9JbDDY2f64u3AbAoz/8vE2lt2U37LamDUVChhaDA3IRT9R6VvJwqnUfTJzng== - "@sinonjs/commons@^1.7.0": version "1.8.3" resolved "https://registry.npmjs.org/@sinonjs/commons/-/commons-1.8.3.tgz" @@ -720,11 +629,6 @@ resolved "https://registry.npmjs.org/@tootallnate/once/-/once-1.1.2.tgz" integrity sha512-RbzJvlNzmRq5c3O09UipeuXno4tA1FE6ikOjxZK0tuxVv3412l64l5t1W5pj4+rJq9vpkm/kwiR07aZXnsKPxw== -"@tootallnate/once@2": - version "2.0.0" - resolved "https://registry.npmjs.org/@tootallnate/once/-/once-2.0.0.tgz" - integrity sha512-XCuKFP5PS55gnMVu3dty8KPatLqUoy/ZYzDzAGCQ8JNFCkLXzmI7vNHCR+XpbZaMWQK/vQubr7PkYq8g470J/A== - "@tsconfig/node10@^1.0.7": version "1.0.9" resolved 
"https://registry.npmjs.org/@tsconfig/node10/-/node10-1.0.9.tgz" @@ -778,40 +682,6 @@ dependencies: "@babel/types" "^7.3.0" -"@types/body-parser@*": - version "1.19.2" - resolved "https://registry.yarnpkg.com/@types/body-parser/-/body-parser-1.19.2.tgz#aea2059e28b7658639081347ac4fab3de166e6f0" - integrity sha512-ALYone6pm6QmwZoAgeyNksccT9Q4AWZQ6PvfwR37GT6r6FWUPguq6sUmNGSMV2Wr761oQoBxwGGa6DR5o1DC9g== - dependencies: - "@types/connect" "*" - "@types/node" "*" - -"@types/connect@*": - version "3.4.35" - resolved "https://registry.yarnpkg.com/@types/connect/-/connect-3.4.35.tgz#5fcf6ae445e4021d1fc2219a4873cc73a3bb2ad1" - integrity sha512-cdeYyv4KWoEgpBISTxWvqYsVy444DOqehiF3fM3ne10AmJ62RSyNkUnxMJXHQWRQQX2eR94m5y1IZyDwBjV9FQ== - dependencies: - "@types/node" "*" - -"@types/express-serve-static-core@^4.17.33": - version "4.17.33" - resolved "https://registry.yarnpkg.com/@types/express-serve-static-core/-/express-serve-static-core-4.17.33.tgz#de35d30a9d637dc1450ad18dd583d75d5733d543" - integrity sha512-TPBqmR/HRYI3eC2E5hmiivIzv+bidAfXofM+sbonAGvyDhySGw9/PQZFt2BLOrjUUR++4eJVpx6KnLQK1Fk9tA== - dependencies: - "@types/node" "*" - "@types/qs" "*" - "@types/range-parser" "*" - -"@types/express@^4.17.17": - version "4.17.17" - resolved "https://registry.yarnpkg.com/@types/express/-/express-4.17.17.tgz#01d5437f6ef9cfa8668e616e13c2f2ac9a491ae4" - integrity sha512-Q4FmmuLGBG58btUnfS1c1r/NQdlp3DMfGDGig8WhfpA2YRUtEkxAjkZb0yvplJGYdF1fsQ81iMDcH24sSCNC/Q== - dependencies: - "@types/body-parser" "*" - "@types/express-serve-static-core" "^4.17.33" - "@types/qs" "*" - "@types/serve-static" "*" - "@types/graceful-fs@^4.1.2": version "4.1.5" resolved "https://registry.npmjs.org/@types/graceful-fs/-/graceful-fs-4.1.5.tgz" @@ -851,11 +721,6 @@ resolved "https://registry.npmjs.org/@types/json-schema/-/json-schema-7.0.11.tgz" integrity sha512-wOuvG1SN4Us4rez+tylwwwCV1psiNVOkJeM3AUWUNWg/jDQY2+HE/444y5gc+jBmRqASOm2Oeh5c1axHobwRKQ== -"@types/mime@*": - version "3.0.1" - resolved 
"https://registry.yarnpkg.com/@types/mime/-/mime-3.0.1.tgz#5f8f2bca0a5863cb69bc0b0acd88c96cb1d4ae10" - integrity sha512-Y4XFY5VJAuw0FgAqPNd6NNoV44jbq9Bz2L7Rh/J6jLTiHBSBJa9fxqQIvkIld4GsoDOcCbvzOUAbLPsSKKg+uA== - "@types/node-fetch@2.x": version "2.6.2" resolved "https://registry.npmjs.org/@types/node-fetch/-/node-fetch-2.6.2.tgz" @@ -874,47 +739,16 @@ resolved "https://registry.npmjs.org/@types/prettier/-/prettier-2.7.1.tgz" integrity sha512-ri0UmynRRvZiiUJdiz38MmIblKK+oH30MztdBVR95dv/Ubw6neWSb8u1XpRb72L4qsZOhz+L+z9JD40SJmfWow== -"@types/qs@*": - version "6.9.7" - resolved "https://registry.yarnpkg.com/@types/qs/-/qs-6.9.7.tgz#63bb7d067db107cc1e457c303bc25d511febf6cb" - integrity sha512-FGa1F62FT09qcrueBA6qYTrJPVDzah9a+493+o2PCXsesWHIn27G98TsSMs3WPNbZIEj4+VJf6saSFpvD+3Zsw== - -"@types/range-parser@*": - version "1.2.4" - resolved "https://registry.yarnpkg.com/@types/range-parser/-/range-parser-1.2.4.tgz#cd667bcfdd025213aafb7ca5915a932590acdcdc" - integrity sha512-EEhsLsD6UsDM1yFhAvy0Cjr6VwmpMWqFBCb9w07wVugF7w9nfajxLuVmngTIpgS6svCnm6Vaw+MZhoDCKnOfsw== - "@types/semver@^7.3.12": version "7.3.13" resolved "https://registry.npmjs.org/@types/semver/-/semver-7.3.13.tgz" integrity sha512-21cFJr9z3g5dW8B0CVI9g2O9beqaThGQ6ZFBqHfwhzLDKUxaqTIy3vnfah/UPkfOiF2pLq+tGz+W8RyCskuslw== -"@types/serve-static@*": - version "1.15.0" - resolved "https://registry.yarnpkg.com/@types/serve-static/-/serve-static-1.15.0.tgz#c7930ff61afb334e121a9da780aac0d9b8f34155" - integrity sha512-z5xyF6uh8CbjAu9760KDKsH2FcDxZ2tFCsA4HIMWE6IkiYMXfVoa+4f9KX+FN0ZLsaMw1WNG2ETLA6N+/YA+cg== - dependencies: - "@types/mime" "*" - "@types/node" "*" - "@types/stack-utils@^2.0.0": version "2.0.1" resolved "https://registry.npmjs.org/@types/stack-utils/-/stack-utils-2.0.1.tgz" integrity sha512-Hl219/BT5fLAaz6NDkSuhzasy49dwQS/DSdu4MdggFB8zcXv7vflBI3xp7FEmkmdDkBUI2bPUNeMttp2knYdxw== -"@types/webidl-conversions@*": - version "7.0.0" - resolved 
"https://registry.npmjs.org/@types/webidl-conversions/-/webidl-conversions-7.0.0.tgz" - integrity sha512-xTE1E+YF4aWPJJeUzaZI5DRntlkY3+BCVJi0axFptnjGmAoWxkyREIh/XMrfxVLejwQxMCfDXdICo0VLxThrog== - -"@types/whatwg-url@^8.2.1": - version "8.2.2" - resolved "https://registry.npmjs.org/@types/whatwg-url/-/whatwg-url-8.2.2.tgz" - integrity sha512-FtQu10RWgn3D9U4aazdwIE2yzphmTJREDqNdODHrbrZmmMqI0vMheC/6NE/J1Yveaj8H+ela+YwWTjq5PGmuhA== - dependencies: - "@types/node" "*" - "@types/webidl-conversions" "*" - "@types/yargs-parser@*": version "21.0.0" resolved "https://registry.npmjs.org/@types/yargs-parser/-/yargs-parser-21.0.0.tgz" @@ -1014,21 +848,6 @@ abab@^2.0.3, abab@^2.0.5: resolved "https://registry.npmjs.org/abab/-/abab-2.0.6.tgz" integrity sha512-j2afSsaIENvHZN2B8GOpF566vZ5WVk5opAiMTvWgaQT8DkbOqsTfvNAvHoRGU2zzP8cPoqys+xHTRDWW8L+/BA== -abort-controller@^3.0.0: - version "3.0.0" - resolved "https://registry.npmjs.org/abort-controller/-/abort-controller-3.0.0.tgz" - integrity sha512-h8lQ8tacZYnR3vNQTgibj+tODHI5/+l06Au2Pcriv/Gmet0eaj4TwWH41sO9wnHDiQsEj19q0drzdWdeAHtweg== - dependencies: - event-target-shim "^5.0.0" - -accepts@~1.3.8: - version "1.3.8" - resolved "https://registry.yarnpkg.com/accepts/-/accepts-1.3.8.tgz#0bf0be125b67014adcb0b0921e62db7bffe16b2e" - integrity sha512-PYAthTa2m2VKxuvSD3DPC/Gy+U+sOA1LAuT8mkmRuvw+NACSaeXEQ+NHcVF7rONl6qcaxV3Uuemwawk+7+SJLw== - dependencies: - mime-types "~2.1.34" - negotiator "0.6.3" - acorn-globals@^6.0.0: version "6.0.0" resolved "https://registry.npmjs.org/acorn-globals/-/acorn-globals-6.0.0.tgz" @@ -1079,16 +898,6 @@ ajv@^6.10.0, ajv@^6.12.4: json-schema-traverse "^0.4.1" uri-js "^4.2.2" -amqplib@^0.10.3: - version "0.10.3" - resolved "https://registry.npmjs.org/amqplib/-/amqplib-0.10.3.tgz" - integrity sha512-UHmuSa7n8vVW/a5HGh2nFPqAEr8+cD4dEZ6u9GjP91nHfr1a54RyAKyra7Sb5NH7NBKOUlyQSMXIp0qAixKexw== - dependencies: - "@acuminous/bitsyntax" "^0.1.2" - buffer-more-ints "~1.0.0" - readable-stream "1.x >=1.1.9" - url-parse 
"~1.5.10" - ansi-escapes@^4.2.1: version "4.3.2" resolved "https://registry.npmjs.org/ansi-escapes/-/ansi-escapes-4.3.2.tgz" @@ -1145,21 +954,11 @@ argparse@^2.0.1: resolved "https://registry.npmjs.org/argparse/-/argparse-2.0.1.tgz" integrity sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q== -array-flatten@1.1.1: - version "1.1.1" - resolved "https://registry.yarnpkg.com/array-flatten/-/array-flatten-1.1.1.tgz#9a5f699051b1e7073328f2a008968b64ea2955d2" - integrity sha512-PCVAQswWemu6UdxsDFFX/+gVeYqKAod3D3UVm91jHwynguOwAvYPhx8nNlM++NqRcK6CxxpUafjmhIdKiHibqg== - array-union@^2.1.0: version "2.1.0" resolved "https://registry.npmjs.org/array-union/-/array-union-2.1.0.tgz" integrity sha512-HGyxoOTYUyCM6stUe6EJgnd4EoewAI7zMdfqO+kGjnlZmBDz/cR5pf8r/cR4Wq60sL/p0IkcjUEEPwS3GFrIyw== -arrify@^2.0.0, arrify@^2.0.1: - version "2.0.1" - resolved "https://registry.npmjs.org/arrify/-/arrify-2.0.1.tgz" - integrity sha512-3duEwti880xqi4eAMN8AyR4a0ByT90zoYdLlevfrvU43vb0YZwZVfxOgxWrLXXXpyugL0hNZc9G6BiB5B3nUug== - async@^3.2.4: version "3.2.4" resolved "https://registry.npmjs.org/async/-/async-3.2.4.tgz" @@ -1236,49 +1035,6 @@ balanced-match@^1.0.0: resolved "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz" integrity sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw== -base64-js@^1.3.0, base64-js@^1.3.1: - version "1.5.1" - resolved "https://registry.npmjs.org/base64-js/-/base64-js-1.5.1.tgz" - integrity sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA== - -big.js@^6.0.0: - version "6.2.1" - resolved "https://registry.npmjs.org/big.js/-/big.js-6.2.1.tgz" - integrity sha512-bCtHMwL9LeDIozFn+oNhhFoq+yQ3BNdnsLSASUxLciOb1vgvpHsIO1dsENiGMgbb4SkP5TrzWzRiLddn8ahVOQ== - -bignumber.js@^9.0.0: - version "9.1.0" - resolved "https://registry.npmjs.org/bignumber.js/-/bignumber.js-9.1.0.tgz" - integrity 
sha512-4LwHK4nfDOraBCtst+wOWIHbu1vhvAPJK8g8nROd4iuc3PSEjWif/qwbkh8jwCJz6yDBvtU4KPynETgrfh7y3A== - -bn.js@5.2.0: - version "5.2.0" - resolved "https://registry.npmjs.org/bn.js/-/bn.js-5.2.0.tgz" - integrity sha512-D7iWRBvnZE8ecXiLj/9wbxH7Tk79fAh8IHaTNq1RWRixsS02W+5qS+iE9yq6RYl0asXx5tw0bLhmT5pIfbSquw== - -bn.js@^4.11.9: - version "4.12.0" - resolved "https://registry.npmjs.org/bn.js/-/bn.js-4.12.0.tgz" - integrity sha512-c98Bf3tPniI+scsdk237ku1Dc3ujXQTSgyiPUDEOe7tRkhrqridvh8klBv0HCEso1OLOYcHuCv/cS6DNxKH+ZA== - -body-parser@1.20.1: - version "1.20.1" - resolved "https://registry.yarnpkg.com/body-parser/-/body-parser-1.20.1.tgz#b1812a8912c195cd371a3ee5e66faa2338a5c668" - integrity sha512-jWi7abTbYwajOytWCQc37VulmWiRae5RyTpaCyDcS5/lMdtwSz5lOpDE67srw/HYe35f1z3fDQw+3txg7gNtWw== - dependencies: - bytes "3.1.2" - content-type "~1.0.4" - debug "2.6.9" - depd "2.0.0" - destroy "1.2.0" - http-errors "2.0.0" - iconv-lite "0.4.24" - on-finished "2.4.1" - qs "6.11.0" - raw-body "2.5.1" - type-is "~1.6.18" - unpipe "1.0.0" - brace-expansion@^1.1.7: version "1.1.11" resolved "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.11.tgz" @@ -1294,11 +1050,6 @@ braces@^3.0.2: dependencies: fill-range "^7.0.1" -brorand@^1.1.0: - version "1.1.0" - resolved "https://registry.npmjs.org/brorand/-/brorand-1.1.0.tgz" - integrity sha512-cKV8tMCEpQs4hK/ik71d6LrPOnpkpGBR0wzxqr68g2m/LB2GxVYQroAjMJZRVM1Y4BCjCKc3vAamxSzOY2RP+w== - browser-process-hrtime@^1.0.0: version "1.0.0" resolved "https://registry.npmjs.org/browser-process-hrtime/-/browser-process-hrtime-1.0.0.tgz" @@ -1328,57 +1079,11 @@ bser@2.1.1: dependencies: node-int64 "^0.4.0" -bson@^4.7.0: - version "4.7.0" - resolved "https://registry.npmjs.org/bson/-/bson-4.7.0.tgz" - integrity sha512-VrlEE4vuiO1WTpfof4VmaVolCVYkYTgB9iWgYNOrVlnifpME/06fhFRmONgBhClD5pFC1t9ZWqFUQEQAzY43bA== - dependencies: - buffer "^5.6.0" - -buffer-equal-constant-time@1.0.1: - version "1.0.1" - resolved 
"https://registry.npmjs.org/buffer-equal-constant-time/-/buffer-equal-constant-time-1.0.1.tgz" - integrity sha512-zRpUiDwd/xk6ADqPMATG8vc9VPrkck7T07OIx0gnjmJAnHnTVXNQG3vfvWNuiZIkwu9KrKdA1iJKfsfTVxE6NA== - buffer-from@^1.0.0: version "1.1.2" resolved "https://registry.npmjs.org/buffer-from/-/buffer-from-1.1.2.tgz" integrity sha512-E+XQCRwSbaaiChtv6k6Dwgc+bx+Bs6vuKJHHl5kox/BaKbhiXzqQOwK4cO22yElGp2OCmjwVhT3HmxgyPGnJfQ== -buffer-more-ints@~1.0.0: - version "1.0.0" - resolved "https://registry.npmjs.org/buffer-more-ints/-/buffer-more-ints-1.0.0.tgz" - integrity sha512-EMetuGFz5SLsT0QTnXzINh4Ksr+oo4i+UGTXEshiGCQWnsgSs7ZhJ8fzlwQ+OzEMs0MpDAMr1hxnblp5a4vcHg== - -buffer@^5.6.0: - version "5.7.1" - resolved "https://registry.npmjs.org/buffer/-/buffer-5.7.1.tgz" - integrity sha512-EHcyIPBQ4BSGlvjB16k5KgAJ27CIsHY/2JBmCRReo48y9rQ3MaUzWX3KVlBa4U7MyX02HdVj0K7C3WaB3ju7FQ== - dependencies: - base64-js "^1.3.1" - ieee754 "^1.1.13" - -buffer@^6.0.3: - version "6.0.3" - resolved "https://registry.npmjs.org/buffer/-/buffer-6.0.3.tgz" - integrity sha512-FTiCpNxtwiZZHEZbcbTIcZjERVICn9yq/pDFkTl95/AxzD1naBctN7YO68riM/gLSDY7sdrMby8hofADYuuqOA== - dependencies: - base64-js "^1.3.1" - ieee754 "^1.2.1" - -bytes@3.1.2: - version "3.1.2" - resolved "https://registry.yarnpkg.com/bytes/-/bytes-3.1.2.tgz#8b0beeb98605adf1b128fa4386403c009e0221a5" - integrity sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg== - -call-bind@^1.0.0: - version "1.0.2" - resolved "https://registry.yarnpkg.com/call-bind/-/call-bind-1.0.2.tgz#b1d4e89e688119c3c9a903ad30abb2f6a919be3c" - integrity sha512-7O+FbCihrB5WGbFYesctwmTKae6rOiIzmz1icreWJ+0aA7LJfuqhEso2T9ncpcFtzMQtzXf2QGGueWJGTYsqrA== - dependencies: - function-bind "^1.1.1" - get-intrinsic "^1.0.2" - callsites@^3.0.0: version "3.1.0" resolved "https://registry.npmjs.org/callsites/-/callsites-3.1.0.tgz" @@ -1440,11 +1145,6 @@ cliui@^7.0.2: strip-ansi "^6.0.0" wrap-ansi "^7.0.0" -cluster-key-slot@1.1.2: - version "1.1.2" - 
resolved "https://registry.yarnpkg.com/cluster-key-slot/-/cluster-key-slot-1.1.2.tgz#88ddaa46906e303b5de30d3153b7d9fe0a0c19ac" - integrity sha512-RMr0FhtfXemyinomL4hrWcYJxmX6deFdCxpJzhDttxgO1+bcCnkk+9drydLVDmAMG7NE6aN/fl4F7ucU/90gAA== - co@^4.6.0: version "4.6.0" resolved "https://registry.npmjs.org/co/-/co-4.6.0.tgz" @@ -1486,43 +1186,21 @@ combined-stream@^1.0.8: dependencies: delayed-stream "~1.0.0" +commander@^10.0.1: + version "10.0.1" + resolved "https://registry.yarnpkg.com/commander/-/commander-10.0.1.tgz#881ee46b4f77d1c1dccc5823433aa39b022cbe06" + integrity sha512-y4Mg2tXshplEbSGzx7amzPwKKOCGuoSRP/CjEdwwk0FOGlUbq6lKuoyDZTNZkmxHdJtp54hdfY/JUrdL7Xfdug== + concat-map@0.0.1: version "0.0.1" resolved "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz" integrity sha512-/Srv4dswyQNBfohGpz9o6Yb3Gz3SrUDqBH5rTuhGR7ahtlbYKnVxw2bCFMRljaA7EXHaXZ8wsHdodFvbkhKmqg== -content-disposition@0.5.4: - version "0.5.4" - resolved "https://registry.yarnpkg.com/content-disposition/-/content-disposition-0.5.4.tgz#8b82b4efac82512a02bb0b1dcec9d2c5e8eb5bfe" - integrity sha512-FveZTNuGw04cxlAiWbzi6zTAL/lhehaWbTtgluJh4/E95DqMwTmha3KZN1aAWA8cFIhHzMZUvLevkw5Rqk+tSQ== - dependencies: - safe-buffer "5.2.1" - -content-type@~1.0.4: - version "1.0.5" - resolved "https://registry.yarnpkg.com/content-type/-/content-type-1.0.5.tgz#8b773162656d1d1086784c8f23a54ce6d73d7918" - integrity sha512-nTjqfcBFEipKdXCv4YDQWCfmcLZKm81ldF0pAopTvyrFGVbcR6P/VAAd5G7N+0tTr8QqiU0tFadD6FK4NtJwOA== - convert-source-map@^1.4.0, convert-source-map@^1.6.0, convert-source-map@^1.7.0: version "1.9.0" resolved "https://registry.npmjs.org/convert-source-map/-/convert-source-map-1.9.0.tgz" integrity sha512-ASFBup0Mz1uyiIjANan1jzLQami9z1PoYSZCiiYW2FczPbenXc45FZdBZLzOT+r6+iciuEModtmCti+hjaAk0A== -cookie-signature@1.0.6: - version "1.0.6" - resolved "https://registry.yarnpkg.com/cookie-signature/-/cookie-signature-1.0.6.tgz#e303a882b342cc3ee8ca513a79999734dab3ae2c" - integrity 
sha512-QADzlaHc8icV8I7vbaJXJwod9HWYp8uCqf1xa4OfNu1T7JVxQIrUgOWtHdNDtPiywmFbiS12VjotIXLrKM3orQ== - -cookie@0.5.0: - version "0.5.0" - resolved "https://registry.yarnpkg.com/cookie/-/cookie-0.5.0.tgz#d1f5d71adec6558c58f389987c366aa47e994f8b" - integrity sha512-YZ3GUyn/o8gfKJlnlX7g7xq4gyO6OSuhGPKaaGssGB2qgDUS0gPgtTvoyZLTt9Ab6dC4hfc9dV5arkvc/OCmrw== - -core-util-is@~1.0.0: - version "1.0.3" - resolved "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.3.tgz" - integrity sha512-ZQBvi1DcpJ4GDqanjucZ2Hj3wEO5pZDS89BWbkcrvdxksJorwUDDZamX9ldFkp9aw2lmBDLgkObEA4DWNJ9FYQ== - create-require@^1.1.0: version "1.1.1" resolved "https://registry.npmjs.org/create-require/-/create-require-1.1.1.tgz" @@ -1568,13 +1246,6 @@ data-urls@^2.0.0: whatwg-mimetype "^2.3.0" whatwg-url "^8.0.0" -debug@2.6.9: - version "2.6.9" - resolved "https://registry.yarnpkg.com/debug/-/debug-2.6.9.tgz#5d128515df134ff327e90a4c93f4e077a536341f" - integrity sha512-bC7ElrdJaJnPbAP+1EotYvqZsb3ecl5wi6Bfi6BJTUcNowp6cvspg0jXznRTKDjm/E7AdgFBVeAPVMNcKGsHMA== - dependencies: - ms "2.0.0" - debug@4, debug@^4.1.0, debug@^4.1.1, debug@^4.3.2, debug@^4.3.4: version "4.3.4" resolved "https://registry.npmjs.org/debug/-/debug-4.3.4.tgz" @@ -1607,21 +1278,6 @@ delayed-stream@~1.0.0: resolved "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz" integrity sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ== -denque@^2.1.0: - version "2.1.0" - resolved "https://registry.npmjs.org/denque/-/denque-2.1.0.tgz" - integrity sha512-HVQE3AAb/pxF8fQAoiqpvg9i3evqug3hoiwakOyZAwJm+6vZehbkYXZ0l4JxS+I3QxM97v5aaRNhj8v5oBhekw== - -depd@2.0.0: - version "2.0.0" - resolved "https://registry.yarnpkg.com/depd/-/depd-2.0.0.tgz#b696163cc757560d09cf22cc8fad1571b79e76df" - integrity sha512-g7nH6P6dyDioJogAAGprGpCtVImJhpPk/roCzdb3fIh61/s/nPsfR6onyMwkCAR/OlC3yBC0lESvUoQEAssIrw== - -destroy@1.2.0: - version "1.2.0" - resolved 
"https://registry.yarnpkg.com/destroy/-/destroy-1.2.0.tgz#4803735509ad8be552934c67df614f94e66fa015" - integrity sha512-2sJGJTaXIIaR1w4iJSNoN0hnMY7Gpc/n8D4qSCJw8QqFWXf7cuAgnEHxBpweaVcPevC2l3KpjYCx3NypQQgaJg== - detect-newline@^3.0.0: version "3.1.0" resolved "https://registry.npmjs.org/detect-newline/-/detect-newline-3.1.0.tgz" @@ -1658,46 +1314,11 @@ domexception@^2.0.1: dependencies: webidl-conversions "^5.0.0" -duplexify@^4.0.0, duplexify@^4.1.1: - version "4.1.2" - resolved "https://registry.npmjs.org/duplexify/-/duplexify-4.1.2.tgz" - integrity sha512-fz3OjcNCHmRP12MJoZMPglx8m4rrFP8rovnk4vT8Fs+aonZoCwGg10dSsQsfP/E62eZcPTMSMP6686fu9Qlqtw== - dependencies: - end-of-stream "^1.4.1" - inherits "^2.0.3" - readable-stream "^3.1.1" - stream-shift "^1.0.0" - -ecdsa-sig-formatter@1.0.11, ecdsa-sig-formatter@^1.0.11: - version "1.0.11" - resolved "https://registry.npmjs.org/ecdsa-sig-formatter/-/ecdsa-sig-formatter-1.0.11.tgz" - integrity sha512-nagl3RYrbNv6kQkeJIpt6NJZy8twLB/2vtz6yN9Z4vRKHN4/QZJIEbqohALSgwKdnksuY3k5Addp5lg8sVoVcQ== - dependencies: - safe-buffer "^5.0.1" - -ee-first@1.1.1: - version "1.1.1" - resolved "https://registry.yarnpkg.com/ee-first/-/ee-first-1.1.1.tgz#590c61156b0ae2f4f0255732a158b266bc56b21d" - integrity sha512-WMwm9LhRUo+WUaRN+vRuETqG89IgZphVSNkdFgeb6sS/E4OrDIN7t48CAewSHXc6C8lefD8KKfr5vY61brQlow== - electron-to-chromium@^1.4.251: version "1.4.284" resolved "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.4.284.tgz" integrity sha512-M8WEXFuKXMYMVr45fo8mq0wUrrJHheiKZf6BArTKk9ZBYCKJEOU5H8cdWgDT+qCVZf7Na4lVUaZsA+h6uA9+PA== -elliptic@6.5.4: - version "6.5.4" - resolved "https://registry.npmjs.org/elliptic/-/elliptic-6.5.4.tgz" - integrity sha512-iLhC6ULemrljPZb+QutR5TQGB+pdW6KGD5RSegS+8sorOZT+rdQFbsQFJgvN3eRqNALqJer4oQ16YvJHlU8hzQ== - dependencies: - bn.js "^4.11.9" - brorand "^1.1.0" - hash.js "^1.0.0" - hmac-drbg "^1.0.1" - inherits "^2.0.4" - minimalistic-assert "^1.0.1" - minimalistic-crypto-utils "^1.0.1" - 
emittery@^0.8.1: version "0.8.1" resolved "https://registry.npmjs.org/emittery/-/emittery-0.8.1.tgz" @@ -1708,33 +1329,6 @@ emoji-regex@^8.0.0: resolved "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz" integrity sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A== -encodeurl@~1.0.2: - version "1.0.2" - resolved "https://registry.yarnpkg.com/encodeurl/-/encodeurl-1.0.2.tgz#ad3ff4c86ec2d029322f5a02c3a9a606c95b3f59" - integrity sha512-TPJXq8JqFaVYm2CWmPvnP2Iyo4ZSM7/QKcSmuMLDObfpH5fi7RUGmd/rTDf+rut/saiDiQEeVTNgAmJEdAOx0w== - -end-of-stream@^1.4.1: - version "1.4.4" - resolved "https://registry.npmjs.org/end-of-stream/-/end-of-stream-1.4.4.tgz" - integrity sha512-+uw1inIHVPQoaVuHzRyXd21icM+cnt4CzD5rW+NC1wjOUSTOs+Te7FOv7AhN7vS9x/oIyhLP5PR1H+phQAHu5Q== - dependencies: - once "^1.4.0" - -ent@^2.2.0: - version "2.2.0" - resolved "https://registry.npmjs.org/ent/-/ent-2.2.0.tgz" - integrity sha512-GHrMyVZQWvTIdDtpiEXdHZnFQKzeO09apj8Cbl4pKWy4i0Oprcq17usfDt5aO63swf0JOeMWjWQE/LzgSRuWpA== - -eosjs@^22.1.0: - version "22.1.0" - resolved "https://registry.npmjs.org/eosjs/-/eosjs-22.1.0.tgz" - integrity sha512-Ka8KO7akC3RxNdSg/3dkGWuUWUQESTzSUzQljBdVP16UG548vmQoBqSGnZdnjlZyfcab8VOu2iEt+JjyfYc5+A== - dependencies: - bn.js "5.2.0" - elliptic "6.5.4" - hash.js "1.1.7" - pako "2.0.3" - error-ex@^1.3.1: version "1.3.2" resolved "https://registry.npmjs.org/error-ex/-/error-ex-1.3.2.tgz" @@ -1747,11 +1341,6 @@ escalade@^3.1.1: resolved "https://registry.npmjs.org/escalade/-/escalade-3.1.1.tgz" integrity sha512-k0er2gUkLf8O0zKJiAhmkTnJlTvINGv7ygDNPbeIsX/TJjGJZHuh9B2UxbsaEkmlEo9MfhrSzmhIlhRlI2GXnw== -escape-html@~1.0.3: - version "1.0.3" - resolved "https://registry.yarnpkg.com/escape-html/-/escape-html-1.0.3.tgz#0258eae4d3d0c0974de1c169188ef0051d1d1988" - integrity sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow== - escape-string-regexp@^1.0.5: version "1.0.5" resolved 
"https://registry.npmjs.org/escape-string-regexp/-/escape-string-regexp-1.0.5.tgz" @@ -1912,21 +1501,6 @@ esutils@^2.0.2: resolved "https://registry.npmjs.org/esutils/-/esutils-2.0.3.tgz" integrity sha512-kVscqXk4OCp68SZ0dkgEKVi6/8ij300KBWTJq32P/dYeWTSwK41WyTxalN1eRmA5Z9UU/LX9D7FWSmV9SAYx6g== -etag@~1.8.1: - version "1.8.1" - resolved "https://registry.yarnpkg.com/etag/-/etag-1.8.1.tgz#41ae2eeb65efa62268aebfea83ac7d79299b0887" - integrity sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg== - -event-target-shim@^5.0.0: - version "5.0.1" - resolved "https://registry.npmjs.org/event-target-shim/-/event-target-shim-5.0.1.tgz" - integrity sha512-i/2XbnSz/uxRCU6+NdVJgKWDTM427+MqYbkQzD321DuCQJUqOuJKIA0IM2+W2xtYHdKOmZ4dR6fExsd4SXL+WQ== - -events@^3.3.0: - version "3.3.0" - resolved "https://registry.npmjs.org/events/-/events-3.3.0.tgz" - integrity sha512-mQw+2fkQbALzQ7V0MY0IqdnXNOeTtP4r0lN9z7AAawCXgqea7bDii20AYrIBrFd/Hx0M2Ocz6S111CaFkUcb0Q== - execa@^5.0.0: version "5.1.1" resolved "https://registry.npmjs.org/execa/-/execa-5.1.1.tgz" @@ -1957,48 +1531,6 @@ expect@^27.5.1: jest-matcher-utils "^27.5.1" jest-message-util "^27.5.1" -express@^4.18.2: - version "4.18.2" - resolved "https://registry.yarnpkg.com/express/-/express-4.18.2.tgz#3fabe08296e930c796c19e3c516979386ba9fd59" - integrity sha512-5/PsL6iGPdfQ/lKM1UuielYgv3BUoJfz1aUwU9vHZ+J7gyvwdQXFEBIEIaxeGf0GIcreATNyBExtalisDbuMqQ== - dependencies: - accepts "~1.3.8" - array-flatten "1.1.1" - body-parser "1.20.1" - content-disposition "0.5.4" - content-type "~1.0.4" - cookie "0.5.0" - cookie-signature "1.0.6" - debug "2.6.9" - depd "2.0.0" - encodeurl "~1.0.2" - escape-html "~1.0.3" - etag "~1.8.1" - finalhandler "1.2.0" - fresh "0.5.2" - http-errors "2.0.0" - merge-descriptors "1.0.1" - methods "~1.1.2" - on-finished "2.4.1" - parseurl "~1.3.3" - path-to-regexp "0.1.7" - proxy-addr "~2.0.7" - qs "6.11.0" - range-parser "~1.2.1" - safe-buffer "5.2.1" - send "0.18.0" - serve-static 
"1.15.0" - setprototypeof "1.2.0" - statuses "2.0.1" - type-is "~1.6.18" - utils-merge "1.0.1" - vary "~1.1.2" - -extend@^3.0.2: - version "3.0.2" - resolved "https://registry.npmjs.org/extend/-/extend-3.0.2.tgz" - integrity sha512-fjquC59cD7CyW6urNXK0FBufkZcoiGG80wTuPujX590cB5Ttln20E2UB4S/WARVqhXffZl2LNgS+gQdPIIim/g== - fast-deep-equal@^3.1.1, fast-deep-equal@^3.1.3: version "3.1.3" resolved "https://registry.npmjs.org/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz" @@ -2030,11 +1562,6 @@ fast-levenshtein@^2.0.6, fast-levenshtein@~2.0.6: resolved "https://registry.npmjs.org/fast-levenshtein/-/fast-levenshtein-2.0.6.tgz" integrity sha512-DCXu6Ifhqcks7TZKY3Hxp3y6qphY5SJZmrWMDrKcERSOXWQdMhU9Ig/PYrzyw/ul9jOIyh0N4M0tbC5hodg8dw== -fast-text-encoding@^1.0.0: - version "1.0.6" - resolved "https://registry.npmjs.org/fast-text-encoding/-/fast-text-encoding-1.0.6.tgz" - integrity sha512-VhXlQgj9ioXCqGstD37E/HBeqEGV/qOD/kmbVG8h5xKBYvM1L3lR1Zn4555cQ8GkYbJa8aJSipLPndE1k6zK2w== - fastq@^1.6.0: version "1.13.0" resolved "https://registry.npmjs.org/fastq/-/fastq-1.13.0.tgz" @@ -2063,19 +1590,6 @@ fill-range@^7.0.1: dependencies: to-regex-range "^5.0.1" -finalhandler@1.2.0: - version "1.2.0" - resolved "https://registry.yarnpkg.com/finalhandler/-/finalhandler-1.2.0.tgz#7d23fe5731b207b4640e4fcd00aec1f9207a7b32" - integrity sha512-5uXcUVftlQMFnWC9qu/svkWv3GTd2PfUhK/3PLkYNAe7FbqJMt3515HaxE6eRL74GdsriiwujiawdaB1BpEISg== - dependencies: - debug "2.6.9" - encodeurl "~1.0.2" - escape-html "~1.0.3" - on-finished "2.4.1" - parseurl "~1.3.3" - statuses "2.0.1" - unpipe "~1.0.0" - find-up@^4.0.0, find-up@^4.1.0: version "4.1.0" resolved "https://registry.npmjs.org/find-up/-/find-up-4.1.0.tgz" @@ -2114,16 +1628,6 @@ form-data@^3.0.0: combined-stream "^1.0.8" mime-types "^2.1.12" -forwarded@0.2.0: - version "0.2.0" - resolved "https://registry.yarnpkg.com/forwarded/-/forwarded-0.2.0.tgz#2269936428aad4c15c7ebe9779a84bf0b2a81811" - integrity 
sha512-buRG0fpBtRHSTCOASe6hD258tEubFoRLb4ZNA6NxMVHNw2gOcwHo9wyablzMzOA5z9xA9L1KNjk/Nt6MT9aYow== - -fresh@0.5.2: - version "0.5.2" - resolved "https://registry.yarnpkg.com/fresh/-/fresh-0.5.2.tgz#3d8cadd90d976569fa835ab1f8e4b23a105605a7" - integrity sha512-zJ2mQYM18rEFOudeV4GShTGIQ7RbzA7ozbU9I/XBpm7kqgMywgmylMwXHxZJmkVoYkna9d2pVXVXPdYTP9ej8Q== - fs.realpath@^1.0.0: version "1.0.0" resolved "https://registry.npmjs.org/fs.realpath/-/fs.realpath-1.0.0.tgz" @@ -2139,29 +1643,6 @@ function-bind@^1.1.1: resolved "https://registry.npmjs.org/function-bind/-/function-bind-1.1.1.tgz" integrity sha512-yIovAzMX49sF8Yl58fSCWJ5svSLuaibPxXQJFLmBObTuCr0Mf1KiPopGM9NiFjiYBCbfaa2Fh6breQ6ANVTI0A== -gaxios@^5.0.0, gaxios@^5.0.1: - version "5.0.2" - resolved "https://registry.npmjs.org/gaxios/-/gaxios-5.0.2.tgz" - integrity sha512-TjtV2AJOZoMQqRYoy5eM8cCQogYwazWNYLQ72QB0kwa6vHHruYkGmhhyrlzbmgNHK1dNnuP2WSH81urfzyN2Og== - dependencies: - extend "^3.0.2" - https-proxy-agent "^5.0.0" - is-stream "^2.0.0" - node-fetch "^2.6.7" - -gcp-metadata@^5.0.0: - version "5.0.1" - resolved "https://registry.npmjs.org/gcp-metadata/-/gcp-metadata-5.0.1.tgz" - integrity sha512-jiRJ+Fk7e8FH68Z6TLaqwea307OktJpDjmYnU7/li6ziwvVvU2RlrCyQo5vkdeP94chm0kcSCOOszvmuaioq3g== - dependencies: - gaxios "^5.0.0" - json-bigint "^1.0.0" - -generic-pool@3.9.0: - version "3.9.0" - resolved "https://registry.yarnpkg.com/generic-pool/-/generic-pool-3.9.0.tgz#36f4a678e963f4fdb8707eab050823abc4e8f5e4" - integrity sha512-hymDOu5B53XvN4QT9dBmZxPX4CWhBPPLguTZ9MMFeFa/Kg0xWVfylOVNlJji/E7yTZWFd/q9GO5TxDLq156D7g== - gensync@^1.0.0-beta.2: version "1.0.0-beta.2" resolved "https://registry.npmjs.org/gensync/-/gensync-1.0.0-beta.2.tgz" @@ -2172,15 +1653,6 @@ get-caller-file@^2.0.5: resolved "https://registry.npmjs.org/get-caller-file/-/get-caller-file-2.0.5.tgz" integrity sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg== -get-intrinsic@^1.0.2: - version "1.2.0" - resolved 
"https://registry.yarnpkg.com/get-intrinsic/-/get-intrinsic-1.2.0.tgz#7ad1dc0535f3a2904bba075772763e5051f6d05f" - integrity sha512-L049y6nFOuom5wGyRc3/gdTLO94dySVKRACj1RmJZBQXlbTMhtNIgkWkUHq+jYmZvKf14EW1EoJnnjbmoHij0Q== - dependencies: - function-bind "^1.1.1" - has "^1.0.3" - has-symbols "^1.0.3" - get-package-type@^0.1.0: version "0.1.0" resolved "https://registry.npmjs.org/get-package-type/-/get-package-type-0.1.0.tgz" @@ -2241,28 +1713,6 @@ globby@^11.1.0: merge2 "^1.4.1" slash "^3.0.0" -google-auth-library@^8.0.2: - version "8.7.0" - resolved "https://registry.npmjs.org/google-auth-library/-/google-auth-library-8.7.0.tgz" - integrity sha512-1M0NG5VDIvJZEnstHbRdckLZESoJwguinwN8Dhae0j2ZKIQFIV63zxm6Fo6nM4xkgqUr2bbMtV5Dgo+Hy6oo0Q== - dependencies: - arrify "^2.0.0" - base64-js "^1.3.0" - ecdsa-sig-formatter "^1.0.11" - fast-text-encoding "^1.0.0" - gaxios "^5.0.0" - gcp-metadata "^5.0.0" - gtoken "^6.1.0" - jws "^4.0.0" - lru-cache "^6.0.0" - -google-p12-pem@^4.0.0: - version "4.0.1" - resolved "https://registry.npmjs.org/google-p12-pem/-/google-p12-pem-4.0.1.tgz" - integrity sha512-WPkN4yGtz05WZ5EhtlxNDWPhC4JIic6G8ePitwUWy4l+XPVYec+a0j0Ts47PDtW59y3RwAhUd9/h9ZZ63px6RQ== - dependencies: - node-forge "^1.3.1" - graceful-fs@^4.2.9: version "4.2.10" resolved "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.10.tgz" @@ -2273,15 +1723,6 @@ grapheme-splitter@^1.0.4: resolved "https://registry.npmjs.org/grapheme-splitter/-/grapheme-splitter-1.0.4.tgz" integrity sha512-bzh50DW9kTPM00T8y4o8vQg89Di9oLJVLW/KaOGIXJWP/iqCN6WKYkbNOF04vFLJhwcpYUh9ydh/+5vpOqV4YQ== -gtoken@^6.1.0: - version "6.1.2" - resolved "https://registry.npmjs.org/gtoken/-/gtoken-6.1.2.tgz" - integrity sha512-4ccGpzz7YAr7lxrT2neugmXQ3hP9ho2gcaityLVkiUecAiwiy60Ii8gRbZeOsXV19fYaRjgBSshs8kXw+NKCPQ== - dependencies: - gaxios "^5.0.1" - google-p12-pem "^4.0.0" - jws "^4.0.0" - has-flag@^3.0.0: version "3.0.0" resolved "https://registry.npmjs.org/has-flag/-/has-flag-3.0.0.tgz" @@ -2292,11 +1733,6 @@ 
has-flag@^4.0.0: resolved "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz" integrity sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ== -has-symbols@^1.0.3: - version "1.0.3" - resolved "https://registry.yarnpkg.com/has-symbols/-/has-symbols-1.0.3.tgz#bb7b2c4349251dce87b125f7bdf874aa7c8b39f8" - integrity sha512-l3LCuF6MgDNwTDKkdYGEihYjt5pRPbEg46rtlmnSPlUbgmB8LOIrKJbYYFBSbnPaJexMKtiPO8hmeRjRz2Td+A== - has@^1.0.3: version "1.0.3" resolved "https://registry.npmjs.org/has/-/has-1.0.3.tgz" @@ -2304,23 +1740,6 @@ has@^1.0.3: dependencies: function-bind "^1.1.1" -hash.js@1.1.7, hash.js@^1.0.0, hash.js@^1.0.3: - version "1.1.7" - resolved "https://registry.npmjs.org/hash.js/-/hash.js-1.1.7.tgz" - integrity sha512-taOaskGt4z4SOANNseOviYDvjEJinIkRgmp7LbKP2YTTmVxWBl87s/uzK9r+44BclBSp2X7K1hqeNfz9JbBeXA== - dependencies: - inherits "^2.0.3" - minimalistic-assert "^1.0.1" - -hmac-drbg@^1.0.1: - version "1.0.1" - resolved "https://registry.npmjs.org/hmac-drbg/-/hmac-drbg-1.0.1.tgz" - integrity sha512-Tti3gMqLdZfhOQY1Mzf/AanLiqh1WTiJgEj26ZuYQ9fbkLomzGchCws4FyrSd4VkpBfiNhaE1On+lOz894jvXg== - dependencies: - hash.js "^1.0.3" - minimalistic-assert "^1.0.0" - minimalistic-crypto-utils "^1.0.1" - html-encoding-sniffer@^2.0.1: version "2.0.1" resolved "https://registry.npmjs.org/html-encoding-sniffer/-/html-encoding-sniffer-2.0.1.tgz" @@ -2333,17 +1752,6 @@ html-escaper@^2.0.0: resolved "https://registry.npmjs.org/html-escaper/-/html-escaper-2.0.2.tgz" integrity sha512-H2iMtd0I4Mt5eYiapRdIDjp+XzelXQ0tFE4JS7YFwFevXXMmOp9myNrUvCg0D6ws8iqkRPBfKHgbwig1SmlLfg== -http-errors@2.0.0: - version "2.0.0" - resolved "https://registry.yarnpkg.com/http-errors/-/http-errors-2.0.0.tgz#b7774a1486ef73cf7667ac9ae0858c012c57b9d3" - integrity sha512-FtwrG/euBzaEjYeRqOgly7G0qviiXoJWnvEH2Z1plBdXgbyjv34pHTSb9zoeHMyDy33+DWy5Wt9Wo+TURtOYSQ== - dependencies: - depd "2.0.0" - inherits "2.0.4" - setprototypeof "1.2.0" - statuses "2.0.1" - toidentifier "1.0.1" - 
http-proxy-agent@^4.0.1: version "4.0.1" resolved "https://registry.npmjs.org/http-proxy-agent/-/http-proxy-agent-4.0.1.tgz" @@ -2353,15 +1761,6 @@ http-proxy-agent@^4.0.1: agent-base "6" debug "4" -http-proxy-agent@^5.0.0: - version "5.0.0" - resolved "https://registry.npmjs.org/http-proxy-agent/-/http-proxy-agent-5.0.0.tgz" - integrity sha512-n2hY8YdoRE1i7r6M0w9DIw5GgZN0G25P8zLCRQ8rjXtTU3vsNFBI/vWK/UIeE6g5MUUz6avwAPXmL6Fy9D/90w== - dependencies: - "@tootallnate/once" "2" - agent-base "6" - debug "4" - https-proxy-agent@^5.0.0: version "5.0.1" resolved "https://registry.npmjs.org/https-proxy-agent/-/https-proxy-agent-5.0.1.tgz" @@ -2382,11 +1781,6 @@ iconv-lite@0.4.24: dependencies: safer-buffer ">= 2.1.2 < 3" -ieee754@^1.1.13, ieee754@^1.2.1: - version "1.2.1" - resolved "https://registry.npmjs.org/ieee754/-/ieee754-1.2.1.tgz" - integrity sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA== - ignore@^5.2.0: version "5.2.0" resolved "https://registry.npmjs.org/ignore/-/ignore-5.2.0.tgz" @@ -2421,26 +1815,16 @@ inflight@^1.0.4: once "^1.3.0" wrappy "1" -inherits@2, inherits@2.0.4, inherits@^2.0.3, inherits@^2.0.4, inherits@~2.0.1: +inherits@2: version "2.0.4" resolved "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz" integrity sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ== inversify@^6.0.1: version "6.0.1" - resolved "https://registry.npmjs.org/inversify/-/inversify-6.0.1.tgz" + resolved "https://registry.yarnpkg.com/inversify/-/inversify-6.0.1.tgz#b20d35425d5d8c5cd156120237aad0008d969f02" integrity sha512-B3ex30927698TJENHR++8FfEaJGqoWOgI6ZY5Ht/nLUsFCwHn6akbwtnUAPCgUepAnTpe2qHxhDNjoKLyz6rgQ== -ip@^2.0.0: - version "2.0.0" - resolved "https://registry.npmjs.org/ip/-/ip-2.0.0.tgz" - integrity sha512-WKa+XuLG1A1R0UWhl2+1XQSi+fZWMsYKffMZTTYsiZaUD8k2yDAj5atimTUD2TZkyCkNEeYE5NhFZmupOGtjYQ== - -ipaddr.js@1.9.1: - version "1.9.1" - resolved 
"https://registry.yarnpkg.com/ipaddr.js/-/ipaddr.js-1.9.1.tgz#bff38543eeb8984825079ff3a2a8e6cbd46781b3" - integrity sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g== - is-arrayish@^0.2.1: version "0.2.1" resolved "https://registry.npmjs.org/is-arrayish/-/is-arrayish-0.2.1.tgz" @@ -2500,16 +1884,6 @@ is-typedarray@^1.0.0: resolved "https://registry.npmjs.org/is-typedarray/-/is-typedarray-1.0.0.tgz" integrity sha512-cyA56iCMHAh5CdzjJIa4aohJyeO1YbwLi3Jc35MmRU6poroFjIGZzUzupGiRPOjgHg9TLu43xbpwXk523fMxKA== -is@^3.3.0: - version "3.3.0" - resolved "https://registry.npmjs.org/is/-/is-3.3.0.tgz" - integrity sha512-nW24QBoPcFGGHJGUwnfpI7Yc5CdqWNdsyHQszVE/z2pKHXzh7FZ5GWhJqSyaQ9wMkQnsTx+kAI8bHlCX4tKdbg== - -isarray@0.0.1: - version "0.0.1" - resolved "https://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz" - integrity sha512-D2S+3GLxWH+uhrNEcoh/fnmYeP8E8/zHl644d/jdA0g2uyXvy3sb0qxotE+ne0LtccHknQzWwZEzhak7oJ0COQ== - isexe@^2.0.0: version "2.0.0" resolved "https://registry.npmjs.org/isexe/-/isexe-2.0.0.tgz" @@ -3025,13 +2399,6 @@ jsesc@^2.5.1: resolved "https://registry.npmjs.org/jsesc/-/jsesc-2.5.2.tgz" integrity sha512-OYu7XEzjkCQ3C5Ps3QIZsQfNpqoJyZZA99wd9aWd05NCtC5pWOkShK2mkL6HXQR6/Cy2lbNdPlZBpuQHXE63gA== -json-bigint@^1.0.0: - version "1.0.0" - resolved "https://registry.npmjs.org/json-bigint/-/json-bigint-1.0.0.tgz" - integrity sha512-SiPv/8VpZuWbvLSMtTDU8hEfrZWg/mH/nV/b4o0CYbSxu1UIQPLdwKOCIyLQX+VIPO5vrLX3i8qtqFyhdPSUSQ== - dependencies: - bignumber.js "^9.0.0" - json-parse-even-better-errors@^2.3.0: version "2.3.1" resolved "https://registry.npmjs.org/json-parse-even-better-errors/-/json-parse-even-better-errors-2.3.1.tgz" @@ -3052,23 +2419,6 @@ json5@2.x, json5@^2.2.1: resolved "https://registry.npmjs.org/json5/-/json5-2.2.1.tgz" integrity sha512-1hqLFMSrGHRHxav9q9gNjJ5EXznIxGVO09xQRrwplcS8qs28pZ8s8hupZAmqDwZUmVZ2Qb2jnyPOWcDH8m8dlA== -jwa@^2.0.0: - version "2.0.0" - resolved "https://registry.npmjs.org/jwa/-/jwa-2.0.0.tgz" - 
integrity sha512-jrZ2Qx916EA+fq9cEAeCROWPTfCwi1IVHqT2tapuqLEVVDKFDENFw1oL+MwrTvH6msKxsd1YTDVw6uKEcsrLEA== - dependencies: - buffer-equal-constant-time "1.0.1" - ecdsa-sig-formatter "1.0.11" - safe-buffer "^5.0.1" - -jws@^4.0.0: - version "4.0.0" - resolved "https://registry.npmjs.org/jws/-/jws-4.0.0.tgz" - integrity sha512-KDncfTmOZoOMTFG4mBlG0qUIOlc03fmzH+ru6RgYVZhPkyiy/92Owlt/8UEN+a4TXR1FQetfIpJE8ApdvdVxTg== - dependencies: - jwa "^2.0.0" - safe-buffer "^5.0.1" - kleur@^3.0.3: version "3.0.3" resolved "https://registry.npmjs.org/kleur/-/kleur-3.0.3.tgz" @@ -3155,21 +2505,6 @@ makeerror@1.0.12: dependencies: tmpl "1.0.5" -media-typer@0.3.0: - version "0.3.0" - resolved "https://registry.yarnpkg.com/media-typer/-/media-typer-0.3.0.tgz#8710d7af0aa626f8fffa1ce00168545263255748" - integrity sha512-dq+qelQ9akHpcOl/gUVRTxVIOkAJ1wR3QAvb4RsVjS8oVoFjDGTc679wJYmUmknUF5HwMLOgb5O+a3KxfWapPQ== - -memory-pager@^1.0.2: - version "1.5.0" - resolved "https://registry.npmjs.org/memory-pager/-/memory-pager-1.5.0.tgz" - integrity sha512-ZS4Bp4r/Zoeq6+NLJpP+0Zzm0pR8whtGPf1XExKLJBAczGMnSi3It14OiNCStjQjM6NU1okjQGSxgEZN8eBYKg== - -merge-descriptors@1.0.1: - version "1.0.1" - resolved "https://registry.yarnpkg.com/merge-descriptors/-/merge-descriptors-1.0.1.tgz#b00aaa556dd8b44568150ec9d1b953f3f90cbb61" - integrity sha512-cCi6g3/Zr1iqQi6ySbseM1Xvooa98N0w31jzUYrXPX2xqObmFGHJ0tQ5u74H3mVh7wLouTseZyYIq39g8cNp1w== - merge-stream@^2.0.0: version "2.0.0" resolved "https://registry.npmjs.org/merge-stream/-/merge-stream-2.0.0.tgz" @@ -3180,11 +2515,6 @@ merge2@^1.3.0, merge2@^1.4.1: resolved "https://registry.npmjs.org/merge2/-/merge2-1.4.1.tgz" integrity sha512-8q7VEgMJW4J8tcfVPy8g09NcQwZdbwFEqhe/WZkoIzjn/3TGDwtOCYtXGxA3O8tPzpczCCDgv+P2P5y00ZJOOg== -methods@~1.1.2: - version "1.1.2" - resolved "https://registry.yarnpkg.com/methods/-/methods-1.1.2.tgz#5529a4d67654134edcc5266656835b0f851afcee" - integrity sha512-iclAHeNqNm68zFtnZ0e+1L2yUIdvzNoauKU4WBA3VvH/vPFieF7qfRlwUZU+DA9P9bPXIS90ulxoUoCH23sV2w== 
- micromatch@^4.0.4: version "4.0.5" resolved "https://registry.npmjs.org/micromatch/-/micromatch-4.0.5.tgz" @@ -3198,33 +2528,18 @@ mime-db@1.52.0: resolved "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz" integrity sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg== -mime-types@^2.1.12, mime-types@~2.1.24, mime-types@~2.1.34: +mime-types@^2.1.12: version "2.1.35" resolved "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz" integrity sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw== dependencies: mime-db "1.52.0" -mime@1.6.0: - version "1.6.0" - resolved "https://registry.yarnpkg.com/mime/-/mime-1.6.0.tgz#32cd9e5c64553bd58d19a568af452acff04981b1" - integrity sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg== - mimic-fn@^2.1.0: version "2.1.0" resolved "https://registry.npmjs.org/mimic-fn/-/mimic-fn-2.1.0.tgz" integrity sha512-OqbOk5oEQeAZ8WXWydlu9HJjz9WVdEIvamMCcXmuqUYjTknH/sqsWvhQ3vgwKFRR1HpjvNBKQ37nbJgYzGqGcg== -minimalistic-assert@^1.0.0, minimalistic-assert@^1.0.1: - version "1.0.1" - resolved "https://registry.npmjs.org/minimalistic-assert/-/minimalistic-assert-1.0.1.tgz" - integrity sha512-UtJcAD4yEaGtjPezWuO9wC4nwUnVH/8/Im3yEHQP4b67cXlD/Qr9hdITCU1xDbSEXg2XKNaP8jsReV7vQd00/A== - -minimalistic-crypto-utils@^1.0.1: - version "1.0.1" - resolved "https://registry.npmjs.org/minimalistic-crypto-utils/-/minimalistic-crypto-utils-1.0.1.tgz" - integrity sha512-JIYlbt6g8i5jKfJ3xz7rF0LXmv2TkDxBLUkiBeZ7bAx4GnnNMr8xFpGnOxn6GhTEHx3SjRrZEoU+j04prX1ktg== - minimatch@^3.0.4, minimatch@^3.1.1, minimatch@^3.1.2: version "3.1.2" resolved "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz" @@ -3232,80 +2547,28 @@ minimatch@^3.0.4, minimatch@^3.1.1, minimatch@^3.1.2: dependencies: brace-expansion "^1.1.7" -mongodb-connection-string-url@^2.5.3: - version "2.6.0" - resolved 
"https://registry.npmjs.org/mongodb-connection-string-url/-/mongodb-connection-string-url-2.6.0.tgz" - integrity sha512-WvTZlI9ab0QYtTYnuMLgobULWhokRjtC7db9LtcVfJ+Hsnyr5eo6ZtNAt3Ly24XZScGMelOcGtm7lSn0332tPQ== - dependencies: - "@types/whatwg-url" "^8.2.1" - whatwg-url "^11.0.0" - -mongodb@4.9.1: - version "4.9.1" - resolved "https://registry.npmjs.org/mongodb/-/mongodb-4.9.1.tgz" - integrity sha512-ZhgI/qBf84fD7sI4waZBoLBNJYPQN5IOC++SBCiPiyhzpNKOxN/fi0tBHvH2dEC42HXtNEbFB0zmNz4+oVtorQ== - dependencies: - bson "^4.7.0" - denque "^2.1.0" - mongodb-connection-string-url "^2.5.3" - socks "^2.7.0" - optionalDependencies: - saslprep "^1.0.3" - -ms@2.0.0: - version "2.0.0" - resolved "https://registry.yarnpkg.com/ms/-/ms-2.0.0.tgz#5608aeadfc00be6c2901df5f9861788de0d597c8" - integrity sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A== - ms@2.1.2: version "2.1.2" resolved "https://registry.npmjs.org/ms/-/ms-2.1.2.tgz" integrity sha512-sGkPx+VjMtmA6MX27oA4FBFELFCZZ4S4XqeGOXCv68tT+jb3vk/RyaKWP0PTKyWtmLSM0b+adUTEvbs1PEaH2w== -ms@2.1.3: - version "2.1.3" - resolved "https://registry.yarnpkg.com/ms/-/ms-2.1.3.tgz#574c8138ce1d2b5861f0b44579dbadd60c6615b2" - integrity sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA== - nanoid@^3.0.0: - version "3.3.4" - resolved "https://registry.npmjs.org/nanoid/-/nanoid-3.3.4.tgz" - integrity sha512-MqBkQh/OHTS2egovRtLk45wEyNXwF+cokD+1YPf9u5VfJiRdAiRwB2froX5Co9Rh20xs4siNPm8naNotSD6RBw== + version "3.3.6" + resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.3.6.tgz#443380c856d6e9f9824267d960b4236ad583ea4c" + integrity sha512-BGcqMMJuToF7i1rt+2PWSNVnWIkGCU78jBG3RxO/bZlnZPK2Cmi2QaffxGO/2RvWi9sL+FAiRiXMgsyxQ1DIDA== natural-compare@^1.4.0: version "1.4.0" resolved "https://registry.npmjs.org/natural-compare/-/natural-compare-1.4.0.tgz" integrity sha512-OWND8ei3VtNC9h7V60qff3SVobHr996CTwgxubgyQYEpg290h9J0buyECNNJexkFm5sOajh5G116RYA1c8ZMSw== 
-negotiator@0.6.3: - version "0.6.3" - resolved "https://registry.yarnpkg.com/negotiator/-/negotiator-0.6.3.tgz#58e323a72fedc0d6f9cd4d31fe49f51479590ccd" - integrity sha512-+EUsqGPLsM+j/zdChZjsnX51g4XrHFOIXwfnCVPGlQk/k5giakcKsuxCObBRu6DSm9opw/O6slWbJdghQM4bBg== - -node-addon-api@2.0.0: - version "2.0.0" - resolved "https://registry.npmjs.org/node-addon-api/-/node-addon-api-2.0.0.tgz" - integrity sha512-ASCL5U13as7HhOExbT6OlWJJUV/lLzL2voOSP1UVehpRD8FbSrSDjfScK/KwAvVTI5AS6r4VwbOMlIqtvRidnA== - -node-fetch@2, node-fetch@^2.6.1, node-fetch@^2.6.7: - version "2.6.7" - resolved "https://registry.npmjs.org/node-fetch/-/node-fetch-2.6.7.tgz" - integrity sha512-ZjMPFEfVx5j+y2yF35Kzx5sF7kDzxuDj6ziH4FFbOp87zKDZNx8yExJIb05OGF4Nlt9IHFIMBkRl41VdvcNdbQ== - dependencies: - whatwg-url "^5.0.0" - node-fetch@2.6.6: version "2.6.6" - resolved "https://registry.npmjs.org/node-fetch/-/node-fetch-2.6.6.tgz" + resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.6.6.tgz#1751a7c01834e8e1697758732e9efb6eeadfaf89" integrity sha512-Z8/6vRlTUChSdIgMa51jxQ4lrw/Jy5SOW10ObaA47/RElsAN2c5Pn8bTgFGWn/ibwzXTE8qwr1Yzx28vsecXEA== dependencies: whatwg-url "^5.0.0" -node-forge@^1.3.1: - version "1.3.1" - resolved "https://registry.npmjs.org/node-forge/-/node-forge-1.3.1.tgz" - integrity sha512-dPEtOeMvF9VMcYV/1Wb8CPoVAXtp6MKMlcbAt4ddqmGqUJ6fQZFXkNZNkNlfevtNkGtaSoXf/vNNNSvgrdXwtA== - node-int64@^0.4.0: version "0.4.0" resolved "https://registry.npmjs.org/node-int64/-/node-int64-0.4.0.tgz" @@ -3333,19 +2596,7 @@ nwsapi@^2.2.0: resolved "https://registry.npmjs.org/nwsapi/-/nwsapi-2.2.2.tgz" integrity sha512-90yv+6538zuvUMnN+zCr8LuV6bPFdq50304114vJYJ8RDyK8D5O9Phpbd6SZWgI7PwzmmfN1upeOJlvybDSgCw== -object-inspect@^1.9.0: - version "1.12.3" - resolved "https://registry.yarnpkg.com/object-inspect/-/object-inspect-1.12.3.tgz#ba62dffd67ee256c8c086dfae69e016cd1f198b9" - integrity sha512-geUvdk7c+eizMNUDkRpW1wJwgfOiOeHbxBR/hLXK1aT6zmVSO0jsQcs7fj6MGw89jC/cjGfLcNOrtMYtGqm81g== - -on-finished@2.4.1: - 
version "2.4.1" - resolved "https://registry.yarnpkg.com/on-finished/-/on-finished-2.4.1.tgz#58c8c44116e54845ad57f14ab10b03533184ac3f" - integrity sha512-oVlzkg3ENAhCk2zdv7IJwd/QUD4z2RxRwpkcGY8psCVcCYZNq4wYnVWALHM+brtuJjePWiYF/ClmuDr8Ch5+kg== - dependencies: - ee-first "1.1.1" - -once@^1.3.0, once@^1.4.0: +once@^1.3.0: version "1.4.0" resolved "https://registry.npmjs.org/once/-/once-1.4.0.tgz" integrity sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w== @@ -3383,18 +2634,6 @@ optionator@^0.9.1: type-check "^0.4.0" word-wrap "^1.2.3" -p-event@^4.1.0: - version "4.2.0" - resolved "https://registry.npmjs.org/p-event/-/p-event-4.2.0.tgz" - integrity sha512-KXatOjCRXXkSePPb1Nbi0p0m+gQAwdlbhi4wQKJPI1HsMQS9g+Sqp2o+QHziPr7eYJyOZet836KoHEVM1mwOrQ== - dependencies: - p-timeout "^3.1.0" - -p-finally@^1.0.0: - version "1.0.0" - resolved "https://registry.npmjs.org/p-finally/-/p-finally-1.0.0.tgz" - integrity sha512-LICb2p9CB7FS+0eR1oqWnHhp0FljGLZCWBE9aix0Uye9W8LTQPwMTYVGWQWIw9RdQiDg4+epXQODwIYJtSJaow== - p-limit@^2.2.0: version "2.3.0" resolved "https://registry.npmjs.org/p-limit/-/p-limit-2.3.0.tgz" @@ -3423,23 +2662,11 @@ p-locate@^5.0.0: dependencies: p-limit "^3.0.2" -p-timeout@^3.1.0: - version "3.2.0" - resolved "https://registry.npmjs.org/p-timeout/-/p-timeout-3.2.0.tgz" - integrity sha512-rhIwUycgwwKcP9yTOOFK/AKsAopjjCakVqLHePO3CC6Mir1Z99xT+R63jZxAT5lFZLa2inS5h+ZS2GvR99/FBg== - dependencies: - p-finally "^1.0.0" - p-try@^2.0.0: version "2.2.0" resolved "https://registry.npmjs.org/p-try/-/p-try-2.2.0.tgz" integrity sha512-R4nPAVTAU0B9D35/Gk3uJf/7XYbQcyohSKdvAxIRSNghFl4e71hVoGnBNQz9cWaXxO2I10KTC+3jMdvvoKw6dQ== -pako@2.0.3: - version "2.0.3" - resolved "https://registry.npmjs.org/pako/-/pako-2.0.3.tgz" - integrity sha512-WjR1hOeg+kki3ZIOjaf4b5WVcay1jaliKSYiEaB1XzwhMQZJxRdQRv0V31EKBYlxb4T7SK3hjfc/jxyU64BoSw== - parent-module@^1.0.0: version "1.0.1" resolved "https://registry.npmjs.org/parent-module/-/parent-module-1.0.1.tgz" @@ 
-3462,11 +2689,6 @@ parse5@6.0.1: resolved "https://registry.npmjs.org/parse5/-/parse5-6.0.1.tgz" integrity sha512-Ofn/CTFzRGTTxwpNEs9PP93gXShHcTq255nzRYSKe8AkVpZY7e1fpmTfOyoIvjP5HG7Z2ZM7VS9PPhQGW2pOpw== -parseurl@~1.3.3: - version "1.3.3" - resolved "https://registry.yarnpkg.com/parseurl/-/parseurl-1.3.3.tgz#9da19e7bee8d12dff0513ed5b76957793bc2e8d4" - integrity sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ== - path-exists@^4.0.0: version "4.0.0" resolved "https://registry.npmjs.org/path-exists/-/path-exists-4.0.0.tgz" @@ -3487,11 +2709,6 @@ path-parse@^1.0.7: resolved "https://registry.npmjs.org/path-parse/-/path-parse-1.0.7.tgz" integrity sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw== -path-to-regexp@0.1.7: - version "0.1.7" - resolved "https://registry.yarnpkg.com/path-to-regexp/-/path-to-regexp-0.1.7.tgz#df604178005f522f15eb4490e7247a1bfaa67f8c" - integrity sha512-5DFkuoqlv1uYQKxy8omFBeJPQcdoE07Kv2sferDCrAq1ohOU+MSDswDIbnx3YAM60qIOnYa53wBhXW0EbMonrQ== - path-type@^4.0.0: version "4.0.0" resolved "https://registry.npmjs.org/path-type/-/path-type-4.0.0.tgz" @@ -3550,11 +2767,6 @@ pretty-format@^27.0.0, pretty-format@^27.5.1: ansi-styles "^5.0.0" react-is "^17.0.1" -process@^0.11.10: - version "0.11.10" - resolved "https://registry.npmjs.org/process/-/process-0.11.10.tgz" - integrity sha512-cdGef/drWFoydD1JsMzuFf8100nZl+GT+yacc2bEced5f9Rjk4z+WtFUTBu9PhOi9j/jfmBPu0mMEY4wIdAF8A== - prompts@^2.0.1: version "2.4.2" resolved "https://registry.npmjs.org/prompts/-/prompts-2.4.2.tgz" @@ -3563,14 +2775,6 @@ prompts@^2.0.1: kleur "^3.0.3" sisteransi "^1.0.5" -proxy-addr@~2.0.7: - version "2.0.7" - resolved "https://registry.yarnpkg.com/proxy-addr/-/proxy-addr-2.0.7.tgz#f19fe69ceab311eeb94b42e70e8c2070f9ba1025" - integrity sha512-llQsMLSUDUPT44jdrU/O37qlnifitDP+ZwrmmZcoSKyLKvtZxpyV0n2/bD/N4tBAAZ/gJEdZU7KMraoK1+XYAg== - dependencies: - forwarded "0.2.0" - ipaddr.js "1.9.1" - 
psl@^1.1.33: version "1.9.0" resolved "https://registry.npmjs.org/psl/-/psl-1.9.0.tgz" @@ -3581,13 +2785,6 @@ punycode@^2.1.0, punycode@^2.1.1: resolved "https://registry.npmjs.org/punycode/-/punycode-2.1.1.tgz" integrity sha512-XRsRjdf+j5ml+y/6GKHPZbrF/8p2Yga0JPtdqTIY2Xe5ohJPD9saDJJLPvp9+NSBprVvevdXZybnj2cv8OEd0A== -qs@6.11.0: - version "6.11.0" - resolved "https://registry.yarnpkg.com/qs/-/qs-6.11.0.tgz#fd0d963446f7a65e1367e01abd85429453f0c37a" - integrity sha512-MvjoMCJwEarSbUYk5O+nmoSzSutSsTwF85zcHPQ9OrlFoZOYIjaqBAJIqIXjptyD5vThxGq52Xu/MaJzRkIk4Q== - dependencies: - side-channel "^1.0.4" - querystringify@^2.1.1: version "2.2.0" resolved "https://registry.npmjs.org/querystringify/-/querystringify-2.2.0.tgz" @@ -3598,70 +2795,14 @@ queue-microtask@^1.2.2: resolved "https://registry.npmjs.org/queue-microtask/-/queue-microtask-1.2.3.tgz" integrity sha512-NuaNSa6flKT5JaSYQzJok04JzTL1CA6aGhv5rfLW3PgqA+M2ChpZQnAC8h8i4ZFkBS8X5RqkDBHA7r4hej3K9A== -range-parser@~1.2.1: - version "1.2.1" - resolved "https://registry.yarnpkg.com/range-parser/-/range-parser-1.2.1.tgz#3cf37023d199e1c24d1a55b84800c2f3e6468031" - integrity sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg== - -raw-body@2.5.1: - version "2.5.1" - resolved "https://registry.yarnpkg.com/raw-body/-/raw-body-2.5.1.tgz#fe1b1628b181b700215e5fd42389f98b71392857" - integrity sha512-qqJBtEyVgS0ZmPGdCFPWJ3FreoqvG4MVQln/kCgF7Olq95IbOp0/BWyMwbdtn4VTvkM8Y7khCQ2Xgk/tcrCXig== - dependencies: - bytes "3.1.2" - http-errors "2.0.0" - iconv-lite "0.4.24" - unpipe "1.0.0" - react-is@^17.0.1: version "17.0.2" resolved "https://registry.npmjs.org/react-is/-/react-is-17.0.2.tgz" integrity sha512-w2GsyukL62IJnlaff/nRegPQR94C/XXamvMWmSHRJ4y7Ts/4ocGRmTHvOs8PSE6pB3dWOrD/nueuU5sduBsQ4w== -"readable-stream@1.x >=1.1.9": - version "1.1.14" - resolved "https://registry.npmjs.org/readable-stream/-/readable-stream-1.1.14.tgz" - integrity 
sha512-+MeVjFf4L44XUkhM1eYbD8fyEsxcV81pqMSR5gblfcLCHfZvbrqy4/qYHE+/R5HoBUT11WV5O08Cr1n3YXkWVQ== - dependencies: - core-util-is "~1.0.0" - inherits "~2.0.1" - isarray "0.0.1" - string_decoder "~0.10.x" - -readable-stream@^3.1.1: - version "3.6.0" - resolved "https://registry.npmjs.org/readable-stream/-/readable-stream-3.6.0.tgz" - integrity sha512-BViHy7LKeTz4oNnkcLJ+lVSL6vpiFeX6/d3oSH8zCW7UxP2onchk+vTGB143xuFjHS3deTgkKoXXymXqymiIdA== - dependencies: - inherits "^2.0.3" - string_decoder "^1.1.1" - util-deprecate "^1.0.1" - -readable-stream@^4.0.0: - version "4.2.0" - resolved "https://registry.npmjs.org/readable-stream/-/readable-stream-4.2.0.tgz" - integrity sha512-gJrBHsaI3lgBoGMW/jHZsQ/o/TIWiu5ENCJG1BB7fuCKzpFM8GaS2UoBVt9NO+oI+3FcrBNbUkl3ilDe09aY4A== - dependencies: - abort-controller "^3.0.0" - buffer "^6.0.3" - events "^3.3.0" - process "^0.11.10" - -redis@^4.6.5: - version "4.6.5" - resolved "https://registry.yarnpkg.com/redis/-/redis-4.6.5.tgz#f32fbde44429e96f562bb0c9b1db0143ab8cfa4f" - integrity sha512-O0OWA36gDQbswOdUuAhRL6mTZpHFN525HlgZgDaVNgCJIAZR3ya06NTESb0R+TUZ+BFaDpz6NnnVvoMx9meUFg== - dependencies: - "@redis/bloom" "1.2.0" - "@redis/client" "1.5.6" - "@redis/graph" "1.1.0" - "@redis/json" "1.0.4" - "@redis/search" "1.1.2" - "@redis/time-series" "1.0.4" - reflect-metadata@^0.1.13: version "0.1.13" - resolved "https://registry.npmjs.org/reflect-metadata/-/reflect-metadata-0.1.13.tgz" + resolved "https://registry.yarnpkg.com/reflect-metadata/-/reflect-metadata-0.1.13.tgz#67ae3ca57c972a2aa1642b10fe363fe32d49dc08" integrity sha512-Ts1Y/anZELhSsjMcU605fU9RE4Oi3p5ORujwbIKXfWa+0Zxs510Qrmrce5/Jowq3cHSZSJqBjypxmHarc+vEWg== regexpp@^3.2.0: @@ -3710,14 +2851,6 @@ resolve@^1.20.0: path-parse "^1.0.7" supports-preserve-symlinks-flag "^1.0.0" -retry-request@^5.0.0: - version "5.0.2" - resolved "https://registry.npmjs.org/retry-request/-/retry-request-5.0.2.tgz" - integrity 
sha512-wfI3pk7EE80lCIXprqh7ym48IHYdwmAAzESdbU8Q9l7pnRCk9LEhpbOTNKjz6FARLm/Bl5m+4F0ABxOkYUujSQ== - dependencies: - debug "^4.1.1" - extend "^3.0.2" - reusify@^1.0.4: version "1.0.4" resolved "https://registry.npmjs.org/reusify/-/reusify-1.0.4.tgz" @@ -3737,28 +2870,11 @@ run-parallel@^1.1.9: dependencies: queue-microtask "^1.2.2" -safe-buffer@5.2.1, safe-buffer@^5.0.1, safe-buffer@~5.2.0: - version "5.2.1" - resolved "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.1.tgz" - integrity sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ== - -safe-buffer@~5.1.2: - version "5.1.2" - resolved "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz" - integrity sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g== - "safer-buffer@>= 2.1.2 < 3": version "2.1.2" resolved "https://registry.npmjs.org/safer-buffer/-/safer-buffer-2.1.2.tgz" integrity sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg== -saslprep@^1.0.3: - version "1.0.3" - resolved "https://registry.npmjs.org/saslprep/-/saslprep-1.0.3.tgz" - integrity sha512-/MY/PEMbk2SuY5sScONwhUDsV2p77Znkb/q3nSVstq/yQzYJOH/Azh29p9oJLsl3LnQwSvZDKagDGBsBwSooag== - dependencies: - sparse-bitfield "^3.0.3" - saxes@^5.0.1: version "5.0.1" resolved "https://registry.npmjs.org/saxes/-/saxes-5.0.1.tgz" @@ -3778,40 +2894,6 @@ semver@^6.0.0, semver@^6.3.0: resolved "https://registry.npmjs.org/semver/-/semver-6.3.0.tgz" integrity sha512-b39TBaTSfV6yBrapU89p5fKekE2m/NwnDocOVruQFS1/veMgdzuPcnOM34M6CwxW8jH/lxEa5rBoDeUwu5HHTw== -send@0.18.0: - version "0.18.0" - resolved "https://registry.yarnpkg.com/send/-/send-0.18.0.tgz#670167cc654b05f5aa4a767f9113bb371bc706be" - integrity sha512-qqWzuOjSFOuqPjFe4NOsMLafToQQwBSOEpS+FwEt3A2V3vKubTquT3vmLTQpFgMXp8AlFWFuP1qKaJZOtPpVXg== - dependencies: - debug "2.6.9" - depd "2.0.0" - destroy "1.2.0" - encodeurl "~1.0.2" - escape-html "~1.0.3" - etag "~1.8.1" - 
fresh "0.5.2" - http-errors "2.0.0" - mime "1.6.0" - ms "2.1.3" - on-finished "2.4.1" - range-parser "~1.2.1" - statuses "2.0.1" - -serve-static@1.15.0: - version "1.15.0" - resolved "https://registry.yarnpkg.com/serve-static/-/serve-static-1.15.0.tgz#faaef08cffe0a1a62f60cad0c4e513cff0ac9540" - integrity sha512-XGuRDNjXUijsUL0vl6nSD7cwURuzEgglbOaFuZM9g3kwDXOWVTck0jLzjPzGD+TazWbboZYu52/9/XPdUgne9g== - dependencies: - encodeurl "~1.0.2" - escape-html "~1.0.3" - parseurl "~1.3.3" - send "0.18.0" - -setprototypeof@1.2.0: - version "1.2.0" - resolved "https://registry.yarnpkg.com/setprototypeof/-/setprototypeof-1.2.0.tgz#66c9a24a73f9fc28cbe66b09fed3d33dcaf1b424" - integrity sha512-E5LDX7Wrp85Kil5bhZv46j8jOeboKq5JMmYM3gVGdGH8xFpPWXUMsNrlODCrkoxMEeNi/XZIwuRvY4XNwYMJpw== - shebang-command@^2.0.0: version "2.0.0" resolved "https://registry.npmjs.org/shebang-command/-/shebang-command-2.0.0.tgz" @@ -3824,15 +2906,6 @@ shebang-regex@^3.0.0: resolved "https://registry.npmjs.org/shebang-regex/-/shebang-regex-3.0.0.tgz" integrity sha512-7++dFhtcx3353uBaq8DDR4NuxBetBzC7ZQOhmTQInHEd6bSrXdiEyzCvG07Z44UYdLShWUyXt5M/yhz8ekcb1A== -side-channel@^1.0.4: - version "1.0.4" - resolved "https://registry.yarnpkg.com/side-channel/-/side-channel-1.0.4.tgz#efce5c8fdc104ee751b25c58d4290011fa5ea2cf" - integrity sha512-q5XPytqFEIKHkGdiMIrY10mvLRvnQh42/+GoBlFW3b2LXLE2xxJpZFdm94we0BaoV3RwJyGqg5wS7epxTv0Zvw== - dependencies: - call-bind "^1.0.0" - get-intrinsic "^1.0.2" - object-inspect "^1.9.0" - signal-exit@^3.0.2, signal-exit@^3.0.3: version "3.0.7" resolved "https://registry.npmjs.org/signal-exit/-/signal-exit-3.0.7.tgz" @@ -3848,19 +2921,6 @@ slash@^3.0.0: resolved "https://registry.npmjs.org/slash/-/slash-3.0.0.tgz" integrity sha512-g9Q1haeby36OSStwb4ntCGGGaKsaVSjQ68fBxoQcutl5fS1vuY18H3wSt3jFyFtrkx+Kz0V1G85A4MyAdDMi2Q== -smart-buffer@^4.2.0: - version "4.2.0" - resolved "https://registry.npmjs.org/smart-buffer/-/smart-buffer-4.2.0.tgz" - integrity 
sha512-94hK0Hh8rPqQl2xXc3HsaBoOXKV20MToPkcXvwbISWLEs+64sBq5kFgn2kJDHb1Pry9yrP0dxrCI9RRci7RXKg== - -socks@^2.7.0: - version "2.7.1" - resolved "https://registry.npmjs.org/socks/-/socks-2.7.1.tgz" - integrity sha512-7maUZy1N7uo6+WVEX6psASxtNlKaNVMlGQKkG/63nEDdLOWNbiUMoLK7X4uYoLhQstau72mLgfEWcXcwsaHbYQ== - dependencies: - ip "^2.0.0" - smart-buffer "^4.2.0" - source-map-support@^0.5.6: version "0.5.21" resolved "https://registry.npmjs.org/source-map-support/-/source-map-support-0.5.21.tgz" @@ -3879,13 +2939,6 @@ source-map@^0.7.3: resolved "https://registry.npmjs.org/source-map/-/source-map-0.7.4.tgz" integrity sha512-l3BikUxvPOcn5E74dZiq5BGsTb5yEwhaTSzccU6t4sDOH8NWJCstKO5QT2CvtFoK6F0saL7p9xHAqHOlCPJygA== -sparse-bitfield@^3.0.3: - version "3.0.3" - resolved "https://registry.npmjs.org/sparse-bitfield/-/sparse-bitfield-3.0.3.tgz" - integrity sha512-kvzhi7vqKTfkh0PZU+2D2PIllw2ymqJKujUcyPMd9Y75Nv4nPbGJZXNhxsgdQab2BmlDct1YnfQCguEvHr7VsQ== - dependencies: - memory-pager "^1.0.2" - sprintf-js@~1.0.2: version "1.0.3" resolved "https://registry.npmjs.org/sprintf-js/-/sprintf-js-1.0.3.tgz" @@ -3898,23 +2951,6 @@ stack-utils@^2.0.3: dependencies: escape-string-regexp "^2.0.0" -statuses@2.0.1: - version "2.0.1" - resolved "https://registry.yarnpkg.com/statuses/-/statuses-2.0.1.tgz#55cb000ccf1d48728bd23c685a063998cf1a1b63" - integrity sha512-RwNA9Z/7PrK06rYLIzFMlaF+l73iwpzsqRIFgbMLbTcLD6cOao82TaWefPXQvB2fOC4AjuYSEndS7N/mTCbkdQ== - -stream-events@^1.0.5: - version "1.0.5" - resolved "https://registry.npmjs.org/stream-events/-/stream-events-1.0.5.tgz" - integrity sha512-E1GUzBSgvct8Jsb3v2X15pjzN1tYebtbLaMg+eBOUOAxgbLoSbT2NS91ckc5lJD1KfLjId+jXJRgo0qnV5Nerg== - dependencies: - stubs "^3.0.0" - -stream-shift@^1.0.0: - version "1.0.1" - resolved "https://registry.npmjs.org/stream-shift/-/stream-shift-1.0.1.tgz" - integrity sha512-AiisoFqQ0vbGcZgQPY1cdP2I76glaVA/RauYR4G4thNFgkTqr90yXTo4LYX60Jl+sIlPNHHdGSwo01AvbKUSVQ== - string-length@^4.0.1: version "4.0.2" resolved 
"https://registry.npmjs.org/string-length/-/string-length-4.0.2.tgz" @@ -3932,18 +2968,6 @@ string-width@^4.1.0, string-width@^4.2.0: is-fullwidth-code-point "^3.0.0" strip-ansi "^6.0.1" -string_decoder@^1.1.1: - version "1.3.0" - resolved "https://registry.npmjs.org/string_decoder/-/string_decoder-1.3.0.tgz" - integrity sha512-hkRX8U1WjJFd8LsDJ2yQ/wWWxaopEsABU1XfkM8A+j0+85JAGppt16cr1Whg6KIbb4okU6Mql6BOj+uup/wKeA== - dependencies: - safe-buffer "~5.2.0" - -string_decoder@~0.10.x: - version "0.10.31" - resolved "https://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz" - integrity sha512-ev2QzSzWPYmy9GuqfIVildA4OdcGLeFZQrq5ys6RtiuF+RQQiZWr8TZNyAcuVXyQRYfEO+MsoB/1BuQVhOJuoQ== - strip-ansi@^6.0.0, strip-ansi@^6.0.1: version "6.0.1" resolved "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz" @@ -3966,11 +2990,6 @@ strip-json-comments@^3.1.0, strip-json-comments@^3.1.1: resolved "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-3.1.1.tgz" integrity sha512-6fPc+R4ihwqP6N/aIv2f1gMH8lOVtWQHoqC4yK6oSDVVocumAsfCqjkXnqiYMhmMwS/mEHLp7Vehlt3ql6lEig== -stubs@^3.0.0: - version "3.0.0" - resolved "https://registry.npmjs.org/stubs/-/stubs-3.0.0.tgz" - integrity sha512-PdHt7hHUJKxvTCgbKX9C1V/ftOcjJQgz8BZwNfV5c4B6dcGqlpelTbJ999jBGZ2jYiPAwcX5dP6oBwVlBlUbxw== - supports-color@^5.3.0: version "5.5.0" resolved "https://registry.npmjs.org/supports-color/-/supports-color-5.5.0.tgz" @@ -4010,17 +3029,6 @@ symbol-tree@^3.2.4: resolved "https://registry.npmjs.org/symbol-tree/-/symbol-tree-3.2.4.tgz" integrity sha512-9QNk5KwDF+Bvz+PyObkmSYjI5ksVUYtjW7AU22r2NKcfLJcXp96hkDWU3+XndOsUb+AQ9QhfzfCT2O+CNWT5Tw== -teeny-request@^8.0.0: - version "8.0.2" - resolved "https://registry.npmjs.org/teeny-request/-/teeny-request-8.0.2.tgz" - integrity sha512-34pe0a4zASseXZCKdeTiIZqSKA8ETHb1EwItZr01PAR3CLPojeAKgSjzeNS4373gi59hNulyDrPKEbh2zO9sCg== - dependencies: - http-proxy-agent "^5.0.0" - https-proxy-agent "^5.0.0" - node-fetch "^2.6.1" - stream-events "^1.0.5" - 
uuid "^9.0.0" - terminal-link@^2.0.0: version "2.1.1" resolved "https://registry.npmjs.org/terminal-link/-/terminal-link-2.1.1.tgz" @@ -4038,11 +3046,6 @@ test-exclude@^6.0.0: glob "^7.1.4" minimatch "^3.0.4" -text-encoding@^0.7.0: - version "0.7.0" - resolved "https://registry.npmjs.org/text-encoding/-/text-encoding-0.7.0.tgz" - integrity sha512-oJQ3f1hrOnbRLOcwKz0Liq2IcrvDeZRHXhd9RgLrsT+DjWY/nty1Hi7v3dtkaEYbPYe0mUoOfzRrMwfXXwgPUA== - text-table@^0.2.0: version "0.2.0" resolved "https://registry.npmjs.org/text-table/-/text-table-0.2.0.tgz" @@ -4070,11 +3073,6 @@ to-regex-range@^5.0.1: dependencies: is-number "^7.0.0" -toidentifier@1.0.1: - version "1.0.1" - resolved "https://registry.yarnpkg.com/toidentifier/-/toidentifier-1.0.1.tgz#3be34321a88a820ed1bd80dfaa33e479fbb8dd35" - integrity sha512-o5sSPKEkg/DIQNmH43V0/uerLrpzVedkUh8tGNvaeXpfpuwjKenlSox/2O/BTlZUtEe+JG7s5YhEz608PlAHRA== - tough-cookie@^4.0.0: version "4.1.2" resolved "https://registry.npmjs.org/tough-cookie/-/tough-cookie-4.1.2.tgz" @@ -4092,16 +3090,9 @@ tr46@^2.1.0: dependencies: punycode "^2.1.1" -tr46@^3.0.0: - version "3.0.0" - resolved "https://registry.npmjs.org/tr46/-/tr46-3.0.0.tgz" - integrity sha512-l7FvfAHlcmulp8kr+flpQZmVwtu7nfRV7NZujtN0OqES8EL4O4e0qqzL0DC5gAvx/ZC/9lk6rhcUwYvkBnBnYA== - dependencies: - punycode "^2.1.1" - tr46@~0.0.3: version "0.0.3" - resolved "https://registry.npmjs.org/tr46/-/tr46-0.0.3.tgz" + resolved "https://registry.yarnpkg.com/tr46/-/tr46-0.0.3.tgz#8184fd347dac9cdc185992f3a6622e14b9d9ab6a" integrity sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw== ts-jest@^27.1.3: @@ -4178,14 +3169,6 @@ type-fest@^0.21.3: resolved "https://registry.npmjs.org/type-fest/-/type-fest-0.21.3.tgz" integrity sha512-t0rzBq87m3fVcduHDUFhKmyyX+9eo6WQjZvf51Ea/M0Q7+T374Jp1aUiyUl0GKxp8M/OETVHSDvmkyPgvX+X2w== -type-is@~1.6.18: - version "1.6.18" - resolved 
"https://registry.yarnpkg.com/type-is/-/type-is-1.6.18.tgz#4e552cd05df09467dcbc4ef739de89f2cf37c131" - integrity sha512-TkRKr9sUTxEH8MdfuCSP7VizJyzRNMjj2J2do2Jr3Kym598JVdEksuzPQCnlFPW4ky9Q+iA+ma9BGm06XQBy8g== - dependencies: - media-typer "0.3.0" - mime-types "~2.1.24" - typedarray-to-buffer@^3.1.5: version "3.1.5" resolved "https://registry.npmjs.org/typedarray-to-buffer/-/typedarray-to-buffer-3.1.5.tgz" @@ -4203,11 +3186,6 @@ universalify@^0.2.0: resolved "https://registry.npmjs.org/universalify/-/universalify-0.2.0.tgz" integrity sha512-CJ1QgKmNg3CwvAv/kOFmtnEN05f0D/cn9QntgNOQlQF9dgvVTHj3t+8JPdjqawCHk7V/KA+fbUqzZ9XWhcqPUg== -unpipe@1.0.0, unpipe@~1.0.0: - version "1.0.0" - resolved "https://registry.yarnpkg.com/unpipe/-/unpipe-1.0.0.tgz#b2bf4ee8514aae6165b4817829d21b2ef49904ec" - integrity sha512-pjy2bYhSsufwWlKwPc+l3cN7+wuJlK6uz0YdJEOlQDbl6jo/YlPi4mb8agUkVC8BF7V8NuzeyPNqRksA3hztKQ== - update-browserslist-db@^1.0.9: version "1.0.10" resolved "https://registry.npmjs.org/update-browserslist-db/-/update-browserslist-db-1.0.10.tgz" @@ -4223,7 +3201,7 @@ uri-js@^4.2.2: dependencies: punycode "^2.1.0" -url-parse@^1.5.3, url-parse@~1.5.10: +url-parse@^1.5.3: version "1.5.10" resolved "https://registry.npmjs.org/url-parse/-/url-parse-1.5.10.tgz" integrity sha512-WypcfiRhfeUP9vvF0j6rw0J3hrWrw6iZv3+22h6iRMJ/8z1Tj6XfLP4DsUix5MhMPnXpiHDoKyoZ/bdCkwBCiQ== @@ -4231,26 +3209,6 @@ url-parse@^1.5.3, url-parse@~1.5.10: querystringify "^2.1.1" requires-port "^1.0.0" -util-deprecate@^1.0.1: - version "1.0.2" - resolved "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz" - integrity sha512-EPD5q1uXyFxJpCrLnCc1nHnq3gOa6DZBocAIiI2TaSCA7VCJ1UJDMagCzIkXNsUYfD1daK//LTEQ8xiIbrHtcw== - -utils-merge@1.0.1: - version "1.0.1" - resolved "https://registry.yarnpkg.com/utils-merge/-/utils-merge-1.0.1.tgz#9f95710f50a267947b2ccc124741c1028427e713" - integrity sha512-pMZTvIkT1d+TFGvDOqodOclx0QWkkgi6Tdoa8gC8ffGAAqz9pzPTZWAybbsHHoED/ztMtkv/VoYTYyShUn81hA== - -uuid@^8.0.0: - version 
"8.3.2" - resolved "https://registry.npmjs.org/uuid/-/uuid-8.3.2.tgz" - integrity sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg== - -uuid@^9.0.0: - version "9.0.0" - resolved "https://registry.npmjs.org/uuid/-/uuid-9.0.0.tgz" - integrity sha512-MXcSTerfPa4uqyzStbRoTgt5XIe3x5+42+q1sDuy3R5MDk66URdLMOZe5aPX/SQd+kuYAh0FdP/pO28IkQyTeg== - v8-compile-cache-lib@^3.0.1: version "3.0.1" resolved "https://registry.npmjs.org/v8-compile-cache-lib/-/v8-compile-cache-lib-3.0.1.tgz" @@ -4265,11 +3223,6 @@ v8-to-istanbul@^8.1.0: convert-source-map "^1.6.0" source-map "^0.7.3" -vary@~1.1.2: - version "1.1.2" - resolved "https://registry.yarnpkg.com/vary/-/vary-1.1.2.tgz#2299f02c6ded30d4a5961b0b9f74524a18f634fc" - integrity sha512-BNGbWLfd0eUPabhkXUVm0j8uuvREyTh5ovRa/dyow/BqAbZJyC+5fU+IzQOzmAKzYqYRAISoRhdQr3eIZ/PXqg== - w3c-hr-time@^1.0.2: version "1.0.2" resolved "https://registry.npmjs.org/w3c-hr-time/-/w3c-hr-time-1.0.2.tgz" @@ -4293,7 +3246,7 @@ walker@^1.0.7: webidl-conversions@^3.0.0: version "3.0.1" - resolved "https://registry.npmjs.org/webidl-conversions/-/webidl-conversions-3.0.1.tgz" + resolved "https://registry.yarnpkg.com/webidl-conversions/-/webidl-conversions-3.0.1.tgz#24534275e2a7bc6be7bc86611cc16ae0a5654871" integrity sha512-2JAn3z8AR6rjK8Sm8orRC0h/bcl/DqL7tRPdGZ4I1CjdF+EaMLmYxBHyXuKL849eucPFhvBoxMsflfOb8kxaeQ== webidl-conversions@^5.0.0: @@ -4306,11 +3259,6 @@ webidl-conversions@^6.1.0: resolved "https://registry.npmjs.org/webidl-conversions/-/webidl-conversions-6.1.0.tgz" integrity sha512-qBIvFLGiBpLjfwmYAaHPXsn+ho5xZnGvyGvsarywGNc8VyQJUMHJ8OBKGGrPER0okBeMDaan4mNBlgBROxuI8w== -webidl-conversions@^7.0.0: - version "7.0.0" - resolved "https://registry.npmjs.org/webidl-conversions/-/webidl-conversions-7.0.0.tgz" - integrity sha512-VwddBukDzu71offAQR975unBIGqfKZpM+8ZX6ySk8nYhVoo5CYaZyzt3YBvYtRtO+aoGlqxPg/B87NGVZ/fu6g== - whatwg-encoding@^1.0.5: version "1.0.5" resolved 
"https://registry.npmjs.org/whatwg-encoding/-/whatwg-encoding-1.0.5.tgz" @@ -4323,17 +3271,9 @@ whatwg-mimetype@^2.3.0: resolved "https://registry.npmjs.org/whatwg-mimetype/-/whatwg-mimetype-2.3.0.tgz" integrity sha512-M4yMwr6mAnQz76TbJm914+gPpB/nCwvZbJU28cUD6dR004SAxDLOOSUaB1JDRqLtaOV/vi0IC5lEAGFgrjGv/g== -whatwg-url@^11.0.0: - version "11.0.0" - resolved "https://registry.npmjs.org/whatwg-url/-/whatwg-url-11.0.0.tgz" - integrity sha512-RKT8HExMpoYx4igMiVMY83lN6UeITKJlBQ+vR/8ZJ8OCdSiN3RwCq+9gH0+Xzj0+5IrM6i4j/6LuvzbZIQgEcQ== - dependencies: - tr46 "^3.0.0" - webidl-conversions "^7.0.0" - whatwg-url@^5.0.0: version "5.0.0" - resolved "https://registry.npmjs.org/whatwg-url/-/whatwg-url-5.0.0.tgz" + resolved "https://registry.yarnpkg.com/whatwg-url/-/whatwg-url-5.0.0.tgz#966454e8765462e37644d3626f6742ce8b70965d" integrity sha512-saE57nupxk6v3HY35+jzBwYa0rKSy0XR8JSxZPwgLr7ys0IBzhGviA1/TUGJLmSVqs8pb9AnvICXEuOHLprYTw== dependencies: tr46 "~0.0.3" @@ -4389,11 +3329,6 @@ ws@^7.4.6: resolved "https://registry.npmjs.org/ws/-/ws-7.5.9.tgz" integrity sha512-F+P9Jil7UiSKSkppIiD94dN07AwvFixvLIj1Og1Rl9GGMuNipJnV9JzjD6XuqmAeiswGvUmNLjr5cFuXwNS77Q== -ws@^8.12.0: - version "8.12.0" - resolved "https://registry.npmjs.org/ws/-/ws-8.12.0.tgz" - integrity sha512-kU62emKIdKVeEIOIKVegvqpXMSTAMLJozpHZaJNDYqBjzlSYXQGviYwN1osDLJ9av68qHd4a2oSjd7yD4pacig== - xml-name-validator@^3.0.0: version "3.0.0" resolved "https://registry.npmjs.org/xml-name-validator/-/xml-name-validator-3.0.0.tgz" @@ -4409,7 +3344,7 @@ y18n@^5.0.5: resolved "https://registry.npmjs.org/y18n/-/y18n-5.0.8.tgz" integrity sha512-0pfFzegeDWJHJIAmTLRP2DwHjdF5s7jo9tuztdQxAhINCdvS+3nGINqPd00AphqJR/0LhANUS6/+7SCb98YOfA== -yallist@4.0.0, yallist@^4.0.0: +yallist@^4.0.0: version "4.0.0" resolved "https://registry.npmjs.org/yallist/-/yallist-4.0.0.tgz" integrity sha512-3wdGidZyq5PB084XLES5TpOSRA3wjXAlIWMhum2kRcv/41Sn2emQ0dycQW4uZXLejwKvg6EsvbdlVL+FYEct7A==