├── .github
│   └── dependabot.yaml
├── .gitignore
├── .npmignore
├── CHANGELOG.md
├── LICENSE.md
├── README.md
├── main.js
├── package.json
└── src
    ├── Collection.js
    ├── managers
    │   └── DataManager.js
    ├── structures
    │   ├── DocumentBasedCollection.js
    │   └── KeyValueBasedCollection.js
    └── util
        ├── Debugger.js
        └── Tools.js

/.github/dependabot.yaml:
--------------------------------------------------------------------------------
version: 2

updates:
  - package-ecosystem: "npm"
    directory: "/"

    schedule:
      interval: "daily"
    versioning-strategy: "increase"

    ignore:
      - dependency-name: "--"

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
node_modules
peakdb
package-lock.json
peakdb.code-workspace

--------------------------------------------------------------------------------
/.npmignore:
--------------------------------------------------------------------------------
.github
node_modules
peakdb
package-lock.json
peakdb.code-workspace

--------------------------------------------------------------------------------
/CHANGELOG.md:
--------------------------------------------------------------------------------
# Change Log

## v2.2.1 → v2.3.0

* Updates for System:
  * **`.Backup()` changed.** This function is now called `createBackup()`.
  * **`.LoadBackup()` added.** With this function you can easily restore backups.
  * **`.CreateBackup()` return value changed.** On success, this function now returns the filename of the backed-up collection.
  * **`.Auto_Backup` changed.** This option is now called `auto_create_backup`.
* Updates for Key-Value Based Collections:
  * **`.Reduce()` changed.** This function is now called `decrease()`.
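The `createBackup()` function above returns the backup's filename on success. A minimal sketch of the zero-padded `name_YYYY-MM-DD_HHMM.pea` naming scheme used in `src/Collection.js` (the helper name follows the source's `zeroBeforeNumber`, reimplemented here with `padStart`; this is an illustration, not the library's exact code):

```javascript
// Zero-pads a date component to two digits, mirroring zeroBeforeNumber.
const zeroBeforeNumber = (number) => String(number).padStart(2, "0");

// Builds a backup filename like "users_2024-03-07_0905.pea".
const backupFilename = (collectionName, date = new Date()) =>
  collectionName +
  "_" + date.getFullYear() +
  "-" + zeroBeforeNumber(date.getMonth() + 1) +
  "-" + zeroBeforeNumber(date.getDate()) +
  "_" + zeroBeforeNumber(date.getHours()) +
  zeroBeforeNumber(date.getMinutes()) +
  ".pea";
```

`loadBackup()` accepts such a filename with or without the `.pea` extension; the source appends it when missing.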

## v2.1.0 → v2.2.0

* Updates for System:
  * **Bugs fixed.** Fixed several bugs in the system.
  * **`.Indicate_Archived_At` added.** If this is enabled, the date is recorded automatically when documents are archived.
  * **`.Indicate_Archived_Timestamp` added.** If this is enabled, the timestamp is recorded automatically when documents are archived.
  * **`.Indicate_Unarchived_At` added.** If this is enabled, the date is recorded automatically when documents are unarchived.
  * **`.Indicate_Unarchived_Timestamp` added.** If this is enabled, the timestamp is recorded automatically when documents are unarchived.
* Updates for Document Based Collections:
  * **`.Archive()` added.** By archiving a document, you can have it ignored by the system.
  * **`.Unarchive()` added.** You can restore an archived document from the archive.
  * **`.Find(..., options)` added.** You can customize the find operation with options.
  * **`.Filter(..., options)` added.** You can customize the filter operation with options.
  * **`.Has(..., options)` added.** You can customize the check with options.
  * **`.Archived` added.** With this option you can specify whether archived documents should be found.
  * **`.Archived` added.** With this option you can specify whether archived documents should be filtered.
  * **`.Archived` added.** With this option you can specify whether archived documents should be checked.
* Updates for Key-Value Based Collections:
  * **`.Find()` added.** You can find data in an array.
  * **`.Filter()` added.** You can filter data in an array.

## v2.0.2 → v2.1.0

* Updates for All Collections:
  * **`.Has()` added.** You can check whether data exists.

## v1.3.1 → v2.0.0

* Updates for System:
  * **Added a new collection type.** You can now store your data in a key-value based format. Thanks to the newly added key-value based collection type, you no longer have to keep your data in a document based format.
  * **Added find and filter with JSON.** In your collection, you can also use JSON instead of functions for find and filter operations.
  * **Data reading and writing optimized.** Your data is processed faster and unnecessary RAM usage has been eliminated.
  * **`.Type` added.** This allows you to specify the type of your collection. Valid values: `DOCUMENT_BASED` and `KEY_VALUE_BASED`.
  * **`.Activate_Destroy_Function` added.** If this is enabled, the `.Destroy()` function becomes operable. This command destroys your collection completely. It is a dangerous command.
* Updates for Key-Value Based Collections:
  * **`.Set()` added.** This allows you to set data in your collection.
  * **`.Get()` added.** This allows you to get data from your collection.
  * **`.Push()` added.** This allows you to push data into an array in your collection.
  * **`.Remove()` added.** This allows you to remove data from an array in your collection.
  * **`.Increase()` added.** This allows you to increase a number in your collection.
  * **`.Reduce()` added.** This allows you to reduce a number in your collection.
  * **`.Destroy()` added.** This completely destroys the data in your collection. You need to activate it with the `activate_destroy_function` option.

## v1.2.2 → v1.3.0

* Updates for Document Based Collections:
  * **`.Delete()` changed.** This function is now called `remove()`.

## v1.1.6 → v1.2.0

* Updates for System:
  * **`.No_Save_Timeout_After` changed.** This option is now called `save_directly_after`.
  * **`.Caching_Time` changed.** This option is now called `cache_retention_time`.
  * **`.Delete_Backups_Before` changed.** This option is now called `backup_retention_time`.
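The key-value operations introduced in v2.0.0 behave like a thin wrapper over a plain object. A minimal in-memory sketch of that surface (the real collection persists to SQLite via BSON, which this deliberately omits; `decrease` reflects the v2.3.0 rename of `reduce`, and the class name is illustrative):

```javascript
// In-memory sketch of the key-value collection surface. Not the
// library's implementation: persistence, caching, and debounced
// saving are intentionally left out.
class KeyValueStore {
  constructor() {
    this.data = {};
  }
  set(key, value) { this.data[key] = value; return value; }
  get(key) { return this.data[key]; }
  // Appends to an array value, creating the array on first push.
  push(key, value) { (this.data[key] ??= []).push(value); return this.data[key]; }
  // Removes all matching entries from an array value.
  remove(key, value) {
    this.data[key] = (this.data[key] || []).filter((v) => v !== value);
    return this.data[key];
  }
  increase(key, amount = 1) { return (this.data[key] = (this.data[key] || 0) + amount); }
  decrease(key, amount = 1) { return (this.data[key] = (this.data[key] || 0) - amount); }
  delete(key) { return delete this.data[key]; }
}
```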

## v1.1.2 → v1.1.3

* Updates for System:
  * **`.No_Save_Timeout_After` added.** This specifies after how many inserted documents the collection is saved immediately, without waiting for the save timeout, so that data does not remain unsaved for a long time.

## v1.0.6 → v1.1.0

* Updates for System:
  * **`.Save_Timeout` added.** This specifies how many seconds after a document is inserted the collection is saved. It limits back-to-back saves when many documents are inserted in succession, so the system is not slowed down. Since data loss could occur if the system were shut down right after repeated insertions, the collection is saved immediately after 5 consecutive insertions so that data does not remain unsaved for a long time. This threshold can be adjusted with the `no_save_timeout_after` option.
  * **`.Caching_Time` added.** This determines how long the cache is kept when caching is enabled. The cache is cleared if there is no activity in the collection, which avoids unnecessary RAM usage.
  * **`.Detailed_Debugger_Logs` added.** This prints more collection events to the console.
  * **`.Delete_Backups_Before_This_Day` changed.** This option is now called `delete_backups_before`.

## v1.0.2 → v1.0.3

* Updates for System:
  * **`.Caching` added.** The data is kept in cache. Data is then processed quickly, but RAM usage grows with the size of the collection. Not recommended for large collections.
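The `save_timeout` / `no_save_timeout_after` behavior described above amounts to a debounce with a forced flush after a fixed number of writes. A self-contained sketch under that reading (class and property names are illustrative, not the library's internals):

```javascript
// Debounce-with-forced-flush: each write postpones the save, but the
// Nth consecutive write saves immediately so data never sits unsaved
// for long. Illustrative only; not PeakDB's actual DataManager.
class SaveScheduler {
  constructor({ saveTimeoutSeconds = 1, saveDirectlyAfter = 5, onSave }) {
    this.saveTimeoutSeconds = saveTimeoutSeconds;
    this.saveDirectlyAfter = saveDirectlyAfter;
    this.onSave = onSave;
    this.pending = undefined;
    this.setCount = 0;
    this.timer = undefined;
  }

  // Records the latest data and resets the debounce timer; after
  // `saveDirectlyAfter` consecutive writes, flushes immediately.
  set(data) {
    this.pending = data;
    this.setCount += 1;
    clearTimeout(this.timer);
    if (this.setCount >= this.saveDirectlyAfter) this.flush();
    else this.timer = setTimeout(() => this.flush(), this.saveTimeoutSeconds * 1000);
  }

  flush() {
    this.onSave(this.pending);
    this.pending = undefined;
    this.setCount = 0;
  }
}
```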

--------------------------------------------------------------------------------
/LICENSE.md:
--------------------------------------------------------------------------------
MIT License

Copyright (c) 2022 Keift

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
[Function]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Function
[String]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String
[Number]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number
[Object]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object
[Array]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array
[Boolean]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean

# PeakDB

## Contents

* [About](#about)
* [Features](#features)
* [Installation](#installation)
* [Links](#links)
* [License](#license)

## About

Fast and advanced, document-based and key-value-based NoSQL database.

## Features

* NoSQL database
* Works as soon as it is installed
* Can be used document-based or key-value-based
* Customizable settings for collections
* No schema needed
* Quick data reading and writing
* Data can be kept in cache
* Easy data lookup
* Automatic or manual backups

## Installation

NPM
```sh-session
npm install peakdb
```
Yarn
```sh-session
yarn add peakdb
```

## Links

* [Documentation](https://peakdb.paiode.cf/api-docs)
* [Change Log](CHANGELOG.md#change-log)

## License

[MIT](LICENSE.md)

--------------------------------------------------------------------------------
/main.js:
--------------------------------------------------------------------------------
const Collection = require("./src/Collection.js")
    , Debugger = require("./src/util/Debugger.js")
    , Package = require("./package.json")
    , axios = require("axios");
console.log(""), console.log(" \x1b[32m!? \x1b[36m:?: 7!"), console.log(" \x1b[32m?Y. \x1b[36m^Y^ J?"), console.log(" \x1b[32m.: .^~~^. .^~~^: .^^~~~^. ?Y. .^. \x1b[36m.^^^:.^Y^ J? :^^^:"), console.log(" \x1b[32m757!~^^!J7 .7J!^^~?J: !!~^^~7Y^ ?Y. ^7?~. \x1b[36m.7?~^^~!?Y^ JJ!~^^^7?^"), console.log(" \x1b[32m?5^ ^5! ?5: JY. ..:::YJ ?Y.^??^ \x1b[36m?J. !Y^ JJ. !Y:"), console.log(" \x1b[32m?5. YJ YY!7777!?7. !?7!~~~YJ ?Y?5~ \x1b[36m.J7 ^Y^ J? :Y~"), console.log(" \x1b[32m?5: :57 ?Y. 75: :5J ?Y.^??^ \x1b[36m?J ~Y^ JJ ~Y^"), console.log(" \x1b[32m75?~:::~J?. .?J~::::^. ~57::^!7YJ ?5. ^??^ \x1b[36m.??^::^~?Y^ JJ!^:::!J~"), console.log(" \x1b[32m?5::~!!~: :~!!!~~. :~!!~^ ^^ :^ :~. \x1b[36m:~~~^:.^. 
^:.^~~~^."), console.log(" \x1b[32m?5."), console.log(" \x1b[32m!J. \x1b[33mv" + Package.version), console.log(""), setTimeout(() => { 6 | axios.get("https://cdn.jsdelivr.net/npm/" + Package.name + "/package.json") 7 | .then(o => { 8 | axios.get("https://raw.githubusercontent.com/" + o.data.repository.url.split("/")[3] + "/" + o.data.repository.url.split("/")[4].split(".git") 9 | .join("") + "/main/package.json") 10 | .then(o => { 11 | o.data.version.split(".") 12 | .join("") > Package.version.split(".") 13 | .join("") && Debugger.log("\x1b[33mNEW VERSION AVAILABLE! \x1b[32mYou are currently using version \x1b[35mv" + Package.version + "\x1b[32m. New version \x1b[35mv" + o.data.version + "\x1b[32m is available. You can update it using this command: \x1b[35mnpm install " + o.data.name + "@^" + o.data.version) 14 | }) 15 | .catch(() => { 16 | Debugger.error("An error occurred while checking for updates.") 17 | }) 18 | }) 19 | .catch(() => { 20 | Debugger.error("An error occurred while checking for updates.") 21 | }) 22 | }, 5e3), module.exports.Collection = Collection; -------------------------------------------------------------------------------- /package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "peakdb", 3 | "author": "keift", 4 | "version": "2.3.7", 5 | "description": "Fast and advanced, document based and key-value based NoSQL database that able to work as it is installed.", 6 | "license": "MIT", 7 | "main": "main.js", 8 | "dependencies": { 9 | "axios": "^0.26.1", 10 | "better-sqlite3": "^10.0.0", 11 | "bson": "^6.2.0", 12 | "find-remove": "^4.0.4", 13 | "fs": "^0.0.1-security", 14 | "lodash": "^4.17.21", 15 | "nanoid": "^3.3.7", 16 | "path": "^0.12.7" 17 | }, 18 | "engines": { 19 | "node": ">= 12.0.0" 20 | }, 21 | "homepage": "https://peakdb.gitbook.io", 22 | "repository": { 23 | "type": "git", 24 | "url": "git+https://github.com/keift/peakdb.git" 25 | }, 26 | "keywords": [ 27 | "peak", 28 | 
"peakdb", 29 | "database", 30 | "db", 31 | "sql", 32 | "nosql", 33 | "document", 34 | "key-value" 35 | ] 36 | } 37 | -------------------------------------------------------------------------------- /src/Collection.js: -------------------------------------------------------------------------------- 1 | const DocumentBasedCollection = require("./structures/DocumentBasedCollection.js") 2 | , KeyValueBasedCollection = require("./structures/KeyValueBasedCollection.js") 3 | , DataManager = require("./managers/DataManager.js") 4 | , { 5 | rebuildCollectionName: rebuildCollectionName 6 | , zeroBeforeNumber: zeroBeforeNumber 7 | } = require("./util/Tools.js") 8 | , Debugger = require("./util/Debugger.js") 9 | , sqlite3 = require("better-sqlite3") 10 | , bson = require("bson") 11 | , fs = require("fs") 12 | , find_remove = require("find-remove") 13 | , path = require("path"); 14 | class Collection { 15 | constructor(e) { 16 | if (!e || !e.name || "" === rebuildCollectionName(e.name)) return Debugger.error("Not specified: CollectionOptions.Name"); 17 | if (!e.type) return Debugger.error("Not specified: CollectionOptions.Type"); 18 | if (e.name && "string" != typeof e.name) return Debugger.error("Incorrect option type: CollectionOptions.Name"); 19 | if (e.type && "string" != typeof e.type) return Debugger.error("Incorrect option type: CollectionOptions.Type"); 20 | if ("DOCUMENT_BASED" === e.type) { 21 | if (e.id_length && "number" != typeof e.id_length) return Debugger.error("Incorrect option type: CollectionOptions.Id_Length"); 22 | if (e.indicate_created_at && "boolean" != typeof e.indicate_created_at) return Debugger.error("Incorrect option type: CollectionOptions.Indicate_Created_At"); 23 | if (e.indicate_created_timestamp && "boolean" != typeof e.indicate_created_timestamp) return Debugger.error("Incorrect option type: CollectionOptions.Indicate_Created_Timestamp"); 24 | if (e.indicate_updated_at && "boolean" != typeof e.indicate_updated_at) return 
Debugger.error("Incorrect option type: CollectionOptions.Indicate_Updated_At"); 25 | if (e.indicate_updated_timestamp && "boolean" != typeof e.indicate_updated_timestamp) return Debugger.error("Incorrect option type: CollectionOptions.Indicate_Updated_Timestamp"); 26 | if (e.indicate_archived_at && "boolean" != typeof e.indicate_archived_at) return Debugger.error("Incorrect option type: CollectionOptions.Indicate_Archived_At"); 27 | if (e.indicate_archived_timestamp && "boolean" != typeof e.indicate_archived_timestamp) return Debugger.error("Incorrect option type: CollectionOptions.Indicate_Archived_Timestamp"); 28 | if (e.indicate_unarchived_at && "boolean" != typeof e.indicate_unarchived_at) return Debugger.error("Incorrect option type: CollectionOptions.Indicate_Unarchived_At"); 29 | if (e.indicate_unarchived_timestamp && "boolean" != typeof e.indicate_unarchived_timestamp) return Debugger.error("Incorrect option type: CollectionOptions.Indicate_Unarchived_Timestamp") 30 | } 31 | if (e.save_timeout && "number" != typeof e.save_timeout) return Debugger.error("Incorrect option type: CollectionOptions.Save_Timeout"); 32 | if (e.save_directly_after && "number" != typeof e.save_directly_after) return Debugger.error("Incorrect option type: CollectionOptions.Save_Directly_After"); 33 | if (e.cache_retention_time && "number" != typeof e.cache_retention_time) return Debugger.error("Incorrect option type: CollectionOptions.Cache_Retention_Time"); 34 | if (e.backup_retention_time && "number" != typeof e.backup_retention_time) return Debugger.error("Incorrect option type: CollectionOptions.Backup_Retention_Time"); 35 | if (e.caching && "boolean" != typeof e.caching) return Debugger.error("Incorrect option type: CollectionOptions.Caching"); 36 | if (e.auto_create_backup && "boolean" != typeof e.auto_create_backup) return Debugger.error("Incorrect option type: CollectionOptions.Auto_Create_Backup"); 37 | if (e.detailed_debugger_logs && "boolean" != typeof 
e.detailed_debugger_logs) return Debugger.error("Incorrect option type: CollectionOptions.Detailed_Debugger_Logs"); 38 | if (e.type && "DOCUMENT_BASED" !== e.type && "KEY_VALUE_BASED" !== e.type) return Debugger.error('Valid option value: CollectionOptions.Type === "DOCUMENT_BASED" || "KEY_VALUE_BASED"'); 39 | if ("DOCUMENT_BASED" === e.type) { 40 | if (e.id_length && e.id_length < 4) return Debugger.error("Valid option value: CollectionOptions.Id_Length >= 4"); 41 | if (e.id_length && e.id_length > 256) return Debugger.error("Valid option value: CollectionOptions.Id_Length <= 256") 42 | } 43 | return e.save_timeout && e.save_timeout < 1 ? Debugger.error("Valid option value: CollectionOptions.Save_Timeout >= 1") : e.save_timeout && e.save_timeout > 25 ? Debugger.error("Valid option value: CollectionOptions.Save_Timeout <= 25") : e.save_directly_after && e.save_directly_after < 1 ? Debugger.error("Valid option value: CollectionOptions.Save_Directly_After >= 1") : e.save_directly_after && e.save_directly_after > 500 ? Debugger.error("Valid option value: CollectionOptions.Save_Directly_After <= 500") : e.cache_retention_time && e.cache_retention_time < -1 ? Debugger.error("Valid option value: CollectionOptions.Cache_Retention_Time >= -1") : e.cache_retention_time && e.cache_retention_time > 20160 ? Debugger.error("Valid option value: CollectionOptions.Cache_Retention_Time <= 20160") : e.backup_retention_time && e.backup_retention_time < -1 ? Debugger.error("Valid option value: CollectionOptions.Backup_Retention_Time >= -1") : e.backup_retention_time && e.backup_retention_time > 365 ? 
Debugger.error("Valid option value: CollectionOptions.Backup_Retention_Time <= 365") : (e.name = rebuildCollectionName(e.name), Debugger.log("Starting collection '\x1b[35m" + e.name + "\x1b[32m'..."), fs.mkdirSync(path.join("./", "./peakdb/Collections"), { 44 | recursive: !0 45 | }, t => Debugger.error("\x1b[35m(Collection#" + e.name + "): \x1b[31mError creating folders.")), this._CollectionManager = new sqlite3("./peakdb/Collections/" + e.name + ".pea"), this._CollectionManager.prepare("CREATE TABLE IF NOT EXISTS peakdb (data)") 46 | .run(), "DOCUMENT_BASED" === e.type && this._CollectionManager.prepare("SELECT data FROM peakdb") 47 | .all()[0] && !1 === Array.isArray(bson.deserialize(this._CollectionManager.prepare("SELECT * FROM peakdb") 48 | .all()[0].data) 49 | .data) || "KEY_VALUE_BASED" === e.type && this._CollectionManager.prepare("SELECT data FROM peakdb") 50 | .all()[0] && !0 === Array.isArray(bson.deserialize(this._CollectionManager.prepare("SELECT * FROM peakdb") 51 | .all()[0].data) 52 | .data) ? Debugger.error("Collection type cannot be changed.") : ("DOCUMENT_BASED" !== e.type || this._CollectionManager.prepare("SELECT data FROM peakdb") 53 | .all()[0] || this._CollectionManager.prepare("INSERT INTO peakdb (data) VALUES (?)") 54 | .run(bson.serialize({ 55 | data: [] 56 | })), "KEY_VALUE_BASED" !== e.type || this._CollectionManager.prepare("SELECT data FROM peakdb") 57 | .all()[0] || this._CollectionManager.prepare("INSERT INTO peakdb (data) VALUES (?)") 58 | .run(bson.serialize({ 59 | data: {} 60 | })), this._DataManager = new DataManager.Manager(this._CollectionManager, e), Debugger.log("Collection '\x1b[35m" + e.name + "\x1b[32m' has been started."), !0 === e.caching && Debugger.log("\x1b[31mWARNING! \x1b[32mCaching is active, data are kept in cache. For these large collections, it means more RAM loss."), "DOCUMENT_BASED" === e.type ? 
(this._DocumentBasedCollection = new DocumentBasedCollection.Collection(this._DataManager, e), this.insert = (e => this._DocumentBasedCollection.insert(e)), this.find = ((e, t) => this._DocumentBasedCollection.find(e, t)), this.filter = ((e, t) => this._DocumentBasedCollection.filter(e, t)), this.has = ((e, t) => this._DocumentBasedCollection.has(e, t)), this.update = ((e, t) => this._DocumentBasedCollection.update(e, t)), this.archive = (e => this._DocumentBasedCollection.archive(e)), this.unarchive = (e => this._DocumentBasedCollection.unarchive(e)), this.delete = (e => this._DocumentBasedCollection.delete(e))) : "KEY_VALUE_BASED" === e.type && (this._KeyValueBasedCollection = new KeyValueBasedCollection.Collection(this._DataManager, e), this.set = ((e, t) => this._KeyValueBasedCollection.set(e, t)), this.get = (e => this._KeyValueBasedCollection.get(e)), this.push = ((e, t) => this._KeyValueBasedCollection.push(e, t)), this.remove = ((e, t) => this._KeyValueBasedCollection.remove(e, t)), this.find = ((e, t) => this._KeyValueBasedCollection.find(e, t)), this.filter = ((e, t) => this._KeyValueBasedCollection.filter(e, t)), this.has = ((e, t) => this._KeyValueBasedCollection.has(e, t)), this.increase = ((e, t) => this._KeyValueBasedCollection.increase(e, t)), this.reduce = ((e, t) => this._KeyValueBasedCollection.reduce(e, t)), this.delete = (e => this._KeyValueBasedCollection.delete(e))), this.createBackup = (() => { 61 | fs.mkdirSync(path.join("./", "./peakdb/Backups/Collections"), { 62 | recursive: !0 63 | }, t => (Debugger.error("\x1b[35m(Collection#" + e.name + "): \x1b[31mAn error occurred while creating the backup."), !1)); 64 | let t = e.name + "_" + (new Date) 65 | .getFullYear() + "-" + zeroBeforeNumber((new Date) 66 | .getMonth() + 1) + "-" + zeroBeforeNumber((new Date) 67 | .getDate()) + "_" + zeroBeforeNumber((new Date) 68 | .getHours()) + zeroBeforeNumber((new Date) 69 | .getMinutes()) + ".pea"; 70 | return !1 !== 
fs.existsSync("./peakdb/Backups/Collections/" + t) ? (Debugger.error("\x1b[35m(Collection#" + e.name + "): \x1b[31mA backup has already been created recently."), !1) : (this._CollectionManager.backup("./peakdb/Backups/Collections/" + t) 71 | .then(() => { 72 | Debugger.log("\x1b[35m(Collection#" + e.name + "): \x1b[32mCollection backup was created with filename '\x1b[35m" + t + "\x1b[32m'.") 73 | }) 74 | .catch(t => (Debugger.error("\x1b[35m(Collection#" + e.name + "): \x1b[31mAn error occurred while creating the backup."), !1)), t) 75 | }), this.loadBackup = (t => t ? "string" != typeof t ? (Debugger.error("\x1b[35m(Collection#" + e.name + "): \x1b[31mIncorrect parameter type: Collection.LoadBackup(Filename)"), !1) : (t.endsWith(".pea") || (t += ".pea"), !1 === fs.existsSync("./peakdb/Backups/Collections/" + t) ? (Debugger.error("\x1b[35m(Collection#" + e.name + "): \x1b[31mBackup with filename '\x1b[35m" + t + "\x1b[31m' not found."), !1) : (this._BackupCollectionManager = new sqlite3("./peakdb/Backups/Collections/" + t), "DOCUMENT_BASED" === e.type && this._BackupCollectionManager.prepare("SELECT data FROM peakdb") 76 | .all()[0] && !1 === Array.isArray(bson.deserialize(this._BackupCollectionManager.prepare("SELECT * FROM peakdb") 77 | .all()[0].data) 78 | .data) || "KEY_VALUE_BASED" === e.type && this._BackupCollectionManager.prepare("SELECT data FROM peakdb") 79 | .all()[0] && !0 === Array.isArray(bson.deserialize(this._BackupCollectionManager.prepare("SELECT * FROM peakdb") 80 | .all()[0].data) 81 | .data) ? 
Debugger.error("\x1b[35m(Collection#" + e.name + "): \x1b[31mThe type of this backup does not match the type of this collection.") : (this._BackupData = bson.deserialize(this._BackupCollectionManager.prepare("SELECT * FROM peakdb") 82 | .all()[0].data) 83 | .data, Debugger.log("\x1b[35m(Collection#" + e.name + "): \x1b[32mCollection successfully loaded from file with name '\x1b[35m" + t + "\x1b[32m'."), this._DataManager.set(this._BackupData), !0))) : (Debugger.error("\x1b[35m(Collection#" + e.name + "): \x1b[31mNot specified: Collection.LoadBackup(Filename)"), !1)), !0 === e.activate_destroy_function && (this.destroy = (() => (Debugger.log("\x1b[35m(Collection#" + e.name + "): \x1b[32mCollection has been destroyed."), "DOCUMENT_BASED" === e.type ? this._DataManager.set([]) : "KEY_VALUE_BASED" === e.type && this._DataManager.set({}), !0))), void(!0 === e.auto_create_backup && setInterval(() => { 84 | fs.mkdirSync(path.join("./", "./peakdb/Backups/Collections"), { 85 | recursive: !0 86 | }, t => Debugger.error("\x1b[35m(Collection#" + e.name + "): \x1b[31mAn error occurred while creating the backup.")), (e.backup_retention_time || 3) > -1 && find_remove("./peakdb/Backups/Collections", { 87 | age: { 88 | seconds: 86400 * (e.backup_retention_time || 3) 89 | } 90 | , extensions: [".pea"] 91 | }); 92 | let t = e.name + "_" + (new Date) 93 | .getFullYear() + "-" + zeroBeforeNumber((new Date) 94 | .getMonth() + 1) + "-" + zeroBeforeNumber((new Date) 95 | .getDate()) + "_AUTO.pea"; 96 | !0 !== fs.existsSync("./peakdb/Backups/Collections/" + t) && this._CollectionManager.backup("./peakdb/Backups/Collections/" + t) 97 | .then(() => { 98 | Debugger.log("\x1b[35m(Collection#" + e.name + "): \x1b[32mCollection backup was created with filename '\x1b[35m" + t + "\x1b[32m'.") 99 | }) 100 | .catch(t => { 101 | Debugger.error("\x1b[35m(Collection#" + e.name + "): \x1b[31mAn error occurred while creating the backup.") 102 | }) 103 | }, 6e4)))) 104 | } 105 | } 106 | module.exports = 
Collection; -------------------------------------------------------------------------------- /src/managers/DataManager.js: -------------------------------------------------------------------------------- 1 | const Debugger = require("../util/Debugger.js") 2 | , bson = require("bson"); 3 | class Manager { 4 | constructor(e, t) { 5 | this._set_count = 0, this._cache_timeout_function = (() => { 6 | clearTimeout(this._cache_timeout), this._cache_timeout = setTimeout(() => { 7 | delete this._cache, !0 === t.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + t.name + "): \x1b[32mCache is cleared because there is no activity for a long time.") 8 | }, 6e4 * (t.cache_retention_time || 10)) 9 | }), !0 === t.caching && (this._cache = bson.deserialize(e.prepare("SELECT * FROM peakdb") 10 | .all()[0].data) 11 | .data, t.cache_retention_time && t.cache_retention_time > -1 && this._cache_timeout_function()), this.get = (() => (!0 === t.caching && this._cache && t.cache_retention_time && t.cache_retention_time > -1 && this._cache_timeout_function(), !0 !== t.caching || this._cache || (this._cache = bson.deserialize(e.prepare("SELECT * FROM peakdb") 12 | .all()[0].data) 13 | .data, !0 === t.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + t.name + "): \x1b[32mCache started to be kept again."), t.cache_retention_time && t.cache_retention_time > -1 && this._cache_timeout_function()), this._cache || this._temporary_cache || bson.deserialize(e.prepare("SELECT * FROM peakdb") 14 | .all()[0].data) 15 | .data)), this.set = (a => (!0 === t.caching && this._cache && t.cache_retention_time && t.cache_retention_time > -1 && this._cache_timeout_function(), !0 !== t.caching || this._cache || (this._cache = a, !0 !== t.detailed_debugger_logs || this._cache || Debugger.log("\x1b[35m(Collection#" + t.name + "): \x1b[32mCache started to be kept again."), t.cache_retention_time && t.cache_retention_time > -1 && this._cache_timeout_function()), this._temporary_cache = a, 
this._set_count++, clearTimeout(this._save_timeout), this._set_count === (t.save_directly_after || 5) ? (e.prepare("UPDATE peakdb SET data = (?)") 16 | .run(bson.serialize({ 17 | data: this._temporary_cache 18 | })), delete this._temporary_cache, this._set_count = 0, !0 === t.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + t.name + "): \x1b[32mCollection has been saved.")) : this._save_timeout = setTimeout(() => { 19 | e.prepare("UPDATE peakdb SET data = (?)") 20 | .run(bson.serialize({ 21 | data: this._temporary_cache 22 | })), delete this._temporary_cache, this._set_count = 0, !0 === t.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + t.name + "): \x1b[32mCollection has been saved.") 23 | }, 1e3 * (t.save_timeout || 1)), !0)) 24 | } 25 | } 26 | module.exports.Manager = Manager; -------------------------------------------------------------------------------- /src/structures/DocumentBasedCollection.js: -------------------------------------------------------------------------------- 1 | const { 2 | isJSON: isJSON 3 | , separateNumber: separateNumber 4 | } = require("../util/Tools.js"), Debugger = require("../util/Debugger.js"), { 5 | find: find 6 | , filter: filter 7 | } = require("lodash"), { 8 | nanoid: nanoid 9 | } = require("nanoid"); 10 | class DocumentBasedCollection { 11 | constructor(e, t) { 12 | this.insert = (r => { 13 | if (!r) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mNot specified: Collection.Insert(Document)"); 14 | if (!1 === isJSON(r)) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mIncorrect parameter type: Collection.Insert(Document)"); 15 | let i = e.get() 16 | , a = r._id || nanoid(t.id_length || 32) 17 | , d = find(i, e => e._id === a) 18 | , n = { 19 | _id: a 20 | , _updated: !1 21 | , _archived: !1 22 | }; 23 | if (!d) return !0 === t.indicate_created_at && (n._created_at = new Date), !0 === t.indicate_created_timestamp && (n._created_timestamp = Date.now()), n = 
Object.assign(n, r), i.push(n), !0 === t.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + t.name + "): \x1b[32mDocument inserted with ID '\x1b[35m" + a + "\x1b[32m'."), e.set(i), n; 24 | !0 === t.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mDocument with this ID already exists: " + a) 25 | }), this.find = ((r, i) => { 26 | if (!r) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mNot specified: Collection.Find(Params, ...)"); 27 | if ("function" != typeof r && !1 === isJSON(r)) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mIncorrect parameter type: Collection.Find(Params, ...)"); 28 | if (i && !1 === isJSON(i)) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mIncorrect parameter type: Collection.Find(..., Options)"); 29 | if (i && i.archived && "boolean" != typeof i.archived) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mIncorrect option type: FindOptions.Archived"); 30 | let a = e.get(); 31 | i && !0 === i.archived || (a = filter(a, e => !0 !== e._archived)); 32 | let d = find(a, r); 33 | if (d) return !0 === t.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + t.name + "): \x1b[32mDocument with ID '\x1b[35m" + d._id + "\x1b[32m' found."), d; 34 | !0 === t.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mDocument not found.") 35 | }), this.filter = ((r, i) => { 36 | if (!r) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mNot specified: Collection.Filter(Params, ...)"); 37 | if ("function" != typeof r && !1 === isJSON(r)) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mIncorrect parameter type: Collection.Filter(Params, ...)"); 38 | if (i && !1 === isJSON(i)) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mIncorrect parameter type: Collection.Filter(..., Options)"); 39 | if (i && i.archived && "boolean" != typeof 
i.archived) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mIncorrect option type: FilterOptions.Archived"); 40 | let a = e.get(); 41 | i && !0 === i.archived || (a = filter(a, e => !0 !== e._archived)); 42 | let d = filter(a, r); 43 | if (0 !== d.length) return !0 === t.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + t.name + "): \x1b[32m\x1b[35m" + separateNumber(d.length) + "\x1b[32m documents found."), d; 44 | !0 === t.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mDocuments not found.") 45 | }), this.has = ((r, i) => { 46 | if (!r) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mNot specified: Collection.Has(Params, ...)"), !1; 47 | if ("function" != typeof r && !1 === isJSON(r)) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mIncorrect parameter type: Collection.Has(Params, ...)"), !1; 48 | if (i && !1 === isJSON(i)) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mIncorrect parameter type: Collection.Has(..., Options)"), !1; 49 | if (i && i.archived && "boolean" != typeof i.archived) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mIncorrect option type: HasOptions.Archived"), !1; 50 | let a = e.get(); 51 | return i && !0 === i.archived || (a = filter(a, e => !0 !== e._archived)), find(a, r) ? 
(!0 === t.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + t.name + "): \x1b[32mDocument found."), !0) : (!0 === t.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mDocument not found."), !1) 52 | }), this.update = ((r, i) => { 53 | if (!r) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mNot specified: Collection.Update(Document_Id, ...)"); 54 | if (!i) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mNot specified: Collection.Update(..., Document)"); 55 | if (!1 === isJSON(i)) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mIncorrect parameter type: Collection.Update(..., Document)"); 56 | let a = e.get() 57 | , d = find(a, e => e._id === r); 58 | if (!d) return void(!0 === t.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mInvalid document ID: " + r)); 59 | let n = { 60 | _id: r 61 | , _updated: !0 62 | , _archived: d._archived || !1 63 | }; 64 | return d._created_at && (n._created_at = d._created_at), d._created_timestamp && (n._created_timestamp = d._created_timestamp), !0 === t.indicate_updated_at && (n._updated_at = new Date), !0 === t.indicate_updated_timestamp && (n._updated_timestamp = Date.now()), d._archived_at && (n._archived_at = d._archived_at), d._archived_timestamp && (n._archived_timestamp = d._archived_timestamp), d._unarchived_at && (n._unarchived_at = d._unarchived_at), d._unarchived_timestamp && (n._unarchived_timestamp = d._unarchived_timestamp), delete i._id, delete i._updated, delete i._archived, delete i._created_at, delete i._created_timestamp, delete i._updated_at, delete i._updated_timestamp, delete i._archived_at, delete i._archived_timestamp, delete i._unarchived_at, delete i._unarchived_timestamp, n = Object.assign(n, i), a[a.indexOf(d)] = n, !0 === t.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + t.name + "): \x1b[32mDocument with ID '\x1b[35m" + r + "\x1b[32m' has 
been updated."), e.set(a), n 65 | }), this.archive = (r => { 66 | if (!r) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mNot specified: Collection.Archive(Document_Id)"), !1; 67 | let i = e.get() 68 | , a = find(i, e => e._id === r); 69 | if (!a) return !0 === t.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mInvalid document ID: " + r), !1; 70 | if (!0 === a._archived) return !0 === t.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mDocument with ID '\x1b[35m" + r + "\x1b[31m' is already archived."), !1; 71 | let d = { 72 | _id: r 73 | , _updated: a._updated || !1 74 | , _archived: !0 75 | }; 76 | return a._created_at && (d._created_at = a._created_at), a._created_timestamp && (d._created_timestamp = a._created_timestamp), a._updated_at && (d._updated_at = a._updated_at), a._updated_timestamp && (d._updated_timestamp = a._updated_timestamp), !0 === t.indicate_archived_at && (d._archived_at = new Date), !0 === t.indicate_archived_timestamp && (d._archived_timestamp = Date.now()), a._unarchived_at && (d._unarchived_at = a._unarchived_at), a._unarchived_timestamp && (d._unarchived_timestamp = a._unarchived_timestamp), delete a._id, delete a._updated, delete a._archived, delete a._created_at, delete a._created_timestamp, delete a._updated_at, delete a._updated_timestamp, delete a._archived_at, delete a._archived_timestamp, delete a._unarchived_at, delete a._unarchived_timestamp, d = Object.assign(d, a), i[i.indexOf(a)] = d, !0 === t.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + t.name + "): \x1b[32mDocument with ID '\x1b[35m" + r + "\x1b[32m' has been archived."), e.set(i), !0 77 | }), this.unarchive = (r => { 78 | if (!r) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mNot specified: Collection.Unarchive(Document_Id)"), !1; 79 | let i = e.get() 80 | , a = find(i, e => e._id === r); 81 | if (!a) return !0 === t.detailed_debugger_logs 
&& Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mInvalid document ID: " + r), !1; 82 | if (!1 === a._archived) return !0 === t.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mDocument with ID '\x1b[35m" + r + "\x1b[31m' is not archived."), !1; 83 | let d = { 84 | _id: r 85 | , _updated: a._updated || !1 86 | , _archived: !1 87 | }; 88 | return a._created_at && (d._created_at = a._created_at), a._created_timestamp && (d._created_timestamp = a._created_timestamp), a._updated_at && (d._updated_at = a._updated_at), a._updated_timestamp && (d._updated_timestamp = a._updated_timestamp), a._archived_at && (d._archived_at = a._archived_at), a._archived_timestamp && (d._archived_timestamp = a._archived_timestamp), !0 === t.indicate_unarchived_at && (d._unarchived_at = new Date), !0 === t.indicate_unarchived_timestamp && (d._unarchived_timestamp = Date.now()), delete a._id, delete a._updated, delete a._archived, delete a._created_at, delete a._created_timestamp, delete a._updated_at, delete a._updated_timestamp, delete a._archived_at, delete a._archived_timestamp, delete a._unarchived_at, delete a._unarchived_timestamp, d = Object.assign(d, a), i[i.indexOf(a)] = d, !0 === t.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + t.name + "): \x1b[32mDocument with ID '\x1b[35m" + r + "\x1b[32m' has been unarchived."), e.set(i), !0 89 | }), this.delete = (r => { 90 | if (!r) return Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mNot specified: Collection.Delete(Document_Id)"), !1; 91 | let i = e.get() 92 | , a = find(i, e => e._id === r); 93 | return a ? 
(i.splice(i.indexOf(a), 1), !0 === t.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + t.name + "): \x1b[32mDocument with ID '\x1b[35m" + r + "\x1b[32m' has been deleted."), e.set(i), !0) : (!0 === t.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + t.name + "): \x1b[31mInvalid document ID: " + r), !1) 94 | }) 95 | } 96 | } 97 | module.exports.Collection = DocumentBasedCollection; -------------------------------------------------------------------------------- /src/structures/KeyValueBasedCollection.js: -------------------------------------------------------------------------------- 1 | const { 2 | isJSON: isJSON 3 | , separateNumber: separateNumber 4 | } = require("../util/Tools.js"), Debugger = require("../util/Debugger.js"), { 5 | set: set 6 | , get: get 7 | , find: find 8 | , filter: filter 9 | , unset: unset 10 | } = require("lodash"); 11 | class KeyValueBasedCollection { 12 | constructor(e, r) { 13 | this.set = ((t, o) => { 14 | if (!t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Set(Key, ...)"); 15 | if (void 0 === o) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Set(..., Value)"); 16 | if ("string" != typeof t && "number" != typeof t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Set(Key, ...)"); 17 | if ("string" != typeof o && "number" != typeof o && "object" != typeof o) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Set(..., Value)"); 18 | let n = e.get(); 19 | set(n, t, o); 20 | let l = get(n, t); 21 | return !0 === r.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + r.name + "): \x1b[32mValue with key '\x1b[35m" + t + "\x1b[32m' has been set."), e.set(n), l 22 | }), this.get = (t => { 23 | if (!t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: 
Collection.Get(Key)"); 24 | if ("string" != typeof t && "number" != typeof t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Get(Key)"); 25 | let o = e.get() 26 | , n = get(o, t); 27 | if (void 0 !== n) return !0 === r.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + r.name + "): \x1b[32mValue with key '\x1b[35m" + t + "\x1b[32m' found."), n; 28 | !0 === r.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mInvalid value key: " + t) 29 | }), this.push = ((t, o) => { 30 | if (!t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Push(Key, ...)"); 31 | if (void 0 === o) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Push(..., Data)"); 32 | if ("string" != typeof t && "number" != typeof t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Push(Key, ...)"); 33 | if ("string" != typeof o && "number" != typeof o && "object" != typeof o) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Push(..., Data)"); 34 | let n = e.get() 35 | , l = get(n, t); 36 | if (l && !1 === Array.isArray(l)) return void(!0 === r.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mValue with key '\x1b[35m" + t + "\x1b[31m' is not an array.")); 37 | let i = l || []; 38 | return i.push(o), set(n, t, i), l = get(n, t), !0 === r.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + r.name + "): \x1b[32mData pushed to key '\x1b[35m" + t + "\x1b[32m'."), e.set(n), l 39 | }), this.remove = ((t, o) => { 40 | if (!t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Remove(Key, ...)"); 41 | if (void 0 === o) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Remove(..., 
Data)"); 42 | if ("string" != typeof t && "number" != typeof t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Remove(Key, ...)"); 43 | if ("string" != typeof o && "number" != typeof o && "object" != typeof o) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Remove(..., Data)"); 44 | let n = e.get() 45 | , l = get(n, t); 46 | if (l && !1 === Array.isArray(l)) return void(!0 === r.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mValue with key '\x1b[35m" + t + "\x1b[31m' is not an array.")); 47 | let i = l || []; 48 | return i[i.indexOf(o)] ? (i.splice(i.indexOf(o), 1), set(n, t, i), l = get(n, t), !0 === r.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + r.name + "): \x1b[32mData in key '\x1b[35m" + t + "\x1b[32m' has been removed."), e.set(n), l) : (!0 === r.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mData in key '\x1b[35m" + t + "\x1b[31m' not found."), i) 49 | }), this.find = ((t, o) => { 50 | if (!t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Find(Key, ...)"); 51 | if (void 0 === o) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Find(..., Params)"); 52 | if ("string" != typeof t && "number" != typeof t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Find(Key, ...)"); 53 | if ("function" != typeof o && !1 === isJSON(o)) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Find(..., Params)"); 54 | let n = e.get() 55 | , l = get(n, t); 56 | if (!l || !1 !== Array.isArray(l)) return (l = find(l || [], o)) ? 
(!0 === r.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + r.name + "): \x1b[32mData in key '\x1b[35m" + t + "\x1b[32m' found."), l) : void(!0 === r.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mData in key '\x1b[35m" + t + "\x1b[31m' not found.")); 57 | !0 === r.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mValue with key '\x1b[35m" + t + "\x1b[31m' is not an array.") 58 | }), this.filter = ((t, o) => { 59 | if (!t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Filter(Key, ...)"); 60 | if (void 0 === o) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Filter(..., Params)"); 61 | if ("string" != typeof t && "number" != typeof t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Filter(Key, ...)"); 62 | if ("function" != typeof o && !1 === isJSON(o)) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Filter(..., Params)"); 63 | let n = e.get() 64 | , l = get(n, t); 65 | if (!l || !1 !== Array.isArray(l)) return 0 !== (l = filter(l || [], o)) 66 | .length ? 
(!0 === r.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + r.name + "): \x1b[32m\x1b[35m" + separateNumber(l.length) + "\x1b[32m data in key '\x1b[35m" + t + "\x1b[32m' found."), l) : void(!0 === r.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mData in key '\x1b[35m" + t + "\x1b[31m' not found.")); 67 | !0 === r.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mValue with key '\x1b[35m" + t + "\x1b[31m' is not an array.") 68 | }), this.has = ((t, o) => { 69 | if (!t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Has(Key, ...)"), !1; 70 | if ("string" != typeof t && "number" != typeof t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Has(Key, ...)"), !1; 71 | if (!o || "function" == typeof o || "object" == typeof o || "string" == typeof o || "number" == typeof o) { 72 | if (o) { 73 | let n = e.get() 74 | , l = get(n, t); 75 | if (l && !1 === Array.isArray(l)) return !0 === r.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mValue with key '\x1b[35m" + t + "\x1b[31m' is not an array."), !1; 76 | let i = l || []; 77 | return l = find(i, o), "function" != typeof o && "object" != typeof o && (l = void 0), l ? (!0 === r.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + r.name + "): \x1b[32mData in key '\x1b[35m" + t + "\x1b[32m' found."), !0) : i[i.indexOf(o)] ? (!0 === r.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + r.name + "): \x1b[32mData in key '\x1b[35m" + t + "\x1b[32m' found."), !0) : (!0 === r.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mData in key '\x1b[35m" + t + "\x1b[31m' not found."), !1) 78 | } { 79 | let o = e.get(); 80 | return void 0 === get(o, t) ? 
(!0 === r.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mValue with key '\x1b[35m" + t + "\x1b[31m' not found."), !1) : (!0 === r.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + r.name + "): \x1b[32mValue with key '\x1b[35m" + t + "\x1b[32m' found."), !0) 81 | } 82 | } 83 | Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Has(..., Params)") 84 | }), this.increase = ((t, o) => { 85 | if (!t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Increase(Key, ...)"); 86 | if (void 0 === o) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Increase(..., Value)"); 87 | if ("string" != typeof t && "number" != typeof t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Increase(Key, ...)"); 88 | if (o = parseFloat(o), isNaN(o) || "number" != typeof o) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Increase(..., Value)"); 89 | let n = e.get() 90 | , l = get(n, t); 91 | if (l && "number" != typeof l) return void(!0 === r.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mValue with key '\x1b[35m" + t + "\x1b[31m' is not a number.")); 92 | let i = l || 0; 93 | return set(n, t, i += o), l = get(n, t), !0 === r.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + r.name + "): \x1b[32mValue with key '\x1b[35m" + t + "\x1b[32m' has been increased."), e.set(n), l 94 | }), this.decrease = ((t, o) => { 95 | if (!t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Decrease(Key, ...)"); 96 | if (void 0 === o) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Decrease(..., Value)"); 97 | if ("string" != typeof t && "number" != typeof t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): 
\x1b[31mIncorrect parameter type: Collection.Decrease(Key, ...)"); 98 | if (o = parseFloat(o), isNaN(o) || "number" != typeof o) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Decrease(..., Value)"); 99 | let n = e.get() 100 | , l = get(n, t); 101 | if (l && "number" != typeof l) return void(!0 === r.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mValue with key '\x1b[35m" + t + "\x1b[31m' is not a number.")); 102 | let i = l || 0; 103 | return set(n, t, i -= o), l = get(n, t), !0 === r.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + r.name + "): \x1b[32mValue with key '\x1b[35m" + t + "\x1b[32m' has been decreased."), e.set(n), l 104 | }), this.delete = (t => { 105 | if (!t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mNot specified: Collection.Delete(Key)"), !1; 106 | if ("string" != typeof t && "number" != typeof t) return Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mIncorrect parameter type: Collection.Delete(Key)"), !1; 107 | let o = e.get(); 108 | if (void 0 !== get(o, t)) return unset(o, t), !0 === r.detailed_debugger_logs && Debugger.log("\x1b[35m(Collection#" + r.name + "): \x1b[32mValue with key '\x1b[35m" + t + "\x1b[32m' has been deleted."), e.set(o), !0; 109 | !0 === r.detailed_debugger_logs && Debugger.error("\x1b[35m(Collection#" + r.name + "): \x1b[31mValue with key '\x1b[35m" + t + "\x1b[31m' not found.") 110 | }) 111 | } 112 | } 113 | module.exports.Collection = KeyValueBasedCollection; -------------------------------------------------------------------------------- /src/util/Debugger.js: -------------------------------------------------------------------------------- 1 | const Package = require("../../package.json"); 2 | module.exports.log = (e => console.log("\x1b[36m" + Package.name.split(".") 3 | .join("") + " \x1b[34m» \x1b[32m" + e + "\x1b[0m")), module.exports.error = (e => console.error("\x1b[36m" + Package.name.split(".") 4 | 
.join("") + " \x1b[34m» \x1b[31m" + e + "\x1b[0m")); -------------------------------------------------------------------------------- /src/util/Tools.js: -------------------------------------------------------------------------------- 1 | module.exports.rebuildCollectionName = (t => { 2 | let e = ""; 3 | t = t.trim() 4 | .toUpperCase() 5 | .split(" ") 6 | .join("_"); 7 | for (let r = 0; r < t.length; r++) { 8 | let o = t.charAt(r); 9 | "abcdefghijklmnoprstuvyzwxqABCDEFGHIJKLMNOPRSTUVYZWXQ1234567890_".includes(o) && (e += o) 10 | } 11 | return e.substr(0, 64) 12 | }), module.exports.isJSON = (t => { 13 | if ("object" == typeof t && null !== t) return !0; 14 | try { 15 | JSON.parse(t) 16 | } catch (t) { 17 | return !1 18 | } 19 | return !0 20 | }), module.exports.separateNumber = (t => { 21 | let e = t.toString() 22 | .split(".") 23 | .join(",") 24 | .split(",")[0].split("") 25 | .reverse() 26 | .join("") 27 | , r = ""; 28 | for (let t = 0; t < e.length; t++) r += t % 3 == 0 && 0 !== t ? "," + e.charAt(t) : e.charAt(t); 29 | return t.toString() 30 | .split(".") 31 | .join(",") 32 | .split(",")[1] ? r.split("") 33 | .reverse() 34 | .join("") + "." + t.toString() 35 | .split(".") 36 | .join(",") 37 | .split(",")[1] : r.split("") 38 | .reverse() 39 | .join("") 40 | }), module.exports.zeroBeforeNumber = ((t, e = !1) => (t = t.toString(), !0 === e ? ("00" === t && (t = "0"), "01" === t && (t = "1"), "02" === t && (t = "2"), "03" === t && (t = "3"), "04" === t && (t = "4"), "05" === t && (t = "5"), "06" === t && (t = "6"), "07" === t && (t = "7"), "08" === t && (t = "8"), "09" === t && (t = "9")) : ("0" === t && (t = "00"), "1" === t && (t = "01"), "2" === t && (t = "02"), "3" === t && (t = "03"), "4" === t && (t = "04"), "5" === t && (t = "05"), "6" === t && (t = "06"), "7" === t && (t = "07"), "8" === t && (t = "08"), "9" === t && (t = "09")), t)); --------------------------------------------------------------------------------
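Neither collection class above talks to SQLite directly: each constructor receives a store object (`e`) that must expose `get()` (return the whole data object) and `set(data)` (persist it), plus an options object (`t`/`r`) carrying `name`, `detailed_debugger_logs`, and similar flags. The sketch below illustrates that store contract in plain JavaScript. `pathSet`/`pathGet` are hypothetical, simplified stand-ins for lodash's `set`/`get`, and `kvSet`/`kvIncrease` mirror `KeyValueBasedCollection.set` and `.increase` in stripped-down form (no type checks, no debugger logging) — a minimal illustration of the pattern, not the library's actual API surface.

```javascript
// Hypothetical, simplified stand-ins for lodash's set/get over dot paths.
function pathSet(obj, path, value) {
  const keys = String(path).split(".");
  let node = obj;
  for (let i = 0; i < keys.length - 1; i++) {
    if (typeof node[keys[i]] !== "object" || node[keys[i]] === null) node[keys[i]] = {};
    node = node[keys[i]];
  }
  node[keys[keys.length - 1]] = value;
}

function pathGet(obj, path) {
  return String(path)
    .split(".")
    .reduce((node, key) => (node == null ? undefined : node[key]), obj);
}

// The store contract the Collection constructors expect:
// get() hands back the whole data object, set(next) persists it.
let data = {};
const store = {
  get: () => data,
  set: next => { data = next; }
};

// Stripped-down mirrors of KeyValueBasedCollection.set / .increase:
// take a snapshot, mutate it at the given path, hand it back to the store.
function kvSet(key, value) {
  const snapshot = store.get();
  pathSet(snapshot, key, value);
  store.set(snapshot);
  return pathGet(snapshot, key);
}

function kvIncrease(key, amount) {
  const snapshot = store.get();
  const next = (pathGet(snapshot, key) || 0) + amount; // missing keys start at 0
  pathSet(snapshot, key, next);
  store.set(snapshot);
  return pathGet(snapshot, key);
}

console.log(kvSet("user.name", "Ada"));    // "Ada"
console.log(kvIncrease("user.visits", 3)); // 3
console.log(kvIncrease("user.visits", 2)); // 5
```

This snapshot-mutate-`set` round trip is what lets `DataManager` batch writes: its `set()` only serializes the cache to SQLite after `save_directly_after` calls (default 5) or after the `save_timeout` debounce (default 1 second) elapses.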