├── .gitignore ├── CODE_OF_CONDUCT.md ├── CONTRIBUTING.md ├── LICENSE ├── LegacyReference ├── COBKS05.cob ├── COBKS05.cpy ├── COBPACK2.cob ├── COBPACK2.cpy ├── COBPACK3.cob ├── COBPACK3.cpy ├── COBVBFM2.cob ├── COBVBFM2.cpy ├── JCLCOBKS.jcl └── JCLPACK.jcl ├── NOTICE ├── README.md ├── bandit ├── 2021-08-28.txt ├── 2021-10-18.txt ├── 2021-11-09.txt ├── 2021-11-20.txt ├── 2021-11-29.txt ├── 2021-12-01.txt ├── 2021-12-06.txt ├── 2022-03-24.txt ├── 2022-04-30.txt ├── 2023-05-01.txt ├── 2023-05-07.txt └── 2024-08-07.txt ├── copybook.py ├── datasource.py ├── docs ├── 01-local-single-fb.md ├── 02-local-single-fb-s3-json.md ├── 03-local-single-fb-thread.md ├── 04-local-single-vb.md ├── 05-local-multi-fb.md ├── 06-local-multi-fb-s3-input.md ├── 07-local-multi-fb-s3-output.md ├── 08-local-multi-ddb.md ├── 09-lambda-multi-s3-output.md ├── 10-s3-lambda-obj-multi-fb.md ├── 11-local-multi-fb-discard.md ├── 99-README-v1.md └── 99-file-split-fb.md ├── ebcdic.py ├── extract_ebcdic_to_ascii.py ├── parse_copybook_to_json.py ├── sample-data ├── CLIENT-TEST.json ├── CLIENT-s3-obj.json ├── CLIENT-s3-rules.json ├── CLIENT-s3.json ├── CLIENT.ASCII-0.txt ├── CLIENT.ASCII-1.txt ├── CLIENT.ASCII-2.txt ├── CLIENT.ASCII-disc.txt ├── CLIENT.ASCII.txt ├── CLIENT.ASCII.txt.1 ├── CLIENT.ASCII.txt.2 ├── CLIENT.EBCDIC-0.txt ├── CLIENT.EBCDIC-1.txt ├── CLIENT.EBCDIC-2.txt ├── CLIENT.EBCDIC.txt ├── CLIENT.json ├── COBKS05-ddb-rules.json ├── COBKS05-ddb-s3.json ├── COBKS05-ddb.json ├── COBKS05-dict.json ├── COBKS05-list-2.json ├── COBKS05-list-s3-rules.json ├── COBKS05-list-s3.json ├── COBKS05-list.json ├── COBKS05-rules.json ├── COBKS05-split-local-err.json ├── COBKS05-split-local.json ├── COBKS05-split.json ├── COBPACK-TEST.json ├── COBPACK.ASCII.txt ├── COBPACK.ASCII.txt.1 ├── COBPACK.ASCII.txt.2 ├── COBPACK.OUTFILE.txt ├── COBPACK.json ├── COBPACK3.ASCII.txt ├── COBPACK3.EBCDIC.txt ├── COBVBFM2-list.json ├── COBVBFM2-split-local.json ├── COBVBFM2.ASCII.txt ├── COBVBFM2.EBCDIC-gt10.txt ├── COBVBFM2.EBCDIC-le10.txt ├── COBVBFM2.EBCDIC.txt ├── cobpack2-dict.json ├── cobpack2-list-ddb-s3.json ├── cobpack2-list-ddb.json ├── cobpack2-list.json ├── cobpack3-dict.json └── cobpack3-list.json ├── sample-json ├── CLIENT-s3-obj.json ├── COBKS05-ddb-rules.json ├── COBKS05-ddb.json ├── COBKS05-list-disc-rules.json ├── COBKS05-list-disc.json ├── COBKS05-list-rules.json ├── COBKS05-list-s3-out-rules.json ├── COBKS05-list-s3-out.json ├── COBKS05-list-s3-rules.json ├── COBKS05-list-s3.json ├── COBKS05-list.json ├── COBKS05-rules.json ├── COBVBFM2-list.json ├── cobpack2-dict.json ├── cobpack2-list-t.json └── cobpack2-list.json ├── split_ebcdic_file.py ├── src ├── bincompare.json ├── bincompare.py ├── core │ ├── cli.py │ ├── config.json │ ├── copybook.py │ ├── ebcdic.py │ ├── extract.py │ ├── filemeta.py │ ├── log.py │ └── parsecp.py ├── lambda_function.py ├── mdu.py └── split.py └── utils.py /.gitignore: -------------------------------------------------------------------------------- 1 | .DS_Store 2 | __pycache__/ 3 | test 4 | .vscode 5 | asciifile.txt 6 | test* 7 | s_* 8 | .mduvenv -------------------------------------------------------------------------------- /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | ## Code of Conduct 2 | This project has adopted the [Amazon Open Source Code of Conduct](https://aws.github.io/code-of-conduct). 
3 | For more information see the [Code of Conduct FAQ](https://aws.github.io/code-of-conduct-faq) or contact 4 | opensource-codeofconduct@amazon.com with any additional questions or comments. 5 | -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # Contributing Guidelines 2 | 3 | Thank you for your interest in contributing to our project. Whether it's a bug report, new feature, correction, or additional 4 | documentation, we greatly value feedback and contributions from our community. 5 | 6 | Please read through this document before submitting any issues or pull requests to ensure we have all the necessary 7 | information to effectively respond to your bug report or contribution. 8 | 9 | 10 | ## Reporting Bugs/Feature Requests 11 | 12 | We welcome you to use the GitHub issue tracker to report bugs or suggest features. 13 | 14 | When filing an issue, please check existing open, or recently closed, issues to make sure somebody else hasn't already 15 | reported the issue. Please try to include as much information as you can. Details like these are incredibly useful: 16 | 17 | * A reproducible test case or series of steps 18 | * The version of our code being used 19 | * Any modifications you've made relevant to the bug 20 | * Anything unusual about your environment or deployment 21 | 22 | 23 | ## Contributing via Pull Requests 24 | Contributions via pull requests are much appreciated. Before sending us a pull request, please ensure that: 25 | 26 | 1. You are working against the latest source on the *main* branch. 27 | 2. You check existing open, and recently merged, pull requests to make sure someone else hasn't addressed the problem already. 28 | 3. You open an issue to discuss any significant work - we would hate for your time to be wasted. 29 | 30 | To send us a pull request, please: 31 | 32 | 1. Fork the repository. 33 | 2. Modify the source; please focus on the specific change you are contributing. If you also reformat all the code, it will be hard for us to focus on your change. 34 | 3. Ensure local tests pass. 35 | 4. Commit to your fork using clear commit messages. 36 | 5. Send us a pull request, answering any default questions in the pull request interface. 37 | 6. Pay attention to any automated CI failures reported in the pull request, and stay involved in the conversation. 38 | 39 | GitHub provides additional document on [forking a repository](https://help.github.com/articles/fork-a-repo/) and 40 | [creating a pull request](https://help.github.com/articles/creating-a-pull-request/). 41 | 42 | 43 | ## Finding contributions to work on 44 | Looking at the existing issues is a great way to find something to contribute on. As our projects, by default, use the default GitHub issue labels (enhancement/bug/duplicate/help wanted/invalid/question/wontfix), looking at any 'help wanted' issues is a great place to start. 45 | 46 | 47 | ## Code of Conduct 48 | This project has adopted the [Amazon Open Source Code of Conduct](https://aws.github.io/code-of-conduct). 49 | For more information see the [Code of Conduct FAQ](https://aws.github.io/code-of-conduct-faq) or contact 50 | opensource-codeofconduct@amazon.com with any additional questions or comments. 
51 | 52 | 53 | ## Security issue notifications 54 | If you discover a potential security issue in this project we ask that you notify AWS/Amazon Security via our [vulnerability reporting page](http://aws.amazon.com/security/vulnerability-reporting/). Please do **not** create a public github issue. 55 | 56 | 57 | ## Licensing 58 | 59 | See the [LICENSE](LICENSE) file for our project's licensing. We will ask you to confirm the licensing of your contribution. 60 | -------------------------------------------------------------------------------- /LegacyReference/COBKS05.cpy: -------------------------------------------------------------------------------- 1 | *----------------------------------------------------------------- 2 | * Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 3 | * SPDX-License-Identifier: Apache-2.0 4 | *----------------------------------------------------------------- 5 | 01 REC-CLIENT. 00039300 6 | 03 CLIENT-KEY. 00039400 7 | 05 CLIENT-ID PIC 9(009) COMP. 00039500 8 | 05 CLIENT-TYPE PIC 9(004) COMP. 00039500 9 | 03 CLIENT-MAIN. 00039400 10 | 05 CLIENT-NAME PIC X(030). 00039500 11 | 05 CLIENT-BDATE PIC X(010). 00039500 12 | 05 CLIENT-ED-LVL PIC X(010). 00039500 13 | 05 CLIENT-INCOME PIC 9(007)V99 COMP-3. 00039500 14 | 05 FILLER PIC X(439). 00039500 15 | 03 CLIENT-ADDRESS REDEFINES CLIENT-MAIN. 16 | 05 CLIENT-ADDR-NUMBER PIC 9(009) COMP. 00039500 17 | 05 CLIENT-ADDR-STREET PIC X(040). 00039500 18 | 05 FILLER PIC X(450). 00039500 19 | 03 CLIENT-HEADER REDEFINES CLIENT-MAIN. 20 | 05 CLIENT-RECORD-COUNT PIC 9(009) COMP. 00039500 21 | 05 FILLER PIC X(490). 00039500 -------------------------------------------------------------------------------- /LegacyReference/COBPACK2.cpy: -------------------------------------------------------------------------------- 1 | *----------------------------------------------------------------- 2 | * Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 3 | * SPDX-License-Identifier: Apache-2.0 4 | *----------------------------------------------------------------- 5 | 01 REC-OUTFILE. 00039300 6 | 03 OUTFILE-TEXT PIC -9(18). 00039400 7 | 03 OUTFILE-UNPACKED PIC 9(18). 00039400 8 | 03 OUTFILE-UNPACKED-S PIC S9(18). 00039400 9 | 03 BINARY-FIELDS. 00039400 10 | 05 OUTFILE-COMP-04 PIC 9(04) COMP. 00039500 11 | 05 OUTFILE-COMP-04-S PIC S9(04) COMP. 00039500 12 | 05 OUTFILE-COMP-09 PIC 9(09) COMP. 00039500 13 | 05 OUTFILE-COMP-09-S PIC S9(09) COMP. 00039500 14 | 05 OUTFILE-COMP-18 PIC 9(18) COMP. 00039500 15 | 05 OUTFILE-COMP-18-S PIC S9(18) COMP. 00039500 16 | 03 PACKED-DECIMAL-FIELDS. 00039400 17 | 05 OUTFILE-COMP3-04 PIC 9(04) COMP-3. 00039500 18 | 05 OUTFILE-COMP3-04-S PIC S9(04) COMP-3. 00039500 19 | 05 OUTFILE-COMP3-09 PIC 9(09) COMP-3. 00039500 20 | 05 OUTFILE-COMP3-09-S PIC S9(09) COMP-3. 00039500 21 | 05 OUTFILE-COMP3-18 PIC 9(18) COMP-3. 00039500 22 | 05 OUTFILE-COMP3-18-S PIC S9(18) COMP-3. 00039500 23 | 03 GROUP1. 24 | 05 GROUP1-1 OCCURS 2 TIMES. 00039500 25 | 07 TEXT1 PIC X(01). 00039500 26 | 03 GROUP2 REDEFINES GROUP1. 27 | 05 TEXT2 PIC X(02). 00039500 28 | 03 FILLER PIC X(29). -------------------------------------------------------------------------------- /LegacyReference/COBPACK3.cpy: -------------------------------------------------------------------------------- 1 | *----------------------------------------------------------------- 2 | * Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 
3 | * SPDX-License-Identifier: Apache-2.0 4 | *----------------------------------------------------------------- 5 | 01 REC-OUTFILE. 00039300 6 | BLANKLNE 7 | 03 OUTFILE-TEXT PIC -9(18). 00039400 8 | 03 OUTFILE-UNPACKED PIC 9(18). 00039400 9 | 03 OUTFILE-UNPACKED-S PIC S9(18). 00039400 10 | 03 BINARY-FIELDS. 00039400 11 | 05 OUTFILE-COMP-01 PIC 9(01) COMP. 00039500 12 | 05 OUTFILE-COMP-01-S PIC S9(01) COMP. 00039500 13 | 05 OUTFILE-COMP-02 PIC 9(02) COMP. 00039500 14 | 05 OUTFILE-COMP-02-S PIC S9(02) COMP. 00039500 15 | 05 OUTFILE-COMP-03 PIC 9(03) COMP. 00039500 16 | 05 OUTFILE-COMP-03-S PIC S9(03) COMP. 00039500 17 | 05 OUTFILE-COMP-04 PIC 9(04) COMP. 00039500 18 | 05 OUTFILE-COMP-04-S PIC S9(04) COMP. 00039500 19 | 05 OUTFILE-COMP-05 PIC 9(05) COMP. 00039500 20 | 05 OUTFILE-COMP-05-S PIC S9(05) COMP. 00039500 21 | 05 OUTFILE-COMP-06 PIC 9(06) COMP. 00039500 22 | 05 OUTFILE-COMP-06-S PIC S9(06) COMP. 00039500 23 | 05 OUTFILE-COMP-07 PIC 9(07) COMP. 00039500 24 | 05 OUTFILE-COMP-07-S PIC S9(07) COMP. 00039500 25 | 05 OUTFILE-COMP-08 PIC 9(08) COMP. 00039500 26 | 05 OUTFILE-COMP-08-S PIC S9(08) COMP. 00039500 27 | 05 OUTFILE-COMP-09 PIC 9(09) COMP. 00039500 28 | 05 OUTFILE-COMP-09-S PIC S9(09) COMP. 00039500 29 | 05 OUTFILE-COMP-18 PIC 9(18) COMP. 00039500 30 | 05 OUTFILE-COMP-18-S PIC S9(18) COMP. 00039500 31 | SKIP1 32 | 03 PACKED-DECIMAL-FIELDS. 00039400 33 | 05 OUTFILE-COMP3-01 PIC 9(01) COMP-3. 00039500 34 | 05 OUTFILE-COMP3-01-S PIC S9(01) COMP-3. 00039500 35 | 05 OUTFILE-COMP3-02 PIC 9(02) COMP-3. 00039500 36 | 05 OUTFILE-COMP3-02-S PIC S9(02) COMP-3. 00039500 37 | 05 OUTFILE-COMP3-03 PIC 9(03) COMP-3. 00039500 38 | 05 OUTFILE-COMP3-03-S PIC S9(03) COMP-3. 00039500 39 | 05 OUTFILE-COMP3-04 PIC 9(04) COMP-3. 00039500 40 | 05 OUTFILE-COMP3-04-S PIC S9(04) COMP-3. 00039500 41 | 05 OUTFILE-COMP3-05 PIC 9(05) COMP-3. 00039500 42 | 05 OUTFILE-COMP3-05-S PIC S9(05) COMP-3. 00039500 43 | 05 OUTFILE-COMP3-06 PIC 9(06) COMP-3. 00039500 44 | 05 OUTFILE-COMP3-06-S PIC S9(06) COMP-3. 00039500 45 | 05 OUTFILE-COMP3-07 PIC 9(07) COMP-3. 00039500 46 | 05 OUTFILE-COMP3-07-S PIC S9(07) COMP-3. 00039500 47 | 05 OUTFILE-COMP3-08 PIC 9(08) COMP-3. 00039500 48 | 05 OUTFILE-COMP3-08-S PIC S9(08) COMP-3. 00039500 49 | 05 OUTFILE-COMP3-09 PIC 9(09) COMP-3. 00039500 50 | 05 OUTFILE-COMP3-09-S PIC S9(09) COMP-3. 00039500 51 | 05 OUTFILE-COMP3-18 PIC 9(18) COMP-3. 00039500 52 | 05 OUTFILE-COMP3-18-S PIC S9(18) COMP-3. 00039500 53 | 03 GROUP1. 54 | 05 GROUP1-1 OCCURS 2 TIMES. 00039500 55 | 07 TEXT1 PIC X(01). 00039500 56 | 03 GROUP2 REDEFINES GROUP1. 57 | 05 TEXT2 PIC X(02). 00039500 58 | 03 FILLER PIC X(03). 00039500 -------------------------------------------------------------------------------- /LegacyReference/COBVBFM2.cpy: -------------------------------------------------------------------------------- 1 | *----------------------------------------------------------------- 2 | * Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 3 | * SPDX-License-Identifier: Apache-2.0 4 | *----------------------------------------------------------------- 5 | 01 OUT-RECORD. 6 | 03 OUT-KEY. 7 | 05 OUTK-TYPE PIC XX. 8 | 05 OUTK-SEQT PIC 99. 9 | 03 OUT-REC-CNT PIC S999 COMP-3. 10 | 03 OUT-REC OCCURS 1 TO 10 TIMES 11 | DEPENDING ON OUT-REC-CNT. 12 | 05 OUT-REC-NO PIC 9(09). 13 | 05 OUT-NAME PIC X(21). 
-------------------------------------------------------------------------------- /LegacyReference/JCLCOBKS.jcl: -------------------------------------------------------------------------------- 1 | //LGDCOBKS JOB ' ',LUISDAN,CLASS=A,MSGCLASS=A,NOTIFY=&SYSUID JOB03752 2 | //********************************************************************** 3 | //* Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 4 | //* SPDX-License-Identifier: Apache-2.0 5 | //********************************************************************** 6 | //DELETE EXEC PGM=IDCAMS 7 | //SYSPRINT DD SYSOUT=* 8 | //SYSOUT DD SYSOUT=* 9 | //SYSIN DD * 10 | DELETE LUISDAN.FLAT.CLIENT 11 | SET MAXCC=0 12 | SET LASTCC=0 13 | //********************************************************************** 14 | //COBKS05 EXEC PGM=COBKS05 15 | //STEPLIB DD DISP=SHR,DSN=LUISDAN.LOAD 16 | //INPUTF DD DISP=SHR,DSN=LUISDAN.SRC(INPUT2) 17 | //CLIENT DD DISP=SHR,DSN=LUISDAN.KSDS.CLIENT 18 | //SYSPRINT DD SYSOUT=* 19 | //SYSOUT DD SYSOUT=* 20 | //********************************************************************** 21 | //EXTRACT EXEC PGM=SORT 22 | //SORTIN DD DISP=SHR,DSN=LUISDAN.KSDS.CLIENT 23 | //SORTOUT DD DSN=LUISDAN.FLAT.CLIENT, 24 | // DISP=(,CATLG,DELETE), 25 | // SPACE=(CYL,(1,1),RLSE),UNIT=3390, 26 | // DCB=(RECFM=FB,LRECL=500) 27 | //SYSPRINT DD SYSOUT=* 28 | //SYSOUT DD SYSOUT=* 29 | //SYSIN DD * 30 | SORT FIELDS=COPY -------------------------------------------------------------------------------- /LegacyReference/JCLPACK.jcl: -------------------------------------------------------------------------------- 1 | //LUISDAN1 JOB (COBUNPACK),'DANTAS',CLASS=A,MSGCLASS=A, 2 | // TIME=1440,NOTIFY=&SYSUID 3 | //********************************************************************** 4 | //* Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 5 | //* SPDX-License-Identifier: Apache-2.0 6 | //********************************************************************** 7 | //COBPACK EXEC PGM=COBPACK2 8 | //STEPLIB DD DISP=SHR,DSN=LUISDAN.LOAD 9 | //SYSPRINT DD SYSOUT=* 10 | //UTPRINT DD SYSOUT=* 11 | //SYSOUT DD SYSOUT=* 12 | //OUTFILE DD DISP=(SHR,CATLG,CATLG), 13 | // DSN=LUISDAN.COBPACK.OUTFILE, 14 | // SPACE=(CYL,(1,1),RLSE) 15 | -------------------------------------------------------------------------------- /NOTICE: -------------------------------------------------------------------------------- 1 | Mainframe Data Utilities 2 | Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Mainframe Data Utilities v2 2 | 3 | | :exclamation: Looking for [Mainframe Data Utilities v1](docs/99-README-v1.md)? :exclamation: | 4 | |-----------------------------------------| 5 | # Mainframe Data Utilities 6 | 7 | Table of contents 8 | ================= 9 | * Security 10 | * License 11 | * About 12 | * Status 13 | * Requirements 14 | * Limitations 15 | * Download 16 | * Examples 17 | * Backlog 18 | 19 | ## Security 20 | 21 | See [CONTRIBUTING](CONTRIBUTING.md#security-issue-notifications) for more information. 22 | 23 | ## License 24 | 25 | This project is licensed under the Apache-2.0 License. 26 | 27 | ## About 28 | 29 | Mainframe Data Utilities is an AWS Sample written in Python. 
30 | 31 | The purpose of this project is to provide Python scripts as a starting point for those who need to read EBCDIC files transferred from mainframe and AS/400 platforms to AWS or any other distributed environment. 32 | 33 | ## Requirements 34 | 35 | - [Python](https://www.python.org/downloads/) 3.8 or above 36 | - [boto3](https://boto3.amazonaws.com/v1/documentation/api/latest/index.html) 37 | 38 | ## Limitations 39 | 40 | 1. File layouts defined inside Cobol programs are not supported. 41 | 2. The file's logical record length is calculated as the sum of all field sizes, so in some cases it may be smaller than the physical file definition. 42 | 3. The `REDEFINES` statement is only supported for **group items**; `REDEFINES` of **data items** is not supported yet. 43 | 44 | ## Download 45 | 46 | Download the code to run a preloaded example. 47 | 48 | From a Windows, Mac or Linux shell (including AWS CloudShell), clone this repo and change directory: 49 | 50 | ``` 51 | git clone https://github.com/aws-samples/mainframe-data-utilities.git mdu 52 | cd mdu 53 | ``` 54 | 55 | ## Examples 56 | 57 | The following examples show how to extract data in different use cases: 58 | 59 | |Document |Description| 60 | | - | - | 61 | |[Single Layout FB file](docs/01-local-single-fb.md) |The simplest conversion: a local, 'fixed blocked', 'single layout' input file.| 62 | |[Read JSON metadata from Amazon S3](docs/02-local-single-fb-s3-json.md)|The JSON metadata file is read from S3.| 63 | |[Single Layout FB file with threads](docs/03-local-single-fb-thread.md) |Convert a file using multithreading, generating multiple output files.| 64 | |[Single Layout VB file](docs/04-local-single-vb.md) |Convert a Variable Block input file.| 65 | |[Multiple Layout file](docs/05-local-multi-fb.md) |Convert a multiple layout input file.| 66 | |[Read the input file from S3](docs/06-local-multi-fb-s3-input.md) |Get the input file from S3 and generate a local converted file.| 67 | |[Write the output file on S3](docs/07-local-multi-fb-s3-output.md) |Read a local file and write a converted file to S3.| 68 | |[Write the output data on DynamoDB](docs/08-local-multi-ddb.md) |Read a local file and write its data to DynamoDB.| 69 | |[Convert files using a Lambda function](docs/09-lambda-multi-s3-output.md)|Use a Lambda function to read an EBCDIC file from S3 and write the converted file back to S3.| 70 | |[Convert files using S3 Object Lambda](docs/10-s3-lambda-obj-multi-fb.md) |Use an Object Lambda to convert an EBCDIC file while it's downloaded from S3.| 71 | |[Split files by content/key](docs/99-file-split-fb.md) |Split an EBCDIC file according to a provided key.| 72 | |[Discard specific layout](docs/11-local-multi-fb-discard.md) |Convert a multiple layout input file while discarding selected record types.| 73 | 74 | ## Backlog 75 | 76 | ### General 77 | - There are still some try / except blocks to be coded. 78 | - Test automation. 79 | - Code organization / refactoring. 80 | 81 | ### Copybook parser 82 | - OCCURS DEPENDING ON copybook parsing. 83 | - Data item REDEFINES. 84 | - Aurora schema parser (DDL). 85 | - Add similar packing statements (BINARY, PACKED-DECIMAL...). 86 | - Handle packing statement (COMP, COMP-3, etc.)
when declared before PIC statement 87 | 88 | ### Data conversion 89 | - Aurora data load 90 | -------------------------------------------------------------------------------- /bandit/2021-08-28.txt: -------------------------------------------------------------------------------- 1 | Run started:2021-08-28 21:12:28.511029 2 | 3 | Test results: 4 | No issues identified. 5 | 6 | Code scanned: 7 | Total lines of code: 168 8 | Total lines skipped (#nosec): 0 9 | 10 | Run metrics: 11 | Total issues (by severity): 12 | Undefined: 0.0 13 | Low: 0.0 14 | Medium: 0.0 15 | High: 0.0 16 | Total issues (by confidence): 17 | Undefined: 0.0 18 | Low: 0.0 19 | Medium: 0.0 20 | High: 0.0 21 | Files skipped (0): 22 | -------------------------------------------------------------------------------- /bandit/2021-10-18.txt: -------------------------------------------------------------------------------- 1 | Run started:2021-10-18 16:26:11.952880 2 | 3 | Test results: 4 | No issues identified. 5 | 6 | Code scanned: 7 | Total lines of code: 168 8 | Total lines skipped (#nosec): 0 9 | 10 | Run metrics: 11 | Total issues (by severity): 12 | Undefined: 0.0 13 | Low: 0.0 14 | Medium: 0.0 15 | High: 0.0 16 | Total issues (by confidence): 17 | Undefined: 0.0 18 | Low: 0.0 19 | Medium: 0.0 20 | High: 0.0 21 | Files skipped (0): 22 | -------------------------------------------------------------------------------- /bandit/2021-11-09.txt: -------------------------------------------------------------------------------- 1 | Run started:2021-11-09 21:02:05.862206 2 | 3 | Test results: 4 | No issues identified. 5 | 6 | Code scanned: 7 | Total lines of code: 170 8 | Total lines skipped (#nosec): 0 9 | 10 | Run metrics: 11 | Total issues (by severity): 12 | Undefined: 0.0 13 | Low: 0.0 14 | Medium: 0.0 15 | High: 0.0 16 | Total issues (by confidence): 17 | Undefined: 0.0 18 | Low: 0.0 19 | Medium: 0.0 20 | High: 0.0 21 | Files skipped (0): 22 | -------------------------------------------------------------------------------- /bandit/2021-11-20.txt: -------------------------------------------------------------------------------- 1 | Run started:2021-11-20 09:53:22.969163 2 | 3 | Test results: 4 | No issues identified. 5 | 6 | Code scanned: 7 | Total lines of code: 200 8 | Total lines skipped (#nosec): 0 9 | 10 | Run metrics: 11 | Total issues (by severity): 12 | Undefined: 0.0 13 | Low: 0.0 14 | Medium: 0.0 15 | High: 0.0 16 | Total issues (by confidence): 17 | Undefined: 0.0 18 | Low: 0.0 19 | Medium: 0.0 20 | High: 0.0 21 | Files skipped (0): 22 | -------------------------------------------------------------------------------- /bandit/2021-11-29.txt: -------------------------------------------------------------------------------- 1 | Run started:2021-11-29 21:48:39.881907 2 | 3 | Test results: 4 | No issues identified. 5 | 6 | Code scanned: 7 | Total lines of code: 406 8 | Total lines skipped (#nosec): 0 9 | 10 | Run metrics: 11 | Total issues (by severity): 12 | Undefined: 0.0 13 | Low: 0.0 14 | Medium: 0.0 15 | High: 0.0 16 | Total issues (by confidence): 17 | Undefined: 0.0 18 | Low: 0.0 19 | Medium: 0.0 20 | High: 0.0 21 | Files skipped (0): 22 | -------------------------------------------------------------------------------- /bandit/2021-12-01.txt: -------------------------------------------------------------------------------- 1 | Run started:2021-12-01 08:34:31.139610 2 | 3 | Test results: 4 | No issues identified. 
5 | 6 | Code scanned: 7 | Total lines of code: 449 8 | Total lines skipped (#nosec): 0 9 | 10 | Run metrics: 11 | Total issues (by severity): 12 | Undefined: 0.0 13 | Low: 0.0 14 | Medium: 0.0 15 | High: 0.0 16 | Total issues (by confidence): 17 | Undefined: 0.0 18 | Low: 0.0 19 | Medium: 0.0 20 | High: 0.0 21 | Files skipped (0): 22 | -------------------------------------------------------------------------------- /bandit/2021-12-06.txt: -------------------------------------------------------------------------------- 1 | Run started:2021-12-06 14:19:11.169066 2 | 3 | Test results: 4 | No issues identified. 5 | 6 | Code scanned: 7 | Total lines of code: 365 8 | Total lines skipped (#nosec): 0 9 | 10 | Run metrics: 11 | Total issues (by severity): 12 | Undefined: 0.0 13 | Low: 0.0 14 | Medium: 0.0 15 | High: 0.0 16 | Total issues (by confidence): 17 | Undefined: 0.0 18 | Low: 0.0 19 | Medium: 0.0 20 | High: 0.0 21 | Files skipped (0): 22 | -------------------------------------------------------------------------------- /bandit/2022-03-24.txt: -------------------------------------------------------------------------------- 1 | Run started:2022-03-24 23:24:24.517440 2 | 3 | Test results: 4 | No issues identified. 5 | 6 | Code scanned: 7 | Total lines of code: 814 8 | Total lines skipped (#nosec): 0 9 | 10 | Run metrics: 11 | Total issues (by severity): 12 | Undefined: 0.0 13 | Low: 0.0 14 | Medium: 0.0 15 | High: 0.0 16 | Total issues (by confidence): 17 | Undefined: 0.0 18 | Low: 0.0 19 | Medium: 0.0 20 | High: 0.0 21 | Files skipped (5): 22 | CODE_OF_CONDUCT.md (syntax error while parsing AST from file) 23 | CONTRIBUTING.md (syntax error while parsing AST from file) 24 | LICENSE (syntax error while parsing AST from file) 25 | NOTICE (syntax error while parsing AST from file) 26 | README.md (syntax error while parsing AST from file) 27 | -------------------------------------------------------------------------------- /bandit/2022-04-30.txt: -------------------------------------------------------------------------------- 1 | Run started:2022-04-30 13:13:38.150603 2 | 3 | Test results: 4 | No issues identified. 5 | 6 | Code scanned: 7 | Total lines of code: 451 8 | Total lines skipped (#nosec): 0 9 | 10 | Run metrics: 11 | Total issues (by severity): 12 | Undefined: 0.0 13 | Low: 0.0 14 | Medium: 0.0 15 | High: 0.0 16 | Total issues (by confidence): 17 | Undefined: 0.0 18 | Low: 0.0 19 | Medium: 0.0 20 | High: 0.0 21 | Files skipped (0): 22 | -------------------------------------------------------------------------------- /bandit/2023-05-01.txt: -------------------------------------------------------------------------------- 1 | Run started:2023-05-01 14:28:51.195477 2 | 3 | Test results: 4 | No issues identified. 5 | 6 | Code scanned: 7 | Total lines of code: 425 8 | Total lines skipped (#nosec): 0 9 | 10 | Run metrics: 11 | Total issues (by severity): 12 | Undefined: 0.0 13 | Low: 0.0 14 | Medium: 0.0 15 | High: 0.0 16 | Total issues (by confidence): 17 | Undefined: 0.0 18 | Low: 0.0 19 | Medium: 0.0 20 | High: 0.0 21 | Files skipped (0): 22 | -------------------------------------------------------------------------------- /bandit/2023-05-07.txt: -------------------------------------------------------------------------------- 1 | Run started:2023-05-07 22:35:55.088007 2 | 3 | Test results: 4 | No issues identified. 
5 | 6 | Code scanned: 7 | Total lines of code: 498 8 | Total lines skipped (#nosec): 0 9 | 10 | Run metrics: 11 | Total issues (by severity): 12 | Undefined: 0.0 13 | Low: 0.0 14 | Medium: 0.0 15 | High: 0.0 16 | Total issues (by confidence): 17 | Undefined: 0.0 18 | Low: 0.0 19 | Medium: 0.0 20 | High: 0.0 21 | Files skipped (0): 22 | -------------------------------------------------------------------------------- /bandit/2024-08-07.txt: -------------------------------------------------------------------------------- 1 | Run started:2024-08-07 15:11:38.167703 2 | 3 | Test results: 4 | No issues identified. 5 | 6 | Code scanned: 7 | Total lines of code: 561 8 | Total lines skipped (#nosec): 0 9 | Total potential issues skipped due to specifically being disabled (e.g., #nosec BXXX): 0 10 | 11 | Run metrics: 12 | Total issues (by severity): 13 | Undefined: 0 14 | Low: 0 15 | Medium: 0 16 | High: 0 17 | Total issues (by confidence): 18 | Undefined: 0 19 | Low: 0 20 | Medium: 0 21 | High: 0 22 | Files skipped (0): 23 | -------------------------------------------------------------------------------- /copybook.py: -------------------------------------------------------------------------------- 1 | # Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 2 | # SPDX-License-Identifier: Apache-2.0 3 | 4 | # FUNCTIONS TO HANDLE THE HIERARCHICAL STACK # 5 | def fGetSetack(): 6 | global stack, output 7 | tmp = output 8 | for k in stack: 9 | tmp = tmp[stack[k]] 10 | return tmp 11 | 12 | def fRemStack(iStack,iLevel): 13 | NewStack = {} 14 | for k in iStack: 15 | if k < iLevel: NewStack[k] = iStack[k] 16 | return NewStack 17 | 18 | #PIC 999, 9(3), XXX, X(3)... 19 | def getPicSize(arg): 20 | if arg.find("(") > 0: 21 | return int(arg[arg.find("(")+1:arg.find(")")]) 22 | else: 23 | return len(arg) 24 | 25 | # TYPE AND LENGTH CALCULATION # 26 | def getLenType(atr, p): 27 | ret = {} 28 | #FirstCh = atr[3][:1].upper() 29 | #Picture = atr[3].upper() 30 | FirstCh = atr[p][:1].upper() 31 | Picture = atr[p].upper() 32 | 33 | #data type 34 | if 'COMP-3'in atr and FirstCh=='S': ret['type'] = "pd+" 35 | elif 'COMP-3'in atr: ret['type'] = "pd" 36 | elif 'COMP' in atr and FirstCh=='S': ret['type'] = "bi+" 37 | elif 'COMP' in atr: ret['type'] = "bi" 38 | elif FirstCh=='S': ret['type'] = "zd+" 39 | elif FirstCh=='9': ret['type'] = "zd" 40 | else: ret['type'] = "ch" 41 | 42 | #Total data length 43 | PicNum = Picture.replace("V"," ").replace("S","").replace("-","").split() 44 | 45 | Lgt = getPicSize(PicNum[0]) 46 | 47 | if len(PicNum) == 1 and FirstCh !='V': 48 | ret['dplaces'] = 0 49 | elif FirstCh !='V': 50 | ret['dplaces'] = getPicSize(PicNum[1]) 51 | Lgt += ret['dplaces'] 52 | else: 53 | ret['dplaces'] = getPicSize(PicNum[0]) 54 | 55 | ret['length'] = Lgt 56 | 57 | #Data size in bytes 58 | if ret['type'][:2] == "pd": ret['bytes'] = int(Lgt/2)+1 59 | elif ret['type'][:2] == "bi": 60 | if Lgt < 5: ret['bytes'] = 2 61 | elif Lgt < 10: ret['bytes'] = 4 62 | else : ret['bytes'] = 8 63 | else: 64 | if FirstCh=='-': Lgt += 1 65 | ret['bytes'] = Lgt 66 | 67 | return ret 68 | 69 | ############# DICTIONARY AND HIERARCHICAL LOGIC ########################### 70 | def add2dict(lvl, grp, itm, stt, id): 71 | 72 | global cur, output, last, stack, FillerCount 73 | 74 | if itm.upper() == "FILLER": 75 | FillerCount += 1 76 | itm = itm + "_" + str(FillerCount) 77 | 78 | if lvl <= cur: stack = fRemStack(stack, lvl) 79 | 80 | stk = fGetSetack() 81 | stk[itm]= {} 82 | stk[itm]['id'] = id 83 | stk[itm]['level'] = lvl 84 | 
stk[itm]['group'] = grp 85 | 86 | if 'OCCURS' in stt: 87 | if 'TIMES' in stt: 88 | stk[itm]['occurs'] = int(stt[stt.index('TIMES')-1]) 89 | else: 90 | raise Exception('OCCURS WITHOU TIMES?' + ' '.join(stt)) 91 | 92 | if 'REDEFINES'in stt: stk[itm]['redefines'] = stt[stt.index('REDEFINES')+1] 93 | 94 | if grp == True: 95 | stack[lvl] = itm 96 | cur = lvl 97 | else: 98 | tplen = {} 99 | pic = stt.index('PIC')+1 100 | tplen = getLenType(stt, pic) 101 | #stk[itm]['pict'] = stt[3] 102 | stk[itm]['pict'] = stt[pic] 103 | stk[itm]['type'] = tplen['type'] 104 | stk[itm]['length'] = tplen['length'] 105 | stk[itm]['bytes'] = tplen['bytes'] 106 | stk[itm]['dplaces'] = tplen['dplaces'] 107 | 108 | ############################### MAIN ################################### 109 | # READS, CLEANS AND JOINS LINES # 110 | 111 | FillerCount=0 112 | cur=0 113 | output={} 114 | stack = {} 115 | 116 | def toDict(lines): 117 | 118 | id = 0 119 | stt = "" 120 | for line in lines: 121 | if len(line[6:72].strip()) > 1: 122 | if line[6] in [' ' , '-']: 123 | if not line[6:72].split()[0] in ['SKIP1','SKIP2','SKIP3']: 124 | stt += line[6:72].replace('\t', ' ') 125 | elif line[6] != '*': 126 | print('Unnexpected character in column 7:', line) 127 | quit() 128 | 129 | # READS FIELD BY FIELD / SPLITS ATTRIBUTES # 130 | for variable in stt.split("."): 131 | 132 | attribute=variable.split() 133 | 134 | if len(attribute) > 0: 135 | if attribute[0] != '88': 136 | id += 1 137 | add2dict(int(attribute[0]), False if 'PIC'in attribute else True, attribute[1], attribute, id) 138 | 139 | return output -------------------------------------------------------------------------------- /docs/01-local-single-fb.md: -------------------------------------------------------------------------------- 1 | # Mainframe Data Utilities V2 2 | 3 | ## Locally convert a single layout FB file 4 | 5 | This page shows how to convert an EBCDIC file which has been downloaded to a local directory. 6 | 7 | ### Parse the copybook 8 | 9 | Use the `parse` function to convert the copybook from Cobol to JSON representation. 10 | 11 | This sample converts the [COBPACK2.cpy](/LegacyReference/COBPACK2.cpy) copybook file provided in [LegacyReference](/LegacyReference). 12 | 13 | ``` 14 | python3 src/mdu.py parse \ 15 | LegacyReference/COBPACK2.cpy \ 16 | sample-json/cobpack2-list.json \ 17 | -input sample-data/COBPACK.OUTFILE.txt \ 18 | -output sample-data/COBPACK.ASCII.txt \ 19 | -print 10000 -verbose true 20 | ``` 21 | 22 | The [cobpack2-list.json](/sample-json/cobpack2-list.json) metadata will be generated at [sample-json](/sample-json) 23 | 24 | ### Convert the local file 25 | 26 | Run the `extract` function to convert the `COBPACK.OUTFILE.txt` EBCDIC file into an ASCII file. 27 | 28 | ``` 29 | python3 src/mdu.py extract sample-json/cobpack2-list.json 30 | ``` 31 | 32 | The generated ASCII file will match the provided [COBPACK.ASCII.txt](/sample-data/COBPACK.ASCII.txt). 33 | 34 | ### More use cases 35 | 36 | Check the [main page](/). 
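### What the generated metadata looks like

The `parse` step writes a plain JSON file that records the input/output locations, the record length and one entry per field (type, size in bytes, offset and decimal places). The snippet below is abridged from the bundled CLIENT samples (see [sample-data/CLIENT-s3.json](/sample-data/CLIENT-s3.json)) just to show the shape; the actual field list for COBPACK2 is produced by the parser itself.

```
{
  "input": "ebcdicfile.txt",
  "output": "asciifile.txt",
  "lrecl": 500,
  "transf": [
    { "type": "bi", "bytes": 4, "offset": 0, "dplaces": 0, "name": "CLIENT-ID", "part-key": false, "sort-key": false }
  ]
}
```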
-------------------------------------------------------------------------------- /docs/02-local-single-fb-s3-json.md: -------------------------------------------------------------------------------- 1 | # Mainframe Data Utilities V2 2 | 3 | ## Locally convert a single layout FB file, reading the JSON metadata from S3 4 | 5 | ### Create an S3 bucket and set a variable 6 | ``` 7 | export bucket=your-bucket-name 8 | ``` 9 | 10 | ### Upload the JSON metadata to S3 11 | 12 | Upload the [cobpack2-list.json](/sample-json/cobpack2-list.json) metadata created in the [Locally convert a single layout FB file](/docs/01-local-single-fb.md) procedure to your S3 bucket. 13 | 14 | ``` 15 | aws s3 cp sample-json/cobpack2-list.json s3://${bucket}/cobpack2-list.json 16 | ``` 17 | 18 | ### Convert the local file 19 | 20 | Run the `extract` function to convert the `COBPACK.OUTFILE.txt` EBCDIC file into an ASCII file. 21 | 22 | ``` 23 | python3 src/mdu.py extract cobpack2-list.json -json-s3 ${bucket} 24 | ``` 25 | 26 | The generated ASCII file will match the provided [COBPACK.ASCII.txt](/sample-data/COBPACK.ASCII.txt). 27 | 28 | ### More use cases 29 | 30 | Check the [main page](/). -------------------------------------------------------------------------------- /docs/03-local-single-fb-thread.md: -------------------------------------------------------------------------------- 1 | # Mainframe Data Utilities V2 2 | 3 | ## Locally convert a single layout FB file using multiple threads 4 | 5 | This page shows how to convert an EBCDIC file which has been downloaded to a local directory, using multithreading to generate multiple output files. 6 | 7 | ### Parse the copybook 8 | 9 | Use the `parse` function to convert the copybook from Cobol to JSON representation. 10 | 11 | This sample converts the [COBPACK2.cpy](/LegacyReference/COBPACK2.cpy) copybook file provided in [LegacyReference](/LegacyReference). 12 | 13 | ``` 14 | python3 src/mdu.py parse \ 15 | LegacyReference/COBPACK2.cpy \ 16 | sample-json/cobpack2-list-t.json \ 17 | -input sample-data/COBPACK.OUTFILE.txt \ 18 | -threads 2 \ 19 | -output sample-data/COBPACK.ASCII.txt \ 20 | -print 10000 -verbose true 21 | ``` 22 | 23 | The [cobpack2-list-t.json](/sample-json/cobpack2-list-t.json) metadata will be generated at [sample-json](/sample-json). 24 | 25 | ### Convert the local file 26 | 27 | Run the `extract` function to convert the `COBPACK.OUTFILE.txt` EBCDIC file into an ASCII file. 28 | 29 | ``` 30 | python3 src/mdu.py extract sample-json/cobpack2-list-t.json 31 | ``` 32 | 33 | The generated ASCII file will match the provided [COBPACK.ASCII.txt](/sample-data/COBPACK.ASCII.txt). 34 | 35 | ### More use cases 36 | 37 | Check the [main page](/). -------------------------------------------------------------------------------- /docs/04-local-single-vb.md: -------------------------------------------------------------------------------- 1 | # Mainframe Data Utilities V2 2 | 3 | ## VB record format files 4 | 5 | To convert a Variable Block file you need to inform the `-input-recfm vb` option when parsing the copybook.
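As background: in a VB (variable blocked) dataset each logical record carries its own length instead of a fixed `lrecl`, conventionally as a 4-byte Record Descriptor Word (RDW) in front of every record (a 2-byte big-endian length that includes the RDW itself, followed by two zero bytes); record boundaries can only be recovered on the distributed side if that length information survives the transfer. The sketch below only illustrates that convention — it is not part of the tool, and whether the bundled sample keeps its RDWs in exactly this form is an assumption made here for illustration:

```
import struct

# Illustrative only: walk an RDW-prefixed file record by record.
with open("sample-data/COBVBFM2.EBCDIC.txt", "rb") as f:
    while True:
        rdw = f.read(4)
        if len(rdw) < 4:
            break                               # end of file
        (rlen,) = struct.unpack(">H", rdw[:2])  # big-endian length, includes the 4-byte RDW
        record = f.read(rlen - 4)               # raw EBCDIC payload for this record
```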
Run the `parse` function to convert the copybook: 6 | ``` 7 | python3 src/mdu.py parse \ 8 | LegacyReference/COBVBFM2.cpy \ 9 | sample-json/COBVBFM2-list.json \ 10 | -input sample-data/COBVBFM2.EBCDIC.txt \ 11 | -output sample-data/COBVBFM2.ASCII.txt \ 12 | -input-recfm vb -verbose true 13 | ``` 14 | 15 | The [COBVBFM2-list.json](/sample-json/COBVBFM2-list.json) metadata will be generated at [sample-json](/sample-json) 16 | 17 | ### Convert the local file 18 | 19 | Run the `extract` function to convert the `/sample-data/COBVBFM2.EBCDIC.txt` EBCDIC file into an ASCII file. 20 | ``` 21 | python3 src/mdu.py extract sample-json/COBVBFM2-list.json 22 | ``` 23 | 24 | ### More use cases 25 | 26 | Check the [main page](/). -------------------------------------------------------------------------------- /docs/05-local-multi-fb.md: -------------------------------------------------------------------------------- 1 | # Mainframe Data Utilities V2 2 | 3 | ## Locally convert a multiple layout file 4 | 5 | There are often multiple layouts in mainframe VSAM or sequential (flat) files. It means that you need a different transformation rule depending on the row you are reading. 6 | 7 | The REDEFINES statement allows multiple layouts declaration in the COBOL language. 8 | 9 | ### Parse a multiple layout copybook 10 | 11 | The [COBKS05.cpy](/LegacyReference/COBKS05.cpy) is provided in the [LegacyReference](/LegacyReference/) folder as an example of a VSAM or a flat file copybook having three record layouts. The [CLIENT.EBCDIC.txt](sample-data/CLIENT.EBCDIC.txt) is the EBCDIC sample that can be converted through the following steps. 12 | 13 | Run the `src/mdu.py` script, using the `parse` function, to convert the copybook file provided in [LegacyReference](/LegacyReference) from Cobol to JSON representation: 14 | 15 | ``` 16 | python3 src/mdu.py parse \ 17 | LegacyReference/COBKS05.cpy \ 18 | sample-json/COBKS05-list.json \ 19 | -input sample-data/CLIENT.EBCDIC.txt \ 20 | -output sample-data/CLIENT.ASCII.txt \ 21 | -print 20 -verbose true 22 | ``` 23 | 24 | ### Add the transformation rules 25 | 26 | 2. The step above will generate the [COBKS05-list.json](/sample-json/COBKS05-list.json) with an empty transformation rules list: `"transf_rule"=[],`. Replace the transformation rule with the content bellow and save it: 27 | 28 | ``` 29 | "transf_rule": [ 30 | { 31 | "offset": 4, 32 | "size": 2, 33 | "hex": "0002", 34 | "transf": "transf1" 35 | }, 36 | { 37 | "offset": 4, 38 | "size": 2, 39 | "hex": "0000", 40 | "transf": "transf2" 41 | } 42 | ], 43 | ``` 44 | 45 | The parameters above will inform the `extract` function that records having "0002" hexadecimal value between its 5th and 6th bytes must be converted through the layout specified in "transf1" layout, whereas records that contain "0000" at the same position will be extracted with the "transf2" layout. 46 | 47 | The result of the change above must produce a file like [COBKS05-rules.json](/sample-json/COBKS05-rules.json). 48 | 49 | ### Extract a multiple layout file 50 | 51 | 3. Run the `src/mdu.py extract` fucntion to extract the `CLIENT.EBCDIC.txt` into an ASCII file. 52 | 53 | ``` 54 | python3 src/mdu.py extract sample-json/COBKS05-list.json 55 | ``` 56 | 57 | 4. Check the [CLIENT.ASCII.txt](/sample-data/CLIENT.ASCII.txt) file. 58 | 59 | ### More use cases 60 | 61 | Check the [main page](/). 
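### How a rule selects a layout

Each rule is a plain byte comparison: take `size` bytes starting at `offset` in the raw EBCDIC record and compare their hexadecimal value with `hex`; the matching rule names the layout (`transf1`, `transf2`, ...) used to convert that record. The sketch below only illustrates that logic — it is not the project's internal code, and the first-match / fallback behavior shown is an assumption:

```
def pick_layout(record: bytes, rules: list) -> str:
    # Compare the keyed bytes of the record against each rule; first match wins (assumption).
    for rule in rules:
        window = record[rule["offset"]: rule["offset"] + rule["size"]]
        if window.hex() == rule["hex"]:
            return rule["transf"]
    return "transf"  # assumption: the default layout applies when no rule matches

# A record whose 5th and 6th bytes are X'0002' resolves to "transf1".
```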
-------------------------------------------------------------------------------- /docs/06-local-multi-fb-s3-input.md: -------------------------------------------------------------------------------- 1 | # Mainframe Data Utilities V2 2 | 3 | ## Locally convert a multiple layout file 4 | 5 | ### Pre-requisites 6 | - An S3 bucket already created 7 | 8 | ### Create a variable for you bucket name 9 | ``` 10 | bucket=your-bucket-name 11 | ``` 12 | ### Upload the input file to S3 13 | 14 | ``` 15 | aws s3 cp sample-data/CLIENT.EBCDIC.txt s3://${bucket}/sample-data/ 16 | ``` 17 | ### Parse a multiple layout copybook 18 | 19 | Run the `src/mdu.py` script, using the `parse` function, to convert the copybook file provided in [LegacyReference](/LegacyReference) from Cobol to JSON representation: 20 | 21 | ``` 22 | python3 src/mdu.py parse \ 23 | LegacyReference/COBKS05.cpy \ 24 | sample-json/COBKS05-list-s3.json \ 25 | -input sample-data/CLIENT.EBCDIC.txt \ 26 | -input-s3 ${bucket} \ 27 | -output sample-data/CLIENT.ASCII.txt \ 28 | -print 20 -verbose true 29 | ``` 30 | 31 | ### Extract a multiple layout file 32 | 33 | 2. The step above will generate the [COBKS05-list-s3.json](/sample-json/COBKS05-list-s3.json) with an empty transformation rules list: `"transf_rule"=[],`. Replace the transformation rule with the content bellow and save it: 34 | 35 | ``` 36 | "transf_rule": [ 37 | { 38 | "offset": 4, 39 | "size": 2, 40 | "hex": "0002", 41 | "transf": "transf1" 42 | }, 43 | { 44 | "offset": 4, 45 | "size": 2, 46 | "hex": "0000", 47 | "transf": "transf2" 48 | } 49 | ], 50 | ``` 51 | 52 | The result of the change above must produce a file like [COBKS05-s3-rules.json](/sample-json/COBKS05-list-s3-rules.json). 53 | 54 | 3. Run the `src/mdu.py extract` fucntion to extract the `CLIENT.EBCDIC.txt` into an ASCII file. 55 | 56 | ``` 57 | python3 src/mdu.py extract sample-json/COBKS05-list-s3.json 58 | ``` 59 | 60 | 4. Check the [CLIENT.ASCII.txt](/sample-data/CLIENT.ASCII.txt) file. 61 | 62 | ### More use cases 63 | 64 | Check the [main page](/). -------------------------------------------------------------------------------- /docs/07-local-multi-fb-s3-output.md: -------------------------------------------------------------------------------- 1 | # Mainframe Data Utilities V2 2 | 3 | ## Locally convert a multiple layout file and upload to S3 4 | 5 | ### Pre-requisites 6 | - An S3 bucket already created 7 | 8 | ### Create a variable for you bucket name 9 | ``` 10 | bucket=your-bucket-name 11 | ``` 12 | 13 | ### Parse a multiple layout copybook 14 | 15 | Run the `src/mdu.py` script, using the `parse` function, to convert the copybook file provided in [LegacyReference](/LegacyReference) from Cobol to JSON representation. Use `-output-s3` to inform your bucket name: 16 | 17 | ``` 18 | python3 src/mdu.py parse \ 19 | LegacyReference/COBKS05.cpy \ 20 | sample-json/COBKS05-list-s3-out.json \ 21 | -input sample-data/CLIENT.EBCDIC.txt \ 22 | -output-s3 ${bucket} \ 23 | -output sample-data/CLIENT.ASCII.txt \ 24 | -threads 2 \ 25 | -print 20 -verbose true 26 | ``` 27 | 28 | ### Add the multiple layout conversion rules 29 | 30 | 2. The step above will generate the [COBKS05-list-s3-out.json](/sample-json/sample-json/COBKS05-list-s3-out.json) with an empty transformation rules list: `"transf_rule"=[],`. 
Replace the transformation rule with the content below and save it: 31 | 32 | ``` 33 | "transf_rule": [ 34 | { 35 | "offset": 4, 36 | "size": 2, 37 | "hex": "0002", 38 | "transf": "transf1" 39 | }, 40 | { 41 | "offset": 4, 42 | "size": 2, 43 | "hex": "0000", 44 | "transf": "transf2" 45 | } 46 | ], 47 | ``` 48 | 49 | The result of the change above must produce a file like [COBKS05-list-s3-out.json](/sample-json/COBKS05-list-s3-out.json). 50 | 51 | ### Extract a multiple layout file 52 | 53 | 3. Run the `src/mdu.py extract` function to extract the `CLIENT.EBCDIC.txt` into an ASCII file. 54 | 55 | ``` 56 | python3 src/mdu.py extract sample-json/COBKS05-list-s3-out.json 57 | ``` 58 | 59 | 4. Check the [CLIENT.ASCII.txt](/sample-data/CLIENT.ASCII.txt) file (and your S3 bucket content). 60 | 61 | ### More use cases 62 | 63 | Check the [main page](/). -------------------------------------------------------------------------------- /docs/08-local-multi-ddb.md: -------------------------------------------------------------------------------- 1 | # Mainframe Data Utilities V2 2 | 3 | ## Load a DynamoDB table from local disk 4 | 5 | ### Create the DynamoDB table 6 | 7 | 1. Create the DynamoDB table which will be loaded in the next steps. In this example we defined `CLIENT` as the table name, `CLIENT_ID` as its partition key, and `CLIENT_R_TYPE` as its sort key. 8 | 9 | ``` 10 | aws dynamodb create-table \ 11 | --table-name CLIENT \ 12 | --attribute-definitions AttributeName=CLIENT_ID,AttributeType=S AttributeName=CLIENT_R_TYPE,AttributeType=S \ 13 | --key-schema AttributeName=CLIENT_ID,KeyType=HASH AttributeName=CLIENT_R_TYPE,KeyType=RANGE \ 14 | --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=5 15 | ``` 16 | 17 | 2. Check the status of the table creation: 18 | 19 | ``` 20 | aws dynamodb describe-table --table-name CLIENT | grep TableStatus 21 | ``` 22 | 23 | ### Parse the copybook 24 | 25 | Run the `mdu.py parse` function to convert the [COBKS05.cpy](/LegacyReference/COBKS05.cpy) copybook file provided in [LegacyReference](/LegacyReference) from Cobol to JSON representation. 26 | 27 | 1. Inform `ddb` on `-output-type`. 28 | 2. Inform the DynamoDB table name (created before) on `-output`. 29 | 3. Inform the DynamoDB table partition key name on `-part-k-name`. 30 | 4. Inform the DynamoDB table partition key size (as it is in the EBCDIC file) on `-part-k-len`. 31 | 5. Inform the DynamoDB table sort key name on `-sort-k-name`. 32 | 6. Inform the DynamoDB table sort key size (as it is in the EBCDIC file) on `-sort-k-len`. 33 | 34 | ``` 35 | python3 src/mdu.py parse \ 36 | LegacyReference/COBKS05.cpy \ 37 | sample-json/COBKS05-ddb.json \ 38 | -input sample-data/CLIENT.EBCDIC.txt \ 39 | -output CLIENT \ 40 | -part-k-len 4 \ 41 | -part-k-name CLIENT_ID \ 42 | -sort-k-len 2 \ 43 | -sort-k-name CLIENT_R_TYPE \ 44 | -output-type ddb \ 45 | -req-size 25 \ 46 | -print 20 47 | ``` 48 | ### Set the transformation rules 49 | 50 | The step above will generate the [COBKS05-ddb.json](sample-json/COBKS05-ddb.json) with empty transformation rules: `"transf_rule"=[],`. Replace the transformation rule with the content below and save it.
Example: [COBKS05-ddb-rules.json](sample-json/COBKS05-ddb-rules.json): 51 | 52 | ``` 53 | "transf_rule": [ 54 | { 55 | "offset": 4, 56 | "size": 2, 57 | "hex": "0002", 58 | "transf": "transf1" 59 | }, 60 | { 61 | "offset": 4, 62 | "size": 2, 63 | "hex": "0000", 64 | "transf": "transf2" 65 | } 66 | ], 67 | ``` 68 | 69 | ### Load the data into the CLIENT DynamoDB table 70 | 71 | 1. Run the `mdu.py extract` function to extract the [CLIENT.EBCDIC.txt](sample-data/CLIENT.EBCDIC.txt) file and load it into the `CLIENT` DynamoDB table in ASCII encoding. 72 | 73 | ``` 74 | python3 src/mdu.py extract sample-json/COBKS05-ddb.json 75 | ``` 76 | 77 | ### More use cases 78 | 79 | Check the [main page](/). -------------------------------------------------------------------------------- /docs/11-local-multi-fb-discard.md: -------------------------------------------------------------------------------- 1 | # Mainframe Data Utilities V2 2 | 3 | ## Locally convert a multiple layout file and discard specific record types 4 | 5 | ### Parse a multiple layout copybook 6 | 7 | The [COBKS05.cpy](/LegacyReference/COBKS05.cpy) is provided in the [LegacyReference](/LegacyReference/) folder as an example of a VSAM or a flat file copybook having three record layouts. The [CLIENT.EBCDIC.txt](sample-data/CLIENT.EBCDIC.txt) is the EBCDIC sample that can be converted through the following steps. 8 | 9 | Run the `src/mdu.py` script, using the `parse` function, to convert the copybook file provided in [LegacyReference](/LegacyReference) from Cobol to JSON representation: 10 | 11 | ``` 12 | python3 src/mdu.py parse \ 13 | LegacyReference/COBKS05.cpy \ 14 | sample-json/COBKS05-list-disc.json \ 15 | -input sample-data/CLIENT.EBCDIC.txt \ 16 | -output sample-data/CLIENT.ASCII-disc.txt \ 17 | -print 20 -verbose true 18 | ``` 19 | 20 | ### Add the transformation rules 21 | 22 | 2. The step above will generate the [COBKS05-list-disc.json](/sample-json/COBKS05-list-disc.json) with an empty transformation rules list: `"transf_rule"=[],`. To discard record types 0 and 1, replace the transformation rule with the content below and save it: 23 | 24 | ``` 25 | "transf_rule": [ 26 | { 27 | "offset": 4, 28 | "size": 2, 29 | "hex": "0002", 30 | "transf": "transf1" 31 | }, 32 | { 33 | "offset": 4, 34 | "size": 2, 35 | "hex": "0001", 36 | "transf": "discard" 37 | }, 38 | { 39 | "offset": 4, 40 | "size": 2, 41 | "hex": "0000", 42 | "transf": "discard" 43 | } 44 | ], 45 | ``` 46 | 47 | The parameters above inform the `extract` function that records having the hexadecimal value "0002" in their 5th and 6th bytes must be converted using the "transf1" layout, whereas records that contain "0000" or "0001" at the same position will be discarded. 48 | 49 | The result of the change above must produce a file like [COBKS05-list-disc-rules.json](/sample-json/COBKS05-list-disc-rules.json). 50 | 51 | ### Extract a multiple layout file 52 | 53 | 3. Run the `src/mdu.py extract` function to extract the `CLIENT.EBCDIC.txt` into an ASCII file. 54 | 55 | ``` 56 | python3 src/mdu.py extract sample-json/COBKS05-list-disc.json 57 | ``` 58 | 59 | 4. Check the [CLIENT.ASCII-disc.txt](/sample-data/CLIENT.ASCII-disc.txt) file. A quick record-count check is sketched at the end of this page. 60 | 61 | ### More use cases 62 | 63 | Check the [main page](/).
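### Quick check of the discarded records

Assuming each converted record becomes one line in the ASCII output (as in the bundled samples), comparing the line counts of the full conversion and of the discard run shows how many records were dropped; the discard output should contain only the type-2 records kept by the "transf1" rule:

```
wc -l sample-data/CLIENT.ASCII.txt sample-data/CLIENT.ASCII-disc.txt
```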
64 | -------------------------------------------------------------------------------- /docs/99-file-split-fb.md: -------------------------------------------------------------------------------- 1 | ## File split (not refactored yet) 2 | 3 | ``` 4 | python3 split.py -local-json sample-data/COBKS05-split-local.json 5 | ``` 6 | 7 | ### More use cases 8 | 9 | Check the [main page](/). -------------------------------------------------------------------------------- /ebcdic.py: -------------------------------------------------------------------------------- 1 | # Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 2 | # SPDX-License-Identifier: Apache-2.0 3 | 4 | HighestPositive = "7fffffffffffffffffff" 5 | 6 | def unpack(bytes: bytearray, type: str, rem_lv: bool): 7 | # 8 | # Formats ebcdic text, zoned, big endian binary or decinal data into unpacked/string ascii data. 9 | # 10 | # Parameters: 11 | # - bytes (bytearray): The content to be extracted 12 | # - type (str)......: 13 | # - ch : text | pic x 14 | # - zd : zoned | pic 9 15 | # - zd+: signed zoned | pic s9 16 | # - bi : binary | pic 9 comp 17 | # - bi+: signed binary | pic s9 comp 18 | # - pd : packed-decimal | pic 9 comp-3 19 | # - pd+: signed packed-decimal| pic s9 comp-3 20 | # 21 | # Returns: 22 | # - ascii string 23 | # 24 | # Test sample: 25 | # import struct 26 | # ori = 9223372036854775807 27 | # print(ori, unpack(struct.pack(">q",ori),"bi+")) 28 | # ori = ori * -1 29 | # print(ori, unpack(struct.pack(">q",ori),"bi+")) 30 | # print(unpack(bytearray.fromhex("f0f0f1c1"), "zd+")) 31 | # 32 | # Input examples: 33 | # - 8 bytes comp-signed struct q: -9,223,372,036,854,775,808 through +9,223,372,036,854,775,807 34 | # - 8 bytes comp-unsigned struct Q: 0 through 18,446,744,073,709,551,615 35 | # - 4 bytes comp-signed struct i: -2147483648 through +2147483647 36 | # - 4 bytes comp-unsigned struct I: 0 through +4294967295 37 | # - 2 bytes comp-signed struct h: -32768 through +32767 38 | # - 2 bytes comp-unsigned struct H: 0 through +65535 39 | 40 | if type.lower() == "ch" or type.lower() == "zd": 41 | return bytes.decode('cp037').replace('\x00', '').rstrip() if rem_lv == True else bytes.decode('cp037') 42 | elif type.lower() == "pd" or type.lower() == "pd+": 43 | return ("" if bytes.hex()[-1:] != "d" and bytes.hex()[-1:] != "b" else "-") + bytes.hex()[:-1] 44 | elif type.lower() == "bi" or (type.lower() == "bi+" and bytes.hex() <= HighestPositive[:len(bytes)*2]): 45 | a = int("0x" + bytes.hex(), 0) 46 | return str(a) 47 | elif type.lower() == "bi+": 48 | return str(int("0x" + bytes.hex(), 0) - int("0x" + len(bytes) * 2 * "f", 0) -1) 49 | elif type.lower() == "zd+": 50 | return ("" if bytes.hex()[-2:-1] != "d" else "-") + bytes[:-1].decode('cp037') + bytes.hex()[-1:] 51 | else: 52 | print("---------------------------\nLength & Type not supported\nLength: ",len(bytes),"\nType..: " ,type) 53 | exit() -------------------------------------------------------------------------------- /extract_ebcdic_to_ascii.py: -------------------------------------------------------------------------------- 1 | # Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 
2 | # SPDX-License-Identifier: Apache-2.0 3 | import sys, ebcdic as ebcdic, utils as utils, datasource as datasource, os 4 | 5 | def lambda_handler(event, context): 6 | 7 | bkt = event['Records'][0]['s3']['bucket']['name'] 8 | key = event['Records'][0]['s3']['object']['key'] 9 | karr = key.split('/') 10 | 11 | jfle = '.'.join(karr[-1:][0].replace('.txt','').split('.')[:-1]) + '.json' 12 | 13 | fileconvertion(['extract_ebcdic_to_ascii.py', 14 | '-s3-json' , 's3://' + os.environ.get('layout') + jfle, 15 | '-s3-input', 's3://' + bkt + '/' + key 16 | ], 17 | tmp='/tmp/') 18 | 19 | return {'statusCode': 200} 20 | 21 | def s3_obj_lambda_handler(event, context): 22 | 23 | request_route = event["getObjectContext"]["outputRoute"] 24 | request_token = event["getObjectContext"]["outputToken"] 25 | s3_url = event["getObjectContext"]["inputS3Url"] 26 | 27 | layout = event['configuration']['payload'] + '.'.join(s3_url.split('?')[0].split('/')[-1].split('.')[:-2]) + '.json' 28 | 29 | fileconvertion(['extract_ebcdic_to_ascii.py', 30 | '-s3-json' , 's3://' + layout, 31 | '-s3-input', s3_url 32 | ], request_route, request_token) 33 | 34 | return {'statusCode': 200} 35 | 36 | def fileconvertion(args, route='', tkn='', tmp=''): 37 | 38 | log = utils.Log() 39 | prm = utils.ParamReader(args) 40 | InpDS = datasource.Input(prm.general) 41 | OutDS = datasource.Output(prm.general, route, tkn, tmp) 42 | 43 | i=0 44 | while i < prm.general["max"] or prm.general["max"] == 0: 45 | 46 | record = InpDS.read(prm.general["lrecl"]) 47 | 48 | if not record: break 49 | 50 | OutRecord = datasource.item(prm.general) 51 | 52 | i+= 1 53 | if i > prm.general["skip"]: 54 | 55 | if(prm.general["print"] != 0 and i % prm.general["print"] == 0): log.Write(['Records processed', str(i)]) 56 | 57 | layout = prm.GetLayout(record) 58 | 59 | for transf in layout: 60 | 61 | OutRecord.addField(transf["name"], transf["type"], transf["part-key"], prm.general["partkname"], transf["sort-key"], prm.general["sortkname"], prm.AddDecPlaces(ebcdic.unpack(record[transf["offset"]:transf["offset"]+transf["bytes"]], transf["type"], prm.general["rem-low-values"]), transf["dplaces"])) 62 | 63 | OutDS.Write(OutRecord.get()) 64 | OutDS.Write() 65 | 66 | log.Write(['Records processed', str(i)]) 67 | log.Finish() 68 | 69 | if __name__ == '__main__': 70 | fileconvertion(sys.argv) -------------------------------------------------------------------------------- /sample-data/CLIENT-TEST.json: -------------------------------------------------------------------------------- 1 | { 2 | "Records": [ 3 | { 4 | "eventVersion": "2.0", 5 | "eventSource": "aws:s3", 6 | "awsRegion": "us-west-2", 7 | "eventTime": "1970-01-01T00:00:00.000Z", 8 | "eventName": "ObjectCreated:Put", 9 | "userIdentity": { 10 | "principalId": "EXAMPLE" 11 | }, 12 | "requestParameters": { 13 | "sourceIPAddress": "127.0.0.1" 14 | }, 15 | "responseElements": { 16 | "x-amz-request-id": "EXAMPLE123456789", 17 | "x-amz-id-2": "EXAMPLE123/5678abcdefghijklambdaisawesome/mnopqrstuvwxyzABCDEFGH" 18 | }, 19 | "s3": { 20 | "s3SchemaVersion": "1.0", 21 | "configurationId": "testConfigRule", 22 | "bucket": { 23 | "name": "your-bucket-name", 24 | "ownerIdentity": { 25 | "principalId": "EXAMPLE" 26 | }, 27 | "arn": "arn:aws:s3:::your-bucket-name" 28 | }, 29 | "object": { 30 | "key": "CLIENT.EBCDIC.txt", 31 | "size": 1024, 32 | "eTag": "0123456789abcdef0123456789abcdef", 33 | "sequencer": "0A1B2C3D4E5F678901" 34 | } 35 | } 36 | } 37 | ] 38 | } 
-------------------------------------------------------------------------------- /sample-data/CLIENT-s3-obj.json: -------------------------------------------------------------------------------- 1 | { 2 | "input": "sample-data/CLIENT.EBCDIC.txt", 3 | "output": "sample-data/CLIENT.ASCII.txt", 4 | "partkname": "", 5 | "sortkname": "", 6 | "output-type": "s3-obj", 7 | "req-size": 0, 8 | "print": 0, 9 | "max": 0, 10 | "skip": 0, 11 | "lrecl": 500, 12 | "partklen": 0, 13 | "sortklen": 0, 14 | "rem-low-values": true, 15 | "separator": "|", 16 | "transf-rule": [ 17 | { 18 | "offset": 4, 19 | "size": 2, 20 | "hex": "0002", 21 | "transf": "transf1" 22 | }, 23 | { 24 | "offset": 4, 25 | "size": 2, 26 | "hex": "0000", 27 | "transf": "transf2" 28 | } 29 | ], 30 | "transf": [ 31 | { 32 | "type": "bi", 33 | "bytes": 4, 34 | "offset": 0, 35 | "dplaces": 0, 36 | "name": "CLIENT-ID", 37 | "part-key": false, 38 | "sort-key": false 39 | }, 40 | { 41 | "type": "bi", 42 | "bytes": 2, 43 | "offset": 4, 44 | "dplaces": 0, 45 | "name": "CLIENT-TYPE", 46 | "part-key": false, 47 | "sort-key": false 48 | }, 49 | { 50 | "type": "ch", 51 | "bytes": 30, 52 | "offset": 6, 53 | "dplaces": 0, 54 | "name": "CLIENT-NAME", 55 | "part-key": false, 56 | "sort-key": false 57 | }, 58 | { 59 | "type": "ch", 60 | "bytes": 10, 61 | "offset": 36, 62 | "dplaces": 0, 63 | "name": "CLIENT-BDATE", 64 | "part-key": false, 65 | "sort-key": false 66 | }, 67 | { 68 | "type": "ch", 69 | "bytes": 10, 70 | "offset": 46, 71 | "dplaces": 0, 72 | "name": "CLIENT-ED-LVL", 73 | "part-key": false, 74 | "sort-key": false 75 | }, 76 | { 77 | "type": "pd", 78 | "bytes": 5, 79 | "offset": 56, 80 | "dplaces": 2, 81 | "name": "CLIENT-INCOME", 82 | "part-key": false, 83 | "sort-key": false 84 | }, 85 | { 86 | "type": "ch", 87 | "bytes": 439, 88 | "offset": 61, 89 | "dplaces": 0, 90 | "name": "FILLER_1", 91 | "part-key": false, 92 | "sort-key": false 93 | } 94 | ], 95 | "transf1": [ 96 | { 97 | "type": "bi", 98 | "bytes": 4, 99 | "offset": 0, 100 | "dplaces": 0, 101 | "name": "CLIENT-ID", 102 | "part-key": false, 103 | "sort-key": false 104 | }, 105 | { 106 | "type": "bi", 107 | "bytes": 2, 108 | "offset": 4, 109 | "dplaces": 0, 110 | "name": "CLIENT-TYPE", 111 | "part-key": false, 112 | "sort-key": false 113 | }, 114 | { 115 | "type": "bi", 116 | "bytes": 4, 117 | "offset": 6, 118 | "dplaces": 0, 119 | "name": "CLIENT-ADDR-NUMBER", 120 | "part-key": false, 121 | "sort-key": false 122 | }, 123 | { 124 | "type": "ch", 125 | "bytes": 40, 126 | "offset": 10, 127 | "dplaces": 0, 128 | "name": "CLIENT-ADDR-STREET", 129 | "part-key": false, 130 | "sort-key": false 131 | }, 132 | { 133 | "type": "ch", 134 | "bytes": 450, 135 | "offset": 50, 136 | "dplaces": 0, 137 | "name": "FILLER_2", 138 | "part-key": false, 139 | "sort-key": false 140 | } 141 | ], 142 | "transf2": [ 143 | { 144 | "type": "bi", 145 | "bytes": 4, 146 | "offset": 0, 147 | "dplaces": 0, 148 | "name": "CLIENT-ID", 149 | "part-key": false, 150 | "sort-key": false 151 | }, 152 | { 153 | "type": "bi", 154 | "bytes": 2, 155 | "offset": 4, 156 | "dplaces": 0, 157 | "name": "CLIENT-TYPE", 158 | "part-key": false, 159 | "sort-key": false 160 | }, 161 | { 162 | "type": "bi", 163 | "bytes": 4, 164 | "offset": 6, 165 | "dplaces": 0, 166 | "name": "CLIENT-RECORD-COUNT", 167 | "part-key": false, 168 | "sort-key": false 169 | }, 170 | { 171 | "type": "ch", 172 | "bytes": 490, 173 | "offset": 10, 174 | "dplaces": 0, 175 | "name": "FILLER_3", 176 | "part-key": false, 177 | "sort-key": false 178 | } 179 | ] 180 | } 
-------------------------------------------------------------------------------- /sample-data/CLIENT-s3-rules.json: -------------------------------------------------------------------------------- 1 | { 2 | "input": "ebcdicfile.txt", 3 | "output": "asciifile.txt", 4 | "partkname": "", 5 | "sortkname": "", 6 | "output-type": "s3", 7 | "output-s3key": "CLIENT.ASCII.txt", 8 | "output-s3bkt": "your-bucket-name", 9 | "recfm": "fb", 10 | "req-size": 10, 11 | "print": 20, 12 | "max": 0, 13 | "skip": 0, 14 | "lrecl": 500, 15 | "partklen": 0, 16 | "sortklen": 0, 17 | "rem-low-values": true, 18 | "separator": "|", 19 | "transf-rule": [ 20 | { 21 | "offset": 4, 22 | "size": 2, 23 | "hex": "0002", 24 | "transf": "transf1" 25 | }, 26 | { 27 | "offset": 4, 28 | "size": 2, 29 | "hex": "0000", 30 | "transf": "transf2" 31 | } 32 | ], 33 | "transf": [ 34 | { 35 | "type": "bi", 36 | "bytes": 4, 37 | "offset": 0, 38 | "dplaces": 0, 39 | "name": "CLIENT-ID", 40 | "part-key": false, 41 | "sort-key": false 42 | }, 43 | { 44 | "type": "bi", 45 | "bytes": 2, 46 | "offset": 4, 47 | "dplaces": 0, 48 | "name": "CLIENT-TYPE", 49 | "part-key": false, 50 | "sort-key": false 51 | }, 52 | { 53 | "type": "ch", 54 | "bytes": 30, 55 | "offset": 6, 56 | "dplaces": 0, 57 | "name": "CLIENT-NAME", 58 | "part-key": false, 59 | "sort-key": false 60 | }, 61 | { 62 | "type": "ch", 63 | "bytes": 10, 64 | "offset": 36, 65 | "dplaces": 0, 66 | "name": "CLIENT-BDATE", 67 | "part-key": false, 68 | "sort-key": false 69 | }, 70 | { 71 | "type": "ch", 72 | "bytes": 10, 73 | "offset": 46, 74 | "dplaces": 0, 75 | "name": "CLIENT-ED-LVL", 76 | "part-key": false, 77 | "sort-key": false 78 | }, 79 | { 80 | "type": "pd", 81 | "bytes": 5, 82 | "offset": 56, 83 | "dplaces": 2, 84 | "name": "CLIENT-INCOME", 85 | "part-key": false, 86 | "sort-key": false 87 | }, 88 | { 89 | "type": "ch", 90 | "bytes": 439, 91 | "offset": 61, 92 | "dplaces": 0, 93 | "name": "FILLER_1", 94 | "part-key": false, 95 | "sort-key": false 96 | } 97 | ], 98 | "transf1": [ 99 | { 100 | "type": "bi", 101 | "bytes": 4, 102 | "offset": 0, 103 | "dplaces": 0, 104 | "name": "CLIENT-ID", 105 | "part-key": false, 106 | "sort-key": false 107 | }, 108 | { 109 | "type": "bi", 110 | "bytes": 2, 111 | "offset": 4, 112 | "dplaces": 0, 113 | "name": "CLIENT-TYPE", 114 | "part-key": false, 115 | "sort-key": false 116 | }, 117 | { 118 | "type": "bi", 119 | "bytes": 4, 120 | "offset": 6, 121 | "dplaces": 0, 122 | "name": "CLIENT-ADDR-NUMBER", 123 | "part-key": false, 124 | "sort-key": false 125 | }, 126 | { 127 | "type": "ch", 128 | "bytes": 40, 129 | "offset": 10, 130 | "dplaces": 0, 131 | "name": "CLIENT-ADDR-STREET", 132 | "part-key": false, 133 | "sort-key": false 134 | }, 135 | { 136 | "type": "ch", 137 | "bytes": 450, 138 | "offset": 50, 139 | "dplaces": 0, 140 | "name": "FILLER_2", 141 | "part-key": false, 142 | "sort-key": false 143 | } 144 | ], 145 | "transf2": [ 146 | { 147 | "type": "bi", 148 | "bytes": 4, 149 | "offset": 0, 150 | "dplaces": 0, 151 | "name": "CLIENT-ID", 152 | "part-key": false, 153 | "sort-key": false 154 | }, 155 | { 156 | "type": "bi", 157 | "bytes": 2, 158 | "offset": 4, 159 | "dplaces": 0, 160 | "name": "CLIENT-TYPE", 161 | "part-key": false, 162 | "sort-key": false 163 | }, 164 | { 165 | "type": "bi", 166 | "bytes": 4, 167 | "offset": 6, 168 | "dplaces": 0, 169 | "name": "CLIENT-RECORD-COUNT", 170 | "part-key": false, 171 | "sort-key": false 172 | }, 173 | { 174 | "type": "ch", 175 | "bytes": 490, 176 | "offset": 10, 177 | "dplaces": 0, 178 | "name": 
"FILLER_3", 179 | "part-key": false, 180 | "sort-key": false 181 | } 182 | ] 183 | } -------------------------------------------------------------------------------- /sample-data/CLIENT-s3.json: -------------------------------------------------------------------------------- 1 | { 2 | "input": "ebcdicfile.txt", 3 | "output": "asciifile.txt", 4 | "partkname": "", 5 | "sortkname": "", 6 | "output-type": "s3", 7 | "output-s3key": "CLIENT.ASCII.txt", 8 | "output-s3bkt": "your-bucket-name", 9 | "recfm": "fb", 10 | "req-size": 10, 11 | "print": 20, 12 | "max": 0, 13 | "skip": 0, 14 | "lrecl": 500, 15 | "partklen": 0, 16 | "sortklen": 0, 17 | "rem-low-values": true, 18 | "separator": "|", 19 | "transf-rule": [], 20 | "transf": [ 21 | { 22 | "type": "bi", 23 | "bytes": 4, 24 | "offset": 0, 25 | "dplaces": 0, 26 | "name": "CLIENT-ID", 27 | "part-key": false, 28 | "sort-key": false 29 | }, 30 | { 31 | "type": "bi", 32 | "bytes": 2, 33 | "offset": 4, 34 | "dplaces": 0, 35 | "name": "CLIENT-TYPE", 36 | "part-key": false, 37 | "sort-key": false 38 | }, 39 | { 40 | "type": "ch", 41 | "bytes": 30, 42 | "offset": 6, 43 | "dplaces": 0, 44 | "name": "CLIENT-NAME", 45 | "part-key": false, 46 | "sort-key": false 47 | }, 48 | { 49 | "type": "ch", 50 | "bytes": 10, 51 | "offset": 36, 52 | "dplaces": 0, 53 | "name": "CLIENT-BDATE", 54 | "part-key": false, 55 | "sort-key": false 56 | }, 57 | { 58 | "type": "ch", 59 | "bytes": 10, 60 | "offset": 46, 61 | "dplaces": 0, 62 | "name": "CLIENT-ED-LVL", 63 | "part-key": false, 64 | "sort-key": false 65 | }, 66 | { 67 | "type": "pd", 68 | "bytes": 5, 69 | "offset": 56, 70 | "dplaces": 2, 71 | "name": "CLIENT-INCOME", 72 | "part-key": false, 73 | "sort-key": false 74 | }, 75 | { 76 | "type": "ch", 77 | "bytes": 439, 78 | "offset": 61, 79 | "dplaces": 0, 80 | "name": "FILLER_1", 81 | "part-key": false, 82 | "sort-key": false 83 | } 84 | ], 85 | "transf1": [ 86 | { 87 | "type": "bi", 88 | "bytes": 4, 89 | "offset": 0, 90 | "dplaces": 0, 91 | "name": "CLIENT-ID", 92 | "part-key": false, 93 | "sort-key": false 94 | }, 95 | { 96 | "type": "bi", 97 | "bytes": 2, 98 | "offset": 4, 99 | "dplaces": 0, 100 | "name": "CLIENT-TYPE", 101 | "part-key": false, 102 | "sort-key": false 103 | }, 104 | { 105 | "type": "bi", 106 | "bytes": 4, 107 | "offset": 6, 108 | "dplaces": 0, 109 | "name": "CLIENT-ADDR-NUMBER", 110 | "part-key": false, 111 | "sort-key": false 112 | }, 113 | { 114 | "type": "ch", 115 | "bytes": 40, 116 | "offset": 10, 117 | "dplaces": 0, 118 | "name": "CLIENT-ADDR-STREET", 119 | "part-key": false, 120 | "sort-key": false 121 | }, 122 | { 123 | "type": "ch", 124 | "bytes": 450, 125 | "offset": 50, 126 | "dplaces": 0, 127 | "name": "FILLER_2", 128 | "part-key": false, 129 | "sort-key": false 130 | } 131 | ], 132 | "transf2": [ 133 | { 134 | "type": "bi", 135 | "bytes": 4, 136 | "offset": 0, 137 | "dplaces": 0, 138 | "name": "CLIENT-ID", 139 | "part-key": false, 140 | "sort-key": false 141 | }, 142 | { 143 | "type": "bi", 144 | "bytes": 2, 145 | "offset": 4, 146 | "dplaces": 0, 147 | "name": "CLIENT-TYPE", 148 | "part-key": false, 149 | "sort-key": false 150 | }, 151 | { 152 | "type": "bi", 153 | "bytes": 4, 154 | "offset": 6, 155 | "dplaces": 0, 156 | "name": "CLIENT-RECORD-COUNT", 157 | "part-key": false, 158 | "sort-key": false 159 | }, 160 | { 161 | "type": "ch", 162 | "bytes": 490, 163 | "offset": 10, 164 | "dplaces": 0, 165 | "name": "FILLER_3", 166 | "part-key": false, 167 | "sort-key": false 168 | } 169 | ] 170 | } 
-------------------------------------------------------------------------------- /sample-data/CLIENT.ASCII-0.txt: -------------------------------------------------------------------------------- 1 | 0|0|220| -------------------------------------------------------------------------------- /sample-data/CLIENT.ASCII-2.txt: -------------------------------------------------------------------------------- 1 | 1|2|36|THE ROE AVENUE| 2 | 2|2|365|HEATHFIELD ESPLANADE| 3 | 3|2|4555|MORRISON STRAND| 4 | 4|2|1311|MARMION PARK| 5 | 5|2|3398|VICTORIA GAIT| 6 | 6|2|857|STIRLING LEAS| 7 | 7|2|4893|BRAMWELL LANE| 8 | 8|2|1096|DORSET LINK| 9 | 9|2|3168|RIDINGS WAY| 10 | 10|2|3811|WARDS CRESCENT| 11 | 11|2|3042|PRINCESS CLIFF| 12 | 12|2|3024|NICHOLSON VIEW| 13 | 13|2|397|BRUNEL FARM| 14 | 14|2|2866|NEWLANDS LOAN| 15 | 15|2|2625|HOWE NESS| 16 | 16|2|1386|MORRIS ESTATE| 17 | 17|2|3962|CHERRY SPRINGS| 18 | 18|2|4631|GREYFRIARS OAK| 19 | 19|2|1012|HADRIAN TOWN| 20 | 20|2|1589|BLANDFORD CAUSEWAY| 21 | 21|2|2407|EXMOUTH SQUARE| 22 | 22|2|1189|VICTORIA MOUNT| 23 | 23|2|790|BEECHFIELD RIDGEWAY| 24 | 24|2|3334|PALMERSTON CORNER| 25 | 25|2|664|LINCOLN HIGHWAY| 26 | 26|2|4671|ARMSTRONG ORCHARD| 27 | 27|2|1888|CARDINAL OAK| 28 | 28|2|1672|OAKHILL STREET| 29 | 29|2|3836|HARCOURT FIRS| 30 | 30|2|3092|GREAT DRIFT| 31 | 31|2|3094|PRINCE OF WALES CROFT| 32 | 32|2|704|PEAKIRK ROAD| 33 | 33|2|1615|LABURNUM ORCHARDS| 34 | 34|2|4892|SWAINSFIELD ROAD| 35 | 35|2|4882|HEXHAM QUAY| 36 | 36|2|4568|WEST VIEW WILLOWS| 37 | 37|2|1709|WAVERLEY GARDEN| 38 | 38|2|467|AUSTIN COMMON| 39 | 39|2|157|FIFE HILL| 40 | 40|2|685|MUNRO VILLAGE| 41 | 41|2|1961|FORT LODGE| 42 | 42|2|998|LOMBARD CROFT| 43 | 43|2|979|CONIFER CRESCENT| 44 | 44|2|2748|MONTPELIER RISE| 45 | 45|2|4291|ALMOND PADDOCKS| 46 | 46|2|4456|HAMILTON WOODLANDS| 47 | 47|2|122|MORETON NEWYDD| 48 | 48|2|1098|PITTSBURGH ROAD| 49 | 49|2|4664|HAMPSTEAD SIDE| 50 | 50|2|3263|EPPING GARDENS| 51 | 51|2|3397|PEARSON MEADOW| 52 | 52|2|808|MEAD MILL| 53 | 53|2|3447|BRAESIDE BUILDINGS| 54 | 54|2|1529|LYNN GAIT| 55 | 55|2|1368|WALLACE NOOK| 56 | 56|2|742|NEW BARNES AVENUE| 57 | 57|2|1655|FOSTER ROAD| 58 | 58|2|4583|STONY WOOD| 59 | 59|2|2634|DYKE HOUSE LANE| 60 | 60|2|1456|ASHURST LAS| 61 | 61|2|4709|HAMILTON STREET| 62 | 62|2|3673|KEATS COPPICE| 63 | 63|2|4493|RAMSAY RIDE| 64 | 64|2|4032|SANDPIPER QUADRANT| 65 | 65|2|4507|JOSEPH CLOSE| 66 | 66|2|152|KILLIERSFIELD| 67 | 67|2|2059|FOUNTAIN RIDGEWAY| 68 | 68|2|4694|MIDLAND LANE| 69 | 69|2|835|PRESTON CORNER| 70 | 70|2|1425|SOLWAY SPUR| 71 | 71|2|3773|MAIN CEDARS| 72 | 72|2|2525|MAYFAIR FAIRWAY| 73 | 73|2|2264|BEACON SQUARE| 74 | 74|2|4116|LION SPRINGS| 75 | 75|2|2688|FOLDHILL LANE| 76 | 76|2|3157|ATTLEE GATE| 77 | 77|2|4335|EWART CRESCENT| 78 | 78|2|179|BOYES CRESCENT| 79 | 79|2|891|FURNESS FOLD| 80 | 80|2|1281|GREENFIELDS MOOR| 81 | 81|2|2812|NAIRN CROFT| 82 | 82|2|3982|ALDERS BROOK| 83 | 83|2|4670|WEST PARK BY-PASS| 84 | 84|2|3406|MARSHMONT AVENUE| 85 | 85|2|1109|NOBLE BROW| 86 | 86|2|3530|CALEDONIA CREST| 87 | 87|2|4025|SYMONDS CLOSE| 88 | 88|2|3244|BROOMHILL GARDENS| 89 | 89|2|4804|JACKSON HOLLIES| 90 | 90|2|1086|CAERNARVON GRANGE| 91 | 91|2|1205|SHIRLEY LINKS| 92 | 92|2|3141|SINCLAIR HALL| 93 | 93|2|4543|TOURTEL ROAD| 94 | 94|2|764|BLACKACRE| 95 | 95|2|1334|CANBERRA HAVEN| 96 | 96|2|1188|DAIRY ESPLANADE| 97 | 97|2|1132|DRONLEY ROAD| 98 | 98|2|3787|RECTORY PIECE| 99 | 99|2|988|REDWOOD MALTINGS| 100 | 100|2|4062|GREENFIELD CIRCUS| 101 | 101|2|2512|PRINCE STREET| 102 | 102|2|3760|CANAL LANE| 103 | 103|2|3454|WINTER WAY| 104 | 104|2|4930|PEACE 
STREET| 105 | 105|2|4142|LUNA WAY| 106 | 106|2|2732|SOUTH LANE| 107 | 107|2|3098|OAK ROW| 108 | 108|2|1705|PINE WAY| 109 | 109|2|4802|ANGEL BOULEVARD| 110 | 110|2|1472|HAZELNUT STREET| -------------------------------------------------------------------------------- /sample-data/CLIENT.ASCII-disc.txt: -------------------------------------------------------------------------------- 1 | 1|2|36|THE ROE AVENUE| 2 | 2|2|365|HEATHFIELD ESPLANADE| 3 | 3|2|4555|MORRISON STRAND| 4 | 4|2|1311|MARMION PARK| 5 | 5|2|3398|VICTORIA GAIT| 6 | 6|2|857|STIRLING LEAS| 7 | 7|2|4893|BRAMWELL LANE| 8 | 8|2|1096|DORSET LINK| 9 | 9|2|3168|RIDINGS WAY| 10 | 10|2|3811|WARDS CRESCENT| 11 | 11|2|3042|PRINCESS CLIFF| 12 | 12|2|3024|NICHOLSON VIEW| 13 | 13|2|397|BRUNEL FARM| 14 | 14|2|2866|NEWLANDS LOAN| 15 | 15|2|2625|HOWE NESS| 16 | 16|2|1386|MORRIS ESTATE| 17 | 17|2|3962|CHERRY SPRINGS| 18 | 18|2|4631|GREYFRIARS OAK| 19 | 19|2|1012|HADRIAN TOWN| 20 | 20|2|1589|BLANDFORD CAUSEWAY| 21 | 21|2|2407|EXMOUTH SQUARE| 22 | 22|2|1189|VICTORIA MOUNT| 23 | 23|2|790|BEECHFIELD RIDGEWAY| 24 | 24|2|3334|PALMERSTON CORNER| 25 | 25|2|664|LINCOLN HIGHWAY| 26 | 26|2|4671|ARMSTRONG ORCHARD| 27 | 27|2|1888|CARDINAL OAK| 28 | 28|2|1672|OAKHILL STREET| 29 | 29|2|3836|HARCOURT FIRS| 30 | 30|2|3092|GREAT DRIFT| 31 | 31|2|3094|PRINCE OF WALES CROFT| 32 | 32|2|704|PEAKIRK ROAD| 33 | 33|2|1615|LABURNUM ORCHARDS| 34 | 34|2|4892|SWAINSFIELD ROAD| 35 | 35|2|4882|HEXHAM QUAY| 36 | 36|2|4568|WEST VIEW WILLOWS| 37 | 37|2|1709|WAVERLEY GARDEN| 38 | 38|2|467|AUSTIN COMMON| 39 | 39|2|157|FIFE HILL| 40 | 40|2|685|MUNRO VILLAGE| 41 | 41|2|1961|FORT LODGE| 42 | 42|2|998|LOMBARD CROFT| 43 | 43|2|979|CONIFER CRESCENT| 44 | 44|2|2748|MONTPELIER RISE| 45 | 45|2|4291|ALMOND PADDOCKS| 46 | 46|2|4456|HAMILTON WOODLANDS| 47 | 47|2|122|MORETON NEWYDD| 48 | 48|2|1098|PITTSBURGH ROAD| 49 | 49|2|4664|HAMPSTEAD SIDE| 50 | 50|2|3263|EPPING GARDENS| 51 | 51|2|3397|PEARSON MEADOW| 52 | 52|2|808|MEAD MILL| 53 | 53|2|3447|BRAESIDE BUILDINGS| 54 | 54|2|1529|LYNN GAIT| 55 | 55|2|1368|WALLACE NOOK| 56 | 56|2|742|NEW BARNES AVENUE| 57 | 57|2|1655|FOSTER ROAD| 58 | 58|2|4583|STONY WOOD| 59 | 59|2|2634|DYKE HOUSE LANE| 60 | 60|2|1456|ASHURST LAS| 61 | 61|2|4709|HAMILTON STREET| 62 | 62|2|3673|KEATS COPPICE| 63 | 63|2|4493|RAMSAY RIDE| 64 | 64|2|4032|SANDPIPER QUADRANT| 65 | 65|2|4507|JOSEPH CLOSE| 66 | 66|2|152|KILLIERSFIELD| 67 | 67|2|2059|FOUNTAIN RIDGEWAY| 68 | 68|2|4694|MIDLAND LANE| 69 | 69|2|835|PRESTON CORNER| 70 | 70|2|1425|SOLWAY SPUR| 71 | 71|2|3773|MAIN CEDARS| 72 | 72|2|2525|MAYFAIR FAIRWAY| 73 | 73|2|2264|BEACON SQUARE| 74 | 74|2|4116|LION SPRINGS| 75 | 75|2|2688|FOLDHILL LANE| 76 | 76|2|3157|ATTLEE GATE| 77 | 77|2|4335|EWART CRESCENT| 78 | 78|2|179|BOYES CRESCENT| 79 | 79|2|891|FURNESS FOLD| 80 | 80|2|1281|GREENFIELDS MOOR| 81 | 81|2|2812|NAIRN CROFT| 82 | 82|2|3982|ALDERS BROOK| 83 | 83|2|4670|WEST PARK BY-PASS| 84 | 84|2|3406|MARSHMONT AVENUE| 85 | 85|2|1109|NOBLE BROW| 86 | 86|2|3530|CALEDONIA CREST| 87 | 87|2|4025|SYMONDS CLOSE| 88 | 88|2|3244|BROOMHILL GARDENS| 89 | 89|2|4804|JACKSON HOLLIES| 90 | 90|2|1086|CAERNARVON GRANGE| 91 | 91|2|1205|SHIRLEY LINKS| 92 | 92|2|3141|SINCLAIR HALL| 93 | 93|2|4543|TOURTEL ROAD| 94 | 94|2|764|BLACKACRE| 95 | 95|2|1334|CANBERRA HAVEN| 96 | 96|2|1188|DAIRY ESPLANADE| 97 | 97|2|1132|DRONLEY ROAD| 98 | 98|2|3787|RECTORY PIECE| 99 | 99|2|988|REDWOOD MALTINGS| 100 | 100|2|4062|GREENFIELD CIRCUS| 101 | 101|2|2512|PRINCE STREET| 102 | 102|2|3760|CANAL LANE| 103 | 103|2|3454|WINTER WAY| 104 | 104|2|4930|PEACE STREET| 105 | 
105|2|4142|LUNA WAY| 106 | 106|2|2732|SOUTH LANE| 107 | 107|2|3098|OAK ROW| 108 | 108|2|1705|PINE WAY| 109 | 109|2|4802|ANGEL BOULEVARD| 110 | 110|2|1472|HAZELNUT STREET| -------------------------------------------------------------------------------- /sample-data/CLIENT.ASCII.txt.1: -------------------------------------------------------------------------------- 1 | 0|0|220| 2 | 1|2|36|THE ROE AVENUE| 3 | 2|2|365|HEATHFIELD ESPLANADE| 4 | 3|2|4555|MORRISON STRAND| 5 | 4|2|1311|MARMION PARK| 6 | 5|2|3398|VICTORIA GAIT| 7 | 6|2|857|STIRLING LEAS| 8 | 7|2|4893|BRAMWELL LANE| 9 | 8|2|1096|DORSET LINK| 10 | 9|2|3168|RIDINGS WAY| 11 | 10|2|3811|WARDS CRESCENT| 12 | 11|2|3042|PRINCESS CLIFF| 13 | 12|2|3024|NICHOLSON VIEW| 14 | 13|2|397|BRUNEL FARM| 15 | 14|2|2866|NEWLANDS LOAN| 16 | 15|2|2625|HOWE NESS| 17 | 16|2|1386|MORRIS ESTATE| 18 | 17|2|3962|CHERRY SPRINGS| 19 | 18|2|4631|GREYFRIARS OAK| 20 | 19|2|1012|HADRIAN TOWN| 21 | 20|2|1589|BLANDFORD CAUSEWAY| 22 | 21|2|2407|EXMOUTH SQUARE| 23 | 22|2|1189|VICTORIA MOUNT| 24 | 23|2|790|BEECHFIELD RIDGEWAY| 25 | 24|2|3334|PALMERSTON CORNER| 26 | 25|2|664|LINCOLN HIGHWAY| 27 | 26|2|4671|ARMSTRONG ORCHARD| 28 | 27|2|1888|CARDINAL OAK| 29 | 28|2|1672|OAKHILL STREET| 30 | 29|2|3836|HARCOURT FIRS| 31 | 30|2|3092|GREAT DRIFT| 32 | 31|2|3094|PRINCE OF WALES CROFT| 33 | 32|2|704|PEAKIRK ROAD| 34 | 33|2|1615|LABURNUM ORCHARDS| 35 | 34|2|4892|SWAINSFIELD ROAD| 36 | 35|2|4882|HEXHAM QUAY| 37 | 36|2|4568|WEST VIEW WILLOWS| 38 | 37|2|1709|WAVERLEY GARDEN| 39 | 38|2|467|AUSTIN COMMON| 40 | 39|2|157|FIFE HILL| 41 | 40|2|685|MUNRO VILLAGE| 42 | 41|2|1961|FORT LODGE| 43 | 42|2|998|LOMBARD CROFT| 44 | 43|2|979|CONIFER CRESCENT| 45 | 44|2|2748|MONTPELIER RISE| 46 | 45|2|4291|ALMOND PADDOCKS| 47 | 46|2|4456|HAMILTON WOODLANDS| 48 | 47|2|122|MORETON NEWYDD| 49 | 48|2|1098|PITTSBURGH ROAD| 50 | 49|2|4664|HAMPSTEAD SIDE| 51 | 50|2|3263|EPPING GARDENS| 52 | 51|2|3397|PEARSON MEADOW| 53 | 52|2|808|MEAD MILL| 54 | 53|2|3447|BRAESIDE BUILDINGS| 55 | 54|2|1529|LYNN GAIT| 56 | 55|2|1368|WALLACE NOOK| 57 | 56|2|742|NEW BARNES AVENUE| 58 | 57|2|1655|FOSTER ROAD| 59 | 58|2|4583|STONY WOOD| 60 | 59|2|2634|DYKE HOUSE LANE| 61 | 60|2|1456|ASHURST LAS| 62 | 61|2|4709|HAMILTON STREET| 63 | 62|2|3673|KEATS COPPICE| 64 | 63|2|4493|RAMSAY RIDE| 65 | 64|2|4032|SANDPIPER QUADRANT| 66 | 65|2|4507|JOSEPH CLOSE| 67 | 66|2|152|KILLIERSFIELD| 68 | 67|2|2059|FOUNTAIN RIDGEWAY| 69 | 68|2|4694|MIDLAND LANE| 70 | 69|2|835|PRESTON CORNER| 71 | 70|2|1425|SOLWAY SPUR| 72 | 71|2|3773|MAIN CEDARS| 73 | 72|2|2525|MAYFAIR FAIRWAY| 74 | 73|2|2264|BEACON SQUARE| 75 | 74|2|4116|LION SPRINGS| 76 | 75|2|2688|FOLDHILL LANE| 77 | 76|2|3157|ATTLEE GATE| 78 | 77|2|4335|EWART CRESCENT| 79 | 78|2|179|BOYES CRESCENT| 80 | 79|2|891|FURNESS FOLD| 81 | 80|2|1281|GREENFIELDS MOOR| 82 | 81|2|2812|NAIRN CROFT| 83 | 82|2|3982|ALDERS BROOK| 84 | 83|2|4670|WEST PARK BY-PASS| 85 | 84|2|3406|MARSHMONT AVENUE| 86 | 85|2|1109|NOBLE BROW| 87 | 86|2|3530|CALEDONIA CREST| 88 | 87|2|4025|SYMONDS CLOSE| 89 | 88|2|3244|BROOMHILL GARDENS| 90 | 89|2|4804|JACKSON HOLLIES| 91 | 90|2|1086|CAERNARVON GRANGE| 92 | 91|2|1205|SHIRLEY LINKS| 93 | 92|2|3141|SINCLAIR HALL| 94 | 93|2|4543|TOURTEL ROAD| 95 | 94|2|764|BLACKACRE| 96 | 95|2|1334|CANBERRA HAVEN| 97 | 96|2|1188|DAIRY ESPLANADE| 98 | 97|2|1132|DRONLEY ROAD| 99 | 98|2|3787|RECTORY PIECE| 100 | 99|2|988|REDWOOD MALTINGS| 101 | 100|2|4062|GREENFIELD CIRCUS| 102 | 101|2|2512|PRINCE STREET| 103 | 102|2|3760|CANAL LANE| 104 | 103|2|3454|WINTER WAY| 105 | 104|2|4930|PEACE STREET| 106 | 
105|2|4142|LUNA WAY| 107 | 106|2|2732|SOUTH LANE| 108 | 107|2|3098|OAK ROW| 109 | 108|2|1705|PINE WAY| 110 | 109|2|4802|ANGEL BOULEVARD| 111 | 110|2|1472|HAZELNUT STREET| -------------------------------------------------------------------------------- /sample-data/CLIENT.EBCDIC-0.txt: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/mainframe-data-utilities/fc2aa9ed14186219ba969e3b92bdaf6e4f28d570/sample-data/CLIENT.EBCDIC-0.txt -------------------------------------------------------------------------------- /sample-data/CLIENT.EBCDIC-1.txt: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/mainframe-data-utilities/fc2aa9ed14186219ba969e3b92bdaf6e4f28d570/sample-data/CLIENT.EBCDIC-1.txt -------------------------------------------------------------------------------- /sample-data/CLIENT.EBCDIC-2.txt: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/mainframe-data-utilities/fc2aa9ed14186219ba969e3b92bdaf6e4f28d570/sample-data/CLIENT.EBCDIC-2.txt -------------------------------------------------------------------------------- /sample-data/CLIENT.EBCDIC.txt: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/mainframe-data-utilities/fc2aa9ed14186219ba969e3b92bdaf6e4f28d570/sample-data/CLIENT.EBCDIC.txt -------------------------------------------------------------------------------- /sample-data/CLIENT.json: -------------------------------------------------------------------------------- 1 | { 2 | "input": "", 3 | "output": "CLIENT", 4 | "partkname": "CLIENT-ID", 5 | "sortkname": "CLIENT-R-TYPE", 6 | "output-type": "ddb", 7 | "recfm": "fb", 8 | "req-size": 10, 9 | "print": 20, 10 | "max": 0, 11 | "skip": 0, 12 | "lrecl": 500, 13 | "partklen": 4, 14 | "sortklen": 2, 15 | "rem-low-values": true, 16 | "separator": "|", 17 | "transf-rule": [ 18 | { 19 | "offset": 4, 20 | "size": 2, 21 | "hex": "0002", 22 | "transf": "transf1" 23 | }, 24 | { 25 | "offset": 4, 26 | "size": 2, 27 | "hex": "0000", 28 | "transf": "transf2" 29 | } 30 | ], 31 | "transf": [ 32 | { 33 | "type": "bi", 34 | "bytes": 4, 35 | "offset": 0, 36 | "dplaces": 0, 37 | "name": "CLIENT-ID", 38 | "part-key": true, 39 | "sort-key": false 40 | }, 41 | { 42 | "type": "bi", 43 | "bytes": 2, 44 | "offset": 4, 45 | "dplaces": 0, 46 | "name": "CLIENT-TYPE", 47 | "part-key": false, 48 | "sort-key": true 49 | }, 50 | { 51 | "type": "ch", 52 | "bytes": 30, 53 | "offset": 6, 54 | "dplaces": 0, 55 | "name": "CLIENT-NAME", 56 | "part-key": false, 57 | "sort-key": false 58 | }, 59 | { 60 | "type": "ch", 61 | "bytes": 10, 62 | "offset": 36, 63 | "dplaces": 0, 64 | "name": "CLIENT-BDATE", 65 | "part-key": false, 66 | "sort-key": false 67 | }, 68 | { 69 | "type": "ch", 70 | "bytes": 10, 71 | "offset": 46, 72 | "dplaces": 0, 73 | "name": "CLIENT-ED-LVL", 74 | "part-key": false, 75 | "sort-key": false 76 | }, 77 | { 78 | "type": "pd", 79 | "bytes": 5, 80 | "offset": 56, 81 | "dplaces": 2, 82 | "name": "CLIENT-INCOME", 83 | "part-key": false, 84 | "sort-key": false 85 | }, 86 | { 87 | "type": "ch", 88 | "bytes": 439, 89 | "offset": 61, 90 | "dplaces": 0, 91 | "name": "FILLER_1", 92 | "part-key": false, 93 | "sort-key": false 94 | } 95 | ], 96 | "transf1": [ 97 | { 98 | "type": "bi", 99 | "bytes": 4, 100 | "offset": 0, 101 | "dplaces": 0, 102 | "name": 
"CLIENT-ID", 103 | "part-key": true, 104 | "sort-key": false 105 | }, 106 | { 107 | "type": "bi", 108 | "bytes": 2, 109 | "offset": 4, 110 | "dplaces": 0, 111 | "name": "CLIENT-TYPE", 112 | "part-key": false, 113 | "sort-key": true 114 | }, 115 | { 116 | "type": "bi", 117 | "bytes": 4, 118 | "offset": 6, 119 | "dplaces": 0, 120 | "name": "CLIENT-ADDR-NUMBER", 121 | "part-key": false, 122 | "sort-key": false 123 | }, 124 | { 125 | "type": "ch", 126 | "bytes": 40, 127 | "offset": 10, 128 | "dplaces": 0, 129 | "name": "CLIENT-ADDR-STREET", 130 | "part-key": false, 131 | "sort-key": false 132 | }, 133 | { 134 | "type": "ch", 135 | "bytes": 450, 136 | "offset": 50, 137 | "dplaces": 0, 138 | "name": "FILLER_2", 139 | "part-key": false, 140 | "sort-key": false 141 | } 142 | ], 143 | "transf2": [ 144 | { 145 | "type": "bi", 146 | "bytes": 4, 147 | "offset": 0, 148 | "dplaces": 0, 149 | "name": "CLIENT-ID", 150 | "part-key": true, 151 | "sort-key": false 152 | }, 153 | { 154 | "type": "bi", 155 | "bytes": 2, 156 | "offset": 4, 157 | "dplaces": 0, 158 | "name": "CLIENT-TYPE", 159 | "part-key": false, 160 | "sort-key": true 161 | }, 162 | { 163 | "type": "bi", 164 | "bytes": 4, 165 | "offset": 6, 166 | "dplaces": 0, 167 | "name": "CLIENT-RECORD-COUNT", 168 | "part-key": false, 169 | "sort-key": false 170 | }, 171 | { 172 | "type": "ch", 173 | "bytes": 490, 174 | "offset": 10, 175 | "dplaces": 0, 176 | "name": "FILLER_3", 177 | "part-key": false, 178 | "sort-key": false 179 | } 180 | ] 181 | } -------------------------------------------------------------------------------- /sample-data/COBKS05-ddb-rules.json: -------------------------------------------------------------------------------- 1 | { 2 | "input": "sample-data/CLIENT.EBCDIC.txt", 3 | "output": "CLIENT", 4 | "partkname": "CLIENT-ID", 5 | "sortkname": "CLIENT-R-TYPE", 6 | "output-type": "ddb", 7 | "output-s3key": "", 8 | "output-s3bkt": "", 9 | "recfm": "fb", 10 | "req-size": 10, 11 | "print": 20, 12 | "max": 0, 13 | "skip": 0, 14 | "lrecl": 500, 15 | "partklen": 4, 16 | "sortklen": 2, 17 | "rem-low-values": true, 18 | "separator": "|", 19 | "transf-rule": [ 20 | { 21 | "offset": 4, 22 | "size": 2, 23 | "hex": "0002", 24 | "transf": "transf1" 25 | }, 26 | { 27 | "offset": 4, 28 | "size": 2, 29 | "hex": "0000", 30 | "transf": "transf2" 31 | } 32 | ], 33 | "transf": [ 34 | { 35 | "type": "bi", 36 | "bytes": 4, 37 | "offset": 0, 38 | "dplaces": 0, 39 | "name": "CLIENT-ID", 40 | "part-key": true, 41 | "sort-key": false 42 | }, 43 | { 44 | "type": "bi", 45 | "bytes": 2, 46 | "offset": 4, 47 | "dplaces": 0, 48 | "name": "CLIENT-TYPE", 49 | "part-key": false, 50 | "sort-key": true 51 | }, 52 | { 53 | "type": "ch", 54 | "bytes": 30, 55 | "offset": 6, 56 | "dplaces": 0, 57 | "name": "CLIENT-NAME", 58 | "part-key": false, 59 | "sort-key": false 60 | }, 61 | { 62 | "type": "ch", 63 | "bytes": 10, 64 | "offset": 36, 65 | "dplaces": 0, 66 | "name": "CLIENT-BDATE", 67 | "part-key": false, 68 | "sort-key": false 69 | }, 70 | { 71 | "type": "ch", 72 | "bytes": 10, 73 | "offset": 46, 74 | "dplaces": 0, 75 | "name": "CLIENT-ED-LVL", 76 | "part-key": false, 77 | "sort-key": false 78 | }, 79 | { 80 | "type": "pd", 81 | "bytes": 5, 82 | "offset": 56, 83 | "dplaces": 2, 84 | "name": "CLIENT-INCOME", 85 | "part-key": false, 86 | "sort-key": false 87 | }, 88 | { 89 | "type": "ch", 90 | "bytes": 439, 91 | "offset": 61, 92 | "dplaces": 0, 93 | "name": "FILLER_1", 94 | "part-key": false, 95 | "sort-key": false 96 | } 97 | ], 98 | "transf1": [ 99 | { 100 | "type": "bi", 
101 | "bytes": 4, 102 | "offset": 0, 103 | "dplaces": 0, 104 | "name": "CLIENT-ID", 105 | "part-key": true, 106 | "sort-key": false 107 | }, 108 | { 109 | "type": "bi", 110 | "bytes": 2, 111 | "offset": 4, 112 | "dplaces": 0, 113 | "name": "CLIENT-TYPE", 114 | "part-key": false, 115 | "sort-key": true 116 | }, 117 | { 118 | "type": "bi", 119 | "bytes": 4, 120 | "offset": 6, 121 | "dplaces": 0, 122 | "name": "CLIENT-ADDR-NUMBER", 123 | "part-key": false, 124 | "sort-key": false 125 | }, 126 | { 127 | "type": "ch", 128 | "bytes": 40, 129 | "offset": 10, 130 | "dplaces": 0, 131 | "name": "CLIENT-ADDR-STREET", 132 | "part-key": false, 133 | "sort-key": false 134 | }, 135 | { 136 | "type": "ch", 137 | "bytes": 450, 138 | "offset": 50, 139 | "dplaces": 0, 140 | "name": "FILLER_2", 141 | "part-key": false, 142 | "sort-key": false 143 | } 144 | ], 145 | "transf2": [ 146 | { 147 | "type": "bi", 148 | "bytes": 4, 149 | "offset": 0, 150 | "dplaces": 0, 151 | "name": "CLIENT-ID", 152 | "part-key": true, 153 | "sort-key": false 154 | }, 155 | { 156 | "type": "bi", 157 | "bytes": 2, 158 | "offset": 4, 159 | "dplaces": 0, 160 | "name": "CLIENT-TYPE", 161 | "part-key": false, 162 | "sort-key": true 163 | }, 164 | { 165 | "type": "bi", 166 | "bytes": 4, 167 | "offset": 6, 168 | "dplaces": 0, 169 | "name": "CLIENT-RECORD-COUNT", 170 | "part-key": false, 171 | "sort-key": false 172 | }, 173 | { 174 | "type": "ch", 175 | "bytes": 490, 176 | "offset": 10, 177 | "dplaces": 0, 178 | "name": "FILLER_3", 179 | "part-key": false, 180 | "sort-key": false 181 | } 182 | ] 183 | } -------------------------------------------------------------------------------- /sample-data/COBKS05-ddb-s3.json: -------------------------------------------------------------------------------- 1 | { 2 | "input": "s3://your-bucket-name/CLIENT.EBCDIC.txt", 3 | "output": "CLIENT", 4 | "partkname": "CLIENT-ID", 5 | "sortkname": "CLIENT-R-TYPE", 6 | "output-type": "ddb", 7 | "recfm": "fb", 8 | "req-size": 10, 9 | "print": 20, 10 | "max": 0, 11 | "skip": 0, 12 | "lrecl": 500, 13 | "partklen": 4, 14 | "sortklen": 2, 15 | "rem-low-values": true, 16 | "separator": "|", 17 | "transf-rule": [ 18 | { 19 | "offset": 4, 20 | "size": 2, 21 | "hex": "0002", 22 | "transf": "transf1" 23 | }, 24 | { 25 | "offset": 4, 26 | "size": 2, 27 | "hex": "0000", 28 | "transf": "transf2" 29 | } 30 | ], 31 | "transf": [ 32 | { 33 | "type": "bi", 34 | "bytes": 4, 35 | "offset": 0, 36 | "dplaces": 0, 37 | "name": "CLIENT-ID", 38 | "part-key": true, 39 | "sort-key": false 40 | }, 41 | { 42 | "type": "bi", 43 | "bytes": 2, 44 | "offset": 4, 45 | "dplaces": 0, 46 | "name": "CLIENT-TYPE", 47 | "part-key": false, 48 | "sort-key": true 49 | }, 50 | { 51 | "type": "ch", 52 | "bytes": 30, 53 | "offset": 6, 54 | "dplaces": 0, 55 | "name": "CLIENT-NAME", 56 | "part-key": false, 57 | "sort-key": false 58 | }, 59 | { 60 | "type": "ch", 61 | "bytes": 10, 62 | "offset": 36, 63 | "dplaces": 0, 64 | "name": "CLIENT-BDATE", 65 | "part-key": false, 66 | "sort-key": false 67 | }, 68 | { 69 | "type": "ch", 70 | "bytes": 10, 71 | "offset": 46, 72 | "dplaces": 0, 73 | "name": "CLIENT-ED-LVL", 74 | "part-key": false, 75 | "sort-key": false 76 | }, 77 | { 78 | "type": "pd", 79 | "bytes": 5, 80 | "offset": 56, 81 | "dplaces": 2, 82 | "name": "CLIENT-INCOME", 83 | "part-key": false, 84 | "sort-key": false 85 | }, 86 | { 87 | "type": "ch", 88 | "bytes": 439, 89 | "offset": 61, 90 | "dplaces": 0, 91 | "name": "FILLER_1", 92 | "part-key": false, 93 | "sort-key": false 94 | } 95 | ], 96 | "transf1": [ 
97 | { 98 | "type": "bi", 99 | "bytes": 4, 100 | "offset": 0, 101 | "dplaces": 0, 102 | "name": "CLIENT-ID", 103 | "part-key": true, 104 | "sort-key": false 105 | }, 106 | { 107 | "type": "bi", 108 | "bytes": 2, 109 | "offset": 4, 110 | "dplaces": 0, 111 | "name": "CLIENT-TYPE", 112 | "part-key": false, 113 | "sort-key": true 114 | }, 115 | { 116 | "type": "bi", 117 | "bytes": 4, 118 | "offset": 6, 119 | "dplaces": 0, 120 | "name": "CLIENT-ADDR-NUMBER", 121 | "part-key": false, 122 | "sort-key": false 123 | }, 124 | { 125 | "type": "ch", 126 | "bytes": 40, 127 | "offset": 10, 128 | "dplaces": 0, 129 | "name": "CLIENT-ADDR-STREET", 130 | "part-key": false, 131 | "sort-key": false 132 | }, 133 | { 134 | "type": "ch", 135 | "bytes": 450, 136 | "offset": 50, 137 | "dplaces": 0, 138 | "name": "FILLER_2", 139 | "part-key": false, 140 | "sort-key": false 141 | } 142 | ], 143 | "transf2": [ 144 | { 145 | "type": "bi", 146 | "bytes": 4, 147 | "offset": 0, 148 | "dplaces": 0, 149 | "name": "CLIENT-ID", 150 | "part-key": true, 151 | "sort-key": false 152 | }, 153 | { 154 | "type": "bi", 155 | "bytes": 2, 156 | "offset": 4, 157 | "dplaces": 0, 158 | "name": "CLIENT-TYPE", 159 | "part-key": false, 160 | "sort-key": true 161 | }, 162 | { 163 | "type": "bi", 164 | "bytes": 4, 165 | "offset": 6, 166 | "dplaces": 0, 167 | "name": "CLIENT-RECORD-COUNT", 168 | "part-key": false, 169 | "sort-key": false 170 | }, 171 | { 172 | "type": "ch", 173 | "bytes": 490, 174 | "offset": 10, 175 | "dplaces": 0, 176 | "name": "FILLER_3", 177 | "part-key": false, 178 | "sort-key": false 179 | } 180 | ] 181 | } -------------------------------------------------------------------------------- /sample-data/COBKS05-ddb.json: -------------------------------------------------------------------------------- 1 | { 2 | "input": "sample-data/CLIENT.EBCDIC.txt", 3 | "output": "CLIENT", 4 | "partkname": "CLIENT-ID", 5 | "sortkname": "CLIENT-R-TYPE", 6 | "output-type": "ddb", 7 | "output-s3key": "", 8 | "output-s3bkt": "", 9 | "recfm": "fb", 10 | "req-size": 10, 11 | "print": 20, 12 | "max": 0, 13 | "skip": 0, 14 | "lrecl": 500, 15 | "partklen": 4, 16 | "sortklen": 2, 17 | "rem-low-values": true, 18 | "separator": "|", 19 | "transf-rule": [], 20 | "transf": [ 21 | { 22 | "type": "bi", 23 | "bytes": 4, 24 | "offset": 0, 25 | "dplaces": 0, 26 | "name": "CLIENT-ID", 27 | "part-key": true, 28 | "sort-key": false 29 | }, 30 | { 31 | "type": "bi", 32 | "bytes": 2, 33 | "offset": 4, 34 | "dplaces": 0, 35 | "name": "CLIENT-TYPE", 36 | "part-key": false, 37 | "sort-key": true 38 | }, 39 | { 40 | "type": "ch", 41 | "bytes": 30, 42 | "offset": 6, 43 | "dplaces": 0, 44 | "name": "CLIENT-NAME", 45 | "part-key": false, 46 | "sort-key": false 47 | }, 48 | { 49 | "type": "ch", 50 | "bytes": 10, 51 | "offset": 36, 52 | "dplaces": 0, 53 | "name": "CLIENT-BDATE", 54 | "part-key": false, 55 | "sort-key": false 56 | }, 57 | { 58 | "type": "ch", 59 | "bytes": 10, 60 | "offset": 46, 61 | "dplaces": 0, 62 | "name": "CLIENT-ED-LVL", 63 | "part-key": false, 64 | "sort-key": false 65 | }, 66 | { 67 | "type": "pd", 68 | "bytes": 5, 69 | "offset": 56, 70 | "dplaces": 2, 71 | "name": "CLIENT-INCOME", 72 | "part-key": false, 73 | "sort-key": false 74 | }, 75 | { 76 | "type": "ch", 77 | "bytes": 439, 78 | "offset": 61, 79 | "dplaces": 0, 80 | "name": "FILLER_1", 81 | "part-key": false, 82 | "sort-key": false 83 | } 84 | ], 85 | "transf1": [ 86 | { 87 | "type": "bi", 88 | "bytes": 4, 89 | "offset": 0, 90 | "dplaces": 0, 91 | "name": "CLIENT-ID", 92 | "part-key": true, 
93 | "sort-key": false 94 | }, 95 | { 96 | "type": "bi", 97 | "bytes": 2, 98 | "offset": 4, 99 | "dplaces": 0, 100 | "name": "CLIENT-TYPE", 101 | "part-key": false, 102 | "sort-key": true 103 | }, 104 | { 105 | "type": "bi", 106 | "bytes": 4, 107 | "offset": 6, 108 | "dplaces": 0, 109 | "name": "CLIENT-ADDR-NUMBER", 110 | "part-key": false, 111 | "sort-key": false 112 | }, 113 | { 114 | "type": "ch", 115 | "bytes": 40, 116 | "offset": 10, 117 | "dplaces": 0, 118 | "name": "CLIENT-ADDR-STREET", 119 | "part-key": false, 120 | "sort-key": false 121 | }, 122 | { 123 | "type": "ch", 124 | "bytes": 450, 125 | "offset": 50, 126 | "dplaces": 0, 127 | "name": "FILLER_2", 128 | "part-key": false, 129 | "sort-key": false 130 | } 131 | ], 132 | "transf2": [ 133 | { 134 | "type": "bi", 135 | "bytes": 4, 136 | "offset": 0, 137 | "dplaces": 0, 138 | "name": "CLIENT-ID", 139 | "part-key": true, 140 | "sort-key": false 141 | }, 142 | { 143 | "type": "bi", 144 | "bytes": 2, 145 | "offset": 4, 146 | "dplaces": 0, 147 | "name": "CLIENT-TYPE", 148 | "part-key": false, 149 | "sort-key": true 150 | }, 151 | { 152 | "type": "bi", 153 | "bytes": 4, 154 | "offset": 6, 155 | "dplaces": 0, 156 | "name": "CLIENT-RECORD-COUNT", 157 | "part-key": false, 158 | "sort-key": false 159 | }, 160 | { 161 | "type": "ch", 162 | "bytes": 490, 163 | "offset": 10, 164 | "dplaces": 0, 165 | "name": "FILLER_3", 166 | "part-key": false, 167 | "sort-key": false 168 | } 169 | ] 170 | } -------------------------------------------------------------------------------- /sample-data/COBKS05-dict.json: -------------------------------------------------------------------------------- 1 | { 2 | "REC-CLIENT": { 3 | "id": 1, 4 | "level": 1, 5 | "group": true, 6 | "CLIENT-KEY": { 7 | "id": 2, 8 | "level": 3, 9 | "group": true, 10 | "CLIENT-ID": { 11 | "id": 3, 12 | "level": 5, 13 | "group": false, 14 | "pict": "9(009)", 15 | "type": "bi", 16 | "length": 9, 17 | "bytes": 4, 18 | "dplaces": 0 19 | }, 20 | "CLIENT-TYPE": { 21 | "id": 4, 22 | "level": 5, 23 | "group": false, 24 | "pict": "9(004)", 25 | "type": "bi", 26 | "length": 4, 27 | "bytes": 2, 28 | "dplaces": 0 29 | } 30 | }, 31 | "CLIENT-MAIN": { 32 | "id": 5, 33 | "level": 3, 34 | "group": true, 35 | "CLIENT-NAME": { 36 | "id": 6, 37 | "level": 5, 38 | "group": false, 39 | "pict": "X(030)", 40 | "type": "ch", 41 | "length": 30, 42 | "bytes": 30, 43 | "dplaces": 0 44 | }, 45 | "CLIENT-BDATE": { 46 | "id": 7, 47 | "level": 5, 48 | "group": false, 49 | "pict": "X(010)", 50 | "type": "ch", 51 | "length": 10, 52 | "bytes": 10, 53 | "dplaces": 0 54 | }, 55 | "CLIENT-ED-LVL": { 56 | "id": 8, 57 | "level": 5, 58 | "group": false, 59 | "pict": "X(010)", 60 | "type": "ch", 61 | "length": 10, 62 | "bytes": 10, 63 | "dplaces": 0 64 | }, 65 | "CLIENT-INCOME": { 66 | "id": 9, 67 | "level": 5, 68 | "group": false, 69 | "pict": "9(007)V99", 70 | "type": "pd", 71 | "length": 9, 72 | "bytes": 5, 73 | "dplaces": 2 74 | }, 75 | "FILLER_1": { 76 | "id": 10, 77 | "level": 5, 78 | "group": false, 79 | "pict": "X(439)", 80 | "type": "ch", 81 | "length": 439, 82 | "bytes": 439, 83 | "dplaces": 0 84 | } 85 | }, 86 | "CLIENT-ADDRESS": { 87 | "id": 11, 88 | "level": 3, 89 | "group": true, 90 | "redefines": "CLIENT-MAIN", 91 | "CLIENT-ADDR-NUMBER": { 92 | "id": 12, 93 | "level": 5, 94 | "group": false, 95 | "pict": "9(009)", 96 | "type": "bi", 97 | "length": 9, 98 | "bytes": 4, 99 | "dplaces": 0 100 | }, 101 | "CLIENT-ADDR-STREET": { 102 | "id": 13, 103 | "level": 5, 104 | "group": false, 105 | "pict": "X(040)", 106 | 
"type": "ch", 107 | "length": 40, 108 | "bytes": 40, 109 | "dplaces": 0 110 | }, 111 | "FILLER_2": { 112 | "id": 14, 113 | "level": 5, 114 | "group": false, 115 | "pict": "X(450)", 116 | "type": "ch", 117 | "length": 450, 118 | "bytes": 450, 119 | "dplaces": 0 120 | } 121 | }, 122 | "CLIENT-HEADER": { 123 | "id": 15, 124 | "level": 3, 125 | "group": true, 126 | "redefines": "CLIENT-MAIN", 127 | "CLIENT-RECORD-COUNT": { 128 | "id": 16, 129 | "level": 5, 130 | "group": false, 131 | "pict": "9(009)", 132 | "type": "bi", 133 | "length": 9, 134 | "bytes": 4, 135 | "dplaces": 0 136 | }, 137 | "FILLER_3": { 138 | "id": 17, 139 | "level": 5, 140 | "group": false, 141 | "pict": "X(490)", 142 | "type": "ch", 143 | "length": 490, 144 | "bytes": 490, 145 | "dplaces": 0 146 | } 147 | } 148 | } 149 | } -------------------------------------------------------------------------------- /sample-data/COBKS05-list-2.json: -------------------------------------------------------------------------------- 1 | { 2 | "input": "sample-data/CLIENT.EBCDIC-2.txt", 3 | "output": "sample-data/CLIENT.ASCII-2.txt", 4 | "partkname": "", 5 | "sortkname": "", 6 | "output-type": "file", 7 | "req-size": 10, 8 | "print": 20, 9 | "max": 0, 10 | "skip": 0, 11 | "lrecl": 500, 12 | "partklen": 0, 13 | "sortklen": 0, 14 | "rem-low-values": true, 15 | "separator": "|", 16 | "transf-rule": [ 17 | { 18 | "offset": 4, 19 | "size": 2, 20 | "hex": "0002", 21 | "transf": "transf1" 22 | }, 23 | { 24 | "offset": 4, 25 | "size": 2, 26 | "hex": "0000", 27 | "transf": "transf2" 28 | } 29 | ], 30 | "transf": [ 31 | { 32 | "type": "bi", 33 | "bytes": 4, 34 | "offset": 0, 35 | "dplaces": 0, 36 | "name": "CLIENT-ID", 37 | "part-key": false, 38 | "sort-key": false 39 | }, 40 | { 41 | "type": "bi", 42 | "bytes": 2, 43 | "offset": 4, 44 | "dplaces": 0, 45 | "name": "CLIENT-TYPE", 46 | "part-key": false, 47 | "sort-key": false 48 | }, 49 | { 50 | "type": "ch", 51 | "bytes": 30, 52 | "offset": 6, 53 | "dplaces": 0, 54 | "name": "CLIENT-NAME", 55 | "part-key": false, 56 | "sort-key": false 57 | }, 58 | { 59 | "type": "ch", 60 | "bytes": 10, 61 | "offset": 36, 62 | "dplaces": 0, 63 | "name": "CLIENT-BDATE", 64 | "part-key": false, 65 | "sort-key": false 66 | }, 67 | { 68 | "type": "ch", 69 | "bytes": 10, 70 | "offset": 46, 71 | "dplaces": 0, 72 | "name": "CLIENT-ED-LVL", 73 | "part-key": false, 74 | "sort-key": false 75 | }, 76 | { 77 | "type": "pd", 78 | "bytes": 5, 79 | "offset": 56, 80 | "dplaces": 2, 81 | "name": "CLIENT-INCOME", 82 | "part-key": false, 83 | "sort-key": false 84 | }, 85 | { 86 | "type": "ch", 87 | "bytes": 439, 88 | "offset": 61, 89 | "dplaces": 0, 90 | "name": "FILLER_1", 91 | "part-key": false, 92 | "sort-key": false 93 | } 94 | ], 95 | "transf1": [ 96 | { 97 | "type": "bi", 98 | "bytes": 4, 99 | "offset": 0, 100 | "dplaces": 0, 101 | "name": "CLIENT-ID", 102 | "part-key": false, 103 | "sort-key": false 104 | }, 105 | { 106 | "type": "bi", 107 | "bytes": 2, 108 | "offset": 4, 109 | "dplaces": 0, 110 | "name": "CLIENT-TYPE", 111 | "part-key": false, 112 | "sort-key": false 113 | }, 114 | { 115 | "type": "bi", 116 | "bytes": 4, 117 | "offset": 6, 118 | "dplaces": 0, 119 | "name": "CLIENT-ADDR-NUMBER", 120 | "part-key": false, 121 | "sort-key": false 122 | }, 123 | { 124 | "type": "ch", 125 | "bytes": 40, 126 | "offset": 10, 127 | "dplaces": 0, 128 | "name": "CLIENT-ADDR-STREET", 129 | "part-key": false, 130 | "sort-key": false 131 | }, 132 | { 133 | "type": "ch", 134 | "bytes": 450, 135 | "offset": 50, 136 | "dplaces": 0, 137 | "name": 
"FILLER_2", 138 | "part-key": false, 139 | "sort-key": false 140 | } 141 | ], 142 | "transf2": [ 143 | { 144 | "type": "bi", 145 | "bytes": 4, 146 | "offset": 0, 147 | "dplaces": 0, 148 | "name": "CLIENT-ID", 149 | "part-key": false, 150 | "sort-key": false 151 | }, 152 | { 153 | "type": "bi", 154 | "bytes": 2, 155 | "offset": 4, 156 | "dplaces": 0, 157 | "name": "CLIENT-TYPE", 158 | "part-key": false, 159 | "sort-key": false 160 | }, 161 | { 162 | "type": "bi", 163 | "bytes": 4, 164 | "offset": 6, 165 | "dplaces": 0, 166 | "name": "CLIENT-RECORD-COUNT", 167 | "part-key": false, 168 | "sort-key": false 169 | }, 170 | { 171 | "type": "ch", 172 | "bytes": 490, 173 | "offset": 10, 174 | "dplaces": 0, 175 | "name": "FILLER_3", 176 | "part-key": false, 177 | "sort-key": false 178 | } 179 | ] 180 | } -------------------------------------------------------------------------------- /sample-data/COBKS05-list-s3-rules.json: -------------------------------------------------------------------------------- 1 | { 2 | "input": "sample-data/CLIENT.EBCDIC.txt", 3 | "output": "asciifile.txt", 4 | "partkname": "", 5 | "sortkname": "", 6 | "output-type": "s3", 7 | "output-s3key": "CLIENT.ASCII.txt", 8 | "output-s3bkt": "your-bucket-name", 9 | "recfm": "fb", 10 | "req-size": 10, 11 | "print": 20, 12 | "max": 0, 13 | "skip": 0, 14 | "lrecl": 500, 15 | "partklen": 0, 16 | "sortklen": 0, 17 | "rem-low-values": true, 18 | "separator": "|", 19 | "transf-rule": [ 20 | { 21 | "offset": 4, 22 | "size": 2, 23 | "hex": "0002", 24 | "transf": "transf1" 25 | }, 26 | { 27 | "offset": 4, 28 | "size": 2, 29 | "hex": "0000", 30 | "transf": "transf2" 31 | } 32 | ], 33 | "transf": [ 34 | { 35 | "type": "bi", 36 | "bytes": 4, 37 | "offset": 0, 38 | "dplaces": 0, 39 | "name": "CLIENT-ID", 40 | "part-key": false, 41 | "sort-key": false 42 | }, 43 | { 44 | "type": "bi", 45 | "bytes": 2, 46 | "offset": 4, 47 | "dplaces": 0, 48 | "name": "CLIENT-TYPE", 49 | "part-key": false, 50 | "sort-key": false 51 | }, 52 | { 53 | "type": "ch", 54 | "bytes": 30, 55 | "offset": 6, 56 | "dplaces": 0, 57 | "name": "CLIENT-NAME", 58 | "part-key": false, 59 | "sort-key": false 60 | }, 61 | { 62 | "type": "ch", 63 | "bytes": 10, 64 | "offset": 36, 65 | "dplaces": 0, 66 | "name": "CLIENT-BDATE", 67 | "part-key": false, 68 | "sort-key": false 69 | }, 70 | { 71 | "type": "ch", 72 | "bytes": 10, 73 | "offset": 46, 74 | "dplaces": 0, 75 | "name": "CLIENT-ED-LVL", 76 | "part-key": false, 77 | "sort-key": false 78 | }, 79 | { 80 | "type": "pd", 81 | "bytes": 5, 82 | "offset": 56, 83 | "dplaces": 2, 84 | "name": "CLIENT-INCOME", 85 | "part-key": false, 86 | "sort-key": false 87 | }, 88 | { 89 | "type": "ch", 90 | "bytes": 439, 91 | "offset": 61, 92 | "dplaces": 0, 93 | "name": "FILLER_1", 94 | "part-key": false, 95 | "sort-key": false 96 | } 97 | ], 98 | "transf1": [ 99 | { 100 | "type": "bi", 101 | "bytes": 4, 102 | "offset": 0, 103 | "dplaces": 0, 104 | "name": "CLIENT-ID", 105 | "part-key": false, 106 | "sort-key": false 107 | }, 108 | { 109 | "type": "bi", 110 | "bytes": 2, 111 | "offset": 4, 112 | "dplaces": 0, 113 | "name": "CLIENT-TYPE", 114 | "part-key": false, 115 | "sort-key": false 116 | }, 117 | { 118 | "type": "bi", 119 | "bytes": 4, 120 | "offset": 6, 121 | "dplaces": 0, 122 | "name": "CLIENT-ADDR-NUMBER", 123 | "part-key": false, 124 | "sort-key": false 125 | }, 126 | { 127 | "type": "ch", 128 | "bytes": 40, 129 | "offset": 10, 130 | "dplaces": 0, 131 | "name": "CLIENT-ADDR-STREET", 132 | "part-key": false, 133 | "sort-key": false 134 | }, 135 | 
{ 136 | "type": "ch", 137 | "bytes": 450, 138 | "offset": 50, 139 | "dplaces": 0, 140 | "name": "FILLER_2", 141 | "part-key": false, 142 | "sort-key": false 143 | } 144 | ], 145 | "transf2": [ 146 | { 147 | "type": "bi", 148 | "bytes": 4, 149 | "offset": 0, 150 | "dplaces": 0, 151 | "name": "CLIENT-ID", 152 | "part-key": false, 153 | "sort-key": false 154 | }, 155 | { 156 | "type": "bi", 157 | "bytes": 2, 158 | "offset": 4, 159 | "dplaces": 0, 160 | "name": "CLIENT-TYPE", 161 | "part-key": false, 162 | "sort-key": false 163 | }, 164 | { 165 | "type": "bi", 166 | "bytes": 4, 167 | "offset": 6, 168 | "dplaces": 0, 169 | "name": "CLIENT-RECORD-COUNT", 170 | "part-key": false, 171 | "sort-key": false 172 | }, 173 | { 174 | "type": "ch", 175 | "bytes": 490, 176 | "offset": 10, 177 | "dplaces": 0, 178 | "name": "FILLER_3", 179 | "part-key": false, 180 | "sort-key": false 181 | } 182 | ] 183 | } -------------------------------------------------------------------------------- /sample-data/COBKS05-list-s3.json: -------------------------------------------------------------------------------- 1 | { 2 | "input": "sample-data/CLIENT.EBCDIC.txt", 3 | "output": "asciifile.txt", 4 | "partkname": "", 5 | "sortkname": "", 6 | "output-type": "s3", 7 | "output-s3key": "CLIENT.ASCII.txt", 8 | "output-s3bkt": "your-bucket-name", 9 | "recfm": "fb", 10 | "req-size": 10, 11 | "print": 20, 12 | "max": 0, 13 | "skip": 0, 14 | "lrecl": 500, 15 | "partklen": 0, 16 | "sortklen": 0, 17 | "rem-low-values": true, 18 | "separator": "|", 19 | "transf-rule": [ 20 | { 21 | "offset": 4, 22 | "size": 2, 23 | "hex": "0002", 24 | "transf": "transf1" 25 | }, 26 | { 27 | "offset": 4, 28 | "size": 2, 29 | "hex": "0000", 30 | "transf": "transf2" 31 | } 32 | ], 33 | "transf": [ 34 | { 35 | "type": "bi", 36 | "bytes": 4, 37 | "offset": 0, 38 | "dplaces": 0, 39 | "name": "CLIENT-ID", 40 | "part-key": false, 41 | "sort-key": false 42 | }, 43 | { 44 | "type": "bi", 45 | "bytes": 2, 46 | "offset": 4, 47 | "dplaces": 0, 48 | "name": "CLIENT-TYPE", 49 | "part-key": false, 50 | "sort-key": false 51 | }, 52 | { 53 | "type": "ch", 54 | "bytes": 30, 55 | "offset": 6, 56 | "dplaces": 0, 57 | "name": "CLIENT-NAME", 58 | "part-key": false, 59 | "sort-key": false 60 | }, 61 | { 62 | "type": "ch", 63 | "bytes": 10, 64 | "offset": 36, 65 | "dplaces": 0, 66 | "name": "CLIENT-BDATE", 67 | "part-key": false, 68 | "sort-key": false 69 | }, 70 | { 71 | "type": "ch", 72 | "bytes": 10, 73 | "offset": 46, 74 | "dplaces": 0, 75 | "name": "CLIENT-ED-LVL", 76 | "part-key": false, 77 | "sort-key": false 78 | }, 79 | { 80 | "type": "pd", 81 | "bytes": 5, 82 | "offset": 56, 83 | "dplaces": 2, 84 | "name": "CLIENT-INCOME", 85 | "part-key": false, 86 | "sort-key": false 87 | }, 88 | { 89 | "type": "ch", 90 | "bytes": 439, 91 | "offset": 61, 92 | "dplaces": 0, 93 | "name": "FILLER_1", 94 | "part-key": false, 95 | "sort-key": false 96 | } 97 | ], 98 | "transf1": [ 99 | { 100 | "type": "bi", 101 | "bytes": 4, 102 | "offset": 0, 103 | "dplaces": 0, 104 | "name": "CLIENT-ID", 105 | "part-key": false, 106 | "sort-key": false 107 | }, 108 | { 109 | "type": "bi", 110 | "bytes": 2, 111 | "offset": 4, 112 | "dplaces": 0, 113 | "name": "CLIENT-TYPE", 114 | "part-key": false, 115 | "sort-key": false 116 | }, 117 | { 118 | "type": "bi", 119 | "bytes": 4, 120 | "offset": 6, 121 | "dplaces": 0, 122 | "name": "CLIENT-ADDR-NUMBER", 123 | "part-key": false, 124 | "sort-key": false 125 | }, 126 | { 127 | "type": "ch", 128 | "bytes": 40, 129 | "offset": 10, 130 | "dplaces": 0, 131 | 
"name": "CLIENT-ADDR-STREET", 132 | "part-key": false, 133 | "sort-key": false 134 | }, 135 | { 136 | "type": "ch", 137 | "bytes": 450, 138 | "offset": 50, 139 | "dplaces": 0, 140 | "name": "FILLER_2", 141 | "part-key": false, 142 | "sort-key": false 143 | } 144 | ], 145 | "transf2": [ 146 | { 147 | "type": "bi", 148 | "bytes": 4, 149 | "offset": 0, 150 | "dplaces": 0, 151 | "name": "CLIENT-ID", 152 | "part-key": false, 153 | "sort-key": false 154 | }, 155 | { 156 | "type": "bi", 157 | "bytes": 2, 158 | "offset": 4, 159 | "dplaces": 0, 160 | "name": "CLIENT-TYPE", 161 | "part-key": false, 162 | "sort-key": false 163 | }, 164 | { 165 | "type": "bi", 166 | "bytes": 4, 167 | "offset": 6, 168 | "dplaces": 0, 169 | "name": "CLIENT-RECORD-COUNT", 170 | "part-key": false, 171 | "sort-key": false 172 | }, 173 | { 174 | "type": "ch", 175 | "bytes": 490, 176 | "offset": 10, 177 | "dplaces": 0, 178 | "name": "FILLER_3", 179 | "part-key": false, 180 | "sort-key": false 181 | } 182 | ] 183 | } -------------------------------------------------------------------------------- /sample-data/COBKS05-list.json: -------------------------------------------------------------------------------- 1 | { 2 | "input": "sample-data/CLIENT.EBCDIC.txt", 3 | "output": "sample-data/CLIENT.ASCII.txt", 4 | "partkname": "", 5 | "sortkname": "", 6 | "output-type": "file", 7 | "recfm": "fb", 8 | "req-size": 10, 9 | "print": 20, 10 | "max": 0, 11 | "skip": 0, 12 | "lrecl": 500, 13 | "partklen": 0, 14 | "sortklen": 0, 15 | "rem-low-values": true, 16 | "separator": "|", 17 | "transf-rule": [], 18 | "transf": [ 19 | { 20 | "type": "bi", 21 | "bytes": 4, 22 | "offset": 0, 23 | "dplaces": 0, 24 | "name": "CLIENT-ID", 25 | "part-key": false, 26 | "sort-key": false 27 | }, 28 | { 29 | "type": "bi", 30 | "bytes": 2, 31 | "offset": 4, 32 | "dplaces": 0, 33 | "name": "CLIENT-TYPE", 34 | "part-key": false, 35 | "sort-key": false 36 | }, 37 | { 38 | "type": "ch", 39 | "bytes": 30, 40 | "offset": 6, 41 | "dplaces": 0, 42 | "name": "CLIENT-NAME", 43 | "part-key": false, 44 | "sort-key": false 45 | }, 46 | { 47 | "type": "ch", 48 | "bytes": 10, 49 | "offset": 36, 50 | "dplaces": 0, 51 | "name": "CLIENT-BDATE", 52 | "part-key": false, 53 | "sort-key": false 54 | }, 55 | { 56 | "type": "ch", 57 | "bytes": 10, 58 | "offset": 46, 59 | "dplaces": 0, 60 | "name": "CLIENT-ED-LVL", 61 | "part-key": false, 62 | "sort-key": false 63 | }, 64 | { 65 | "type": "pd", 66 | "bytes": 5, 67 | "offset": 56, 68 | "dplaces": 2, 69 | "name": "CLIENT-INCOME", 70 | "part-key": false, 71 | "sort-key": false 72 | }, 73 | { 74 | "type": "ch", 75 | "bytes": 439, 76 | "offset": 61, 77 | "dplaces": 0, 78 | "name": "FILLER_1", 79 | "part-key": false, 80 | "sort-key": false 81 | } 82 | ], 83 | "transf1": [ 84 | { 85 | "type": "bi", 86 | "bytes": 4, 87 | "offset": 0, 88 | "dplaces": 0, 89 | "name": "CLIENT-ID", 90 | "part-key": false, 91 | "sort-key": false 92 | }, 93 | { 94 | "type": "bi", 95 | "bytes": 2, 96 | "offset": 4, 97 | "dplaces": 0, 98 | "name": "CLIENT-TYPE", 99 | "part-key": false, 100 | "sort-key": false 101 | }, 102 | { 103 | "type": "bi", 104 | "bytes": 4, 105 | "offset": 6, 106 | "dplaces": 0, 107 | "name": "CLIENT-ADDR-NUMBER", 108 | "part-key": false, 109 | "sort-key": false 110 | }, 111 | { 112 | "type": "ch", 113 | "bytes": 40, 114 | "offset": 10, 115 | "dplaces": 0, 116 | "name": "CLIENT-ADDR-STREET", 117 | "part-key": false, 118 | "sort-key": false 119 | }, 120 | { 121 | "type": "ch", 122 | "bytes": 450, 123 | "offset": 50, 124 | "dplaces": 0, 125 | 
"name": "FILLER_2", 126 | "part-key": false, 127 | "sort-key": false 128 | } 129 | ], 130 | "transf2": [ 131 | { 132 | "type": "bi", 133 | "bytes": 4, 134 | "offset": 0, 135 | "dplaces": 0, 136 | "name": "CLIENT-ID", 137 | "part-key": false, 138 | "sort-key": false 139 | }, 140 | { 141 | "type": "bi", 142 | "bytes": 2, 143 | "offset": 4, 144 | "dplaces": 0, 145 | "name": "CLIENT-TYPE", 146 | "part-key": false, 147 | "sort-key": false 148 | }, 149 | { 150 | "type": "bi", 151 | "bytes": 4, 152 | "offset": 6, 153 | "dplaces": 0, 154 | "name": "CLIENT-RECORD-COUNT", 155 | "part-key": false, 156 | "sort-key": false 157 | }, 158 | { 159 | "type": "ch", 160 | "bytes": 490, 161 | "offset": 10, 162 | "dplaces": 0, 163 | "name": "FILLER_3", 164 | "part-key": false, 165 | "sort-key": false 166 | } 167 | ] 168 | } -------------------------------------------------------------------------------- /sample-data/COBKS05-rules.json: -------------------------------------------------------------------------------- 1 | { 2 | "input": "sample-data/CLIENT.EBCDIC.txt", 3 | "output": "sample-data/CLIENT.ASCII.txt", 4 | "partkname": "", 5 | "sortkname": "", 6 | "output-type": "file", 7 | "recfm": "fb", 8 | "req-size": 10, 9 | "print": 20, 10 | "max": 0, 11 | "skip": 0, 12 | "lrecl": 500, 13 | "partklen": 0, 14 | "sortklen": 0, 15 | "rem-low-values": true, 16 | "separator": "|", 17 | "transf-rule": [ 18 | { 19 | "offset": 4, 20 | "size": 2, 21 | "hex": "0002", 22 | "transf": "transf1" 23 | }, 24 | { 25 | "offset": 4, 26 | "size": 2, 27 | "hex": "0000", 28 | "transf": "transf2" 29 | } 30 | ], 31 | "transf": [ 32 | { 33 | "type": "bi", 34 | "bytes": 4, 35 | "offset": 0, 36 | "dplaces": 0, 37 | "name": "CLIENT-ID", 38 | "part-key": false, 39 | "sort-key": false 40 | }, 41 | { 42 | "type": "bi", 43 | "bytes": 2, 44 | "offset": 4, 45 | "dplaces": 0, 46 | "name": "CLIENT-TYPE", 47 | "part-key": false, 48 | "sort-key": false 49 | }, 50 | { 51 | "type": "ch", 52 | "bytes": 30, 53 | "offset": 6, 54 | "dplaces": 0, 55 | "name": "CLIENT-NAME", 56 | "part-key": false, 57 | "sort-key": false 58 | }, 59 | { 60 | "type": "ch", 61 | "bytes": 10, 62 | "offset": 36, 63 | "dplaces": 0, 64 | "name": "CLIENT-BDATE", 65 | "part-key": false, 66 | "sort-key": false 67 | }, 68 | { 69 | "type": "ch", 70 | "bytes": 10, 71 | "offset": 46, 72 | "dplaces": 0, 73 | "name": "CLIENT-ED-LVL", 74 | "part-key": false, 75 | "sort-key": false 76 | }, 77 | { 78 | "type": "pd", 79 | "bytes": 5, 80 | "offset": 56, 81 | "dplaces": 2, 82 | "name": "CLIENT-INCOME", 83 | "part-key": false, 84 | "sort-key": false 85 | }, 86 | { 87 | "type": "ch", 88 | "bytes": 439, 89 | "offset": 61, 90 | "dplaces": 0, 91 | "name": "FILLER_1", 92 | "part-key": false, 93 | "sort-key": false 94 | } 95 | ], 96 | "transf1": [ 97 | { 98 | "type": "bi", 99 | "bytes": 4, 100 | "offset": 0, 101 | "dplaces": 0, 102 | "name": "CLIENT-ID", 103 | "part-key": false, 104 | "sort-key": false 105 | }, 106 | { 107 | "type": "bi", 108 | "bytes": 2, 109 | "offset": 4, 110 | "dplaces": 0, 111 | "name": "CLIENT-TYPE", 112 | "part-key": false, 113 | "sort-key": false 114 | }, 115 | { 116 | "type": "bi", 117 | "bytes": 4, 118 | "offset": 6, 119 | "dplaces": 0, 120 | "name": "CLIENT-ADDR-NUMBER", 121 | "part-key": false, 122 | "sort-key": false 123 | }, 124 | { 125 | "type": "ch", 126 | "bytes": 40, 127 | "offset": 10, 128 | "dplaces": 0, 129 | "name": "CLIENT-ADDR-STREET", 130 | "part-key": false, 131 | "sort-key": false 132 | }, 133 | { 134 | "type": "ch", 135 | "bytes": 450, 136 | "offset": 50, 137 
| "dplaces": 0, 138 | "name": "FILLER_2", 139 | "part-key": false, 140 | "sort-key": false 141 | } 142 | ], 143 | "transf2": [ 144 | { 145 | "type": "bi", 146 | "bytes": 4, 147 | "offset": 0, 148 | "dplaces": 0, 149 | "name": "CLIENT-ID", 150 | "part-key": false, 151 | "sort-key": false 152 | }, 153 | { 154 | "type": "bi", 155 | "bytes": 2, 156 | "offset": 4, 157 | "dplaces": 0, 158 | "name": "CLIENT-TYPE", 159 | "part-key": false, 160 | "sort-key": false 161 | }, 162 | { 163 | "type": "bi", 164 | "bytes": 4, 165 | "offset": 6, 166 | "dplaces": 0, 167 | "name": "CLIENT-RECORD-COUNT", 168 | "part-key": false, 169 | "sort-key": false 170 | }, 171 | { 172 | "type": "ch", 173 | "bytes": 490, 174 | "offset": 10, 175 | "dplaces": 0, 176 | "name": "FILLER_3", 177 | "part-key": false, 178 | "sort-key": false 179 | } 180 | ] 181 | } -------------------------------------------------------------------------------- /sample-data/COBKS05-split-local-err.json: -------------------------------------------------------------------------------- 1 | { 2 | "input-file": "sample-data/CLIENT.EBCDIC.txt", 3 | "input-bucket": "", 4 | "output-type": "file", 5 | "print": 20, 6 | "max": 0, 7 | "skip": 0, 8 | "lrecl": 500, 9 | "recfm": "fb", 10 | "split-rules": [ 11 | ] 12 | } -------------------------------------------------------------------------------- /sample-data/COBKS05-split-local.json: -------------------------------------------------------------------------------- 1 | { 2 | "input-file": "sample-data/CLIENT.EBCDIC.txt", 3 | "input-bucket": "", 4 | "output-type": "file", 5 | "print": 20, 6 | "max": 0, 7 | "skip": 0, 8 | "lrecl": 500, 9 | "recfm": "fb", 10 | "split-rules": [ 11 | { 12 | "offset": 4, 13 | "size": 2, 14 | "hex": "0000", 15 | "cond": "eq", 16 | "bucket" : "", 17 | "file": "sample-data/CLIENT.EBCDIC-0.txt" 18 | }, 19 | { 20 | "offset": 4, 21 | "size": 2, 22 | "hex": "0001", 23 | "cond": "eq", 24 | "bucket" : "", 25 | "file": "sample-data/CLIENT.EBCDIC-1.txt" 26 | }, 27 | { 28 | "offset": 4, 29 | "size": 2, 30 | "hex": "0002", 31 | "cond": "eq", 32 | "bucket" : "", 33 | "file": "sample-data/CLIENT.EBCDIC-2.txt" 34 | } 35 | ] 36 | } -------------------------------------------------------------------------------- /sample-data/COBKS05-split.json: -------------------------------------------------------------------------------- 1 | { 2 | "input-file": "CLIENT.EBCDIC.txt", 3 | "input-bucket": "myBucket", 4 | "output-type": "file", 5 | "print": 20, 6 | "max": 0, 7 | "skip": 0, 8 | "lrecl": 500, 9 | "recfm": "fb", 10 | "split-rules": [ 11 | { 12 | "offset": 4, 13 | "size": 2, 14 | "hex": "0000", 15 | "bucket" : "myBucket", 16 | "key" : "CLIENT.EBCDIC-0.txt", 17 | "file": "sample-data/CLIENT.EBCDIC-0.txt" 18 | }, 19 | { 20 | "offset": 4, 21 | "size": 2, 22 | "hex": "0001", 23 | "bucket" : "myBucket", 24 | "key" : "CLIENT.EBCDIC-1.txt", 25 | "file": "sample-data/CLIENT.EBCDIC-1.txt" 26 | }, 27 | { 28 | "offset": 4, 29 | "size": 2, 30 | "hex": "0002", 31 | "bucket" : "myBucket", 32 | "key" : "CLIENT.EBCDIC-2.txt", 33 | "file": "sample-data/CLIENT.EBCDIC-2.txt" 34 | } 35 | ] 36 | } -------------------------------------------------------------------------------- /sample-data/COBPACK-TEST.json: -------------------------------------------------------------------------------- 1 | { 2 | "Records": [ 3 | { 4 | "eventVersion": "2.0", 5 | "eventSource": "aws:s3", 6 | "awsRegion": "us-west-2", 7 | "eventTime": "1970-01-01T00:00:00.000Z", 8 | "eventName": "ObjectCreated:Put", 9 | "userIdentity": { 10 | "principalId": 
"EXAMPLE" 11 | }, 12 | "requestParameters": { 13 | "sourceIPAddress": "127.0.0.1" 14 | }, 15 | "responseElements": { 16 | "x-amz-request-id": "EXAMPLE123456789", 17 | "x-amz-id-2": "EXAMPLE123/5678abcdefghijklambdaisawesome/mnopqrstuvwxyzABCDEFGH" 18 | }, 19 | "s3": { 20 | "s3SchemaVersion": "1.0", 21 | "configurationId": "testConfigRule", 22 | "bucket": { 23 | "name": "your-bucket-name", 24 | "ownerIdentity": { 25 | "principalId": "EXAMPLE" 26 | }, 27 | "arn": "arn:aws:s3:::your-bucket-name" 28 | }, 29 | "object": { 30 | "key": "COBPACK.OUTFILE.txt", 31 | "size": 1024, 32 | "eTag": "0123456789abcdef0123456789abcdef", 33 | "sequencer": "0A1B2C3D4E5F678901" 34 | } 35 | } 36 | } 37 | ] 38 | } -------------------------------------------------------------------------------- /sample-data/COBPACK.OUTFILE.txt: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/mainframe-data-utilities/fc2aa9ed14186219ba969e3b92bdaf6e4f28d570/sample-data/COBPACK.OUTFILE.txt -------------------------------------------------------------------------------- /sample-data/COBPACK3.EBCDIC.txt: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/mainframe-data-utilities/fc2aa9ed14186219ba969e3b92bdaf6e4f28d570/sample-data/COBPACK3.EBCDIC.txt -------------------------------------------------------------------------------- /sample-data/COBVBFM2-split-local.json: -------------------------------------------------------------------------------- 1 | { 2 | "input-file": "sample-data/COBVBFM2.EBCDIC.txt", 3 | "input-bucket": "", 4 | "output-type": "file", 5 | "print": 0, 6 | "max": 0, 7 | "skip": 0, 8 | "lrecl": 0, 9 | "recfm": "vb", 10 | "split-rules": [ 11 | { 12 | "offset": 2, 13 | "size": 2, 14 | "hex": "f1f0", 15 | "cond": "le", 16 | "bucket" : "", 17 | "key" : "", 18 | "file": "sample-data/COBVBFM2.EBCDIC-le10.txt" 19 | }, 20 | { 21 | "offset": 2, 22 | "size": 2, 23 | "hex": "f1f0", 24 | "cond": "gt", 25 | "bucket" : "", 26 | "key" : "", 27 | "file": "sample-data/COBVBFM2.EBCDIC-gt10.txt" 28 | } 29 | ] 30 | } -------------------------------------------------------------------------------- /sample-data/COBVBFM2.ASCII.txt: -------------------------------------------------------------------------------- 1 | 00|01|001|000000001|NAME NUMBE000000001|||||||||||||||||| 2 | 00|02|002|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|||||||||||||||| 3 | 00|03|003|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|||||||||||||| 4 | 00|04|004|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|000000004|NAME NUMBE000000004|||||||||||| 5 | 00|05|005|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|000000004|NAME NUMBE000000004|000000005|NAME NUMBE000000005|||||||||| 6 | 00|06|006|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|000000004|NAME NUMBE000000004|000000005|NAME NUMBE000000005|000000006|NAME NUMBE000000006|||||||| 7 | 00|07|007|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|000000004|NAME NUMBE000000004|000000005|NAME NUMBE000000005|000000006|NAME NUMBE000000006|000000007|NAME NUMBE000000007|||||| 8 | 00|08|008|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|000000004|NAME NUMBE000000004|000000005|NAME 
NUMBE000000005|000000006|NAME NUMBE000000006|000000007|NAME NUMBE000000007|000000008|NAME NUMBE000000008|||| 9 | 00|09|009|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|000000004|NAME NUMBE000000004|000000005|NAME NUMBE000000005|000000006|NAME NUMBE000000006|000000007|NAME NUMBE000000007|000000008|NAME NUMBE000000008|000000009|NAME NUMBE000000009|| 10 | 00|10|010|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|000000004|NAME NUMBE000000004|000000005|NAME NUMBE000000005|000000006|NAME NUMBE000000006|000000007|NAME NUMBE000000007|000000008|NAME NUMBE000000008|000000009|NAME NUMBE000000009|000000010|NAME NUMBE000000010 11 | 00|11|001|000000001|NAME NUMBE000000001|||||||||||||||||| 12 | 00|12|002|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|||||||||||||||| 13 | 00|13|003|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|||||||||||||| 14 | 00|14|004|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|000000004|NAME NUMBE000000004|||||||||||| 15 | 00|15|005|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|000000004|NAME NUMBE000000004|000000005|NAME NUMBE000000005|||||||||| 16 | 00|16|006|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|000000004|NAME NUMBE000000004|000000005|NAME NUMBE000000005|000000006|NAME NUMBE000000006|||||||| 17 | 00|17|007|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|000000004|NAME NUMBE000000004|000000005|NAME NUMBE000000005|000000006|NAME NUMBE000000006|000000007|NAME NUMBE000000007|||||| 18 | 00|18|008|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|000000004|NAME NUMBE000000004|000000005|NAME NUMBE000000005|000000006|NAME NUMBE000000006|000000007|NAME NUMBE000000007|000000008|NAME NUMBE000000008|||| 19 | 00|19|009|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|000000004|NAME NUMBE000000004|000000005|NAME NUMBE000000005|000000006|NAME NUMBE000000006|000000007|NAME NUMBE000000007|000000008|NAME NUMBE000000008|000000009|NAME NUMBE000000009|| 20 | 00|20|010|000000001|NAME NUMBE000000001|000000002|NAME NUMBE000000002|000000003|NAME NUMBE000000003|000000004|NAME NUMBE000000004|000000005|NAME NUMBE000000005|000000006|NAME NUMBE000000006|000000007|NAME NUMBE000000007|000000008|NAME NUMBE000000008|000000009|NAME NUMBE000000009|000000010|NAME NUMBE000000010 -------------------------------------------------------------------------------- /sample-data/COBVBFM2.EBCDIC-gt10.txt: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/mainframe-data-utilities/fc2aa9ed14186219ba969e3b92bdaf6e4f28d570/sample-data/COBVBFM2.EBCDIC-gt10.txt -------------------------------------------------------------------------------- /sample-data/COBVBFM2.EBCDIC-le10.txt: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/mainframe-data-utilities/fc2aa9ed14186219ba969e3b92bdaf6e4f28d570/sample-data/COBVBFM2.EBCDIC-le10.txt -------------------------------------------------------------------------------- /sample-data/COBVBFM2.EBCDIC.txt: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/aws-samples/mainframe-data-utilities/fc2aa9ed14186219ba969e3b92bdaf6e4f28d570/sample-data/COBVBFM2.EBCDIC.txt -------------------------------------------------------------------------------- /sample-json/COBKS05-ddb-rules.json: -------------------------------------------------------------------------------- 1 | { 2 | "function": "parse", 3 | "copybook": "LegacyReference/COBKS05.cpy", 4 | "json": "sample-json/COBKS05-ddb.json", 5 | "json_debug": "", 6 | "input": "sample-data/CLIENT.EBCDIC.txt", 7 | "input_s3": "", 8 | "input_recfm": "fb", 9 | "input_recl": 500, 10 | "output_type": "ddb", 11 | "output": "CLIENT", 12 | "output_s3": "", 13 | "output_separator": "|", 14 | "part_k_len": 4, 15 | "part_k_name": "CLIENT_ID", 16 | "sort_k_len": 2, 17 | "sort_k_name": "CLIENT_R_TYPE", 18 | "req_size": 25, 19 | "print": 20, 20 | "skip": 0, 21 | "max": 0, 22 | "rem_low_values": true, 23 | "rem_spaces": false, 24 | "working_folder": "", 25 | "verbose": false, 26 | "threads": 1, 27 | "transf_rule": [ 28 | { 29 | "offset": 4, 30 | "size": 2, 31 | "hex": "0002", 32 | "transf": "transf1" 33 | }, 34 | { 35 | "offset": 4, 36 | "size": 2, 37 | "hex": "0000", 38 | "transf": "transf2" 39 | } 40 | ], 41 | "transf": [ 42 | { 43 | "type": "bi", 44 | "bytes": 4, 45 | "offset": 0, 46 | "dplaces": 0, 47 | "name": "CLIENT-ID", 48 | "part-key": true, 49 | "sort-key": false 50 | }, 51 | { 52 | "type": "bi", 53 | "bytes": 2, 54 | "offset": 4, 55 | "dplaces": 0, 56 | "name": "CLIENT-TYPE", 57 | "part-key": false, 58 | "sort-key": true 59 | }, 60 | { 61 | "type": "ch", 62 | "bytes": 30, 63 | "offset": 6, 64 | "dplaces": 0, 65 | "name": "CLIENT-NAME", 66 | "part-key": false, 67 | "sort-key": false 68 | }, 69 | { 70 | "type": "ch", 71 | "bytes": 10, 72 | "offset": 36, 73 | "dplaces": 0, 74 | "name": "CLIENT-BDATE", 75 | "part-key": false, 76 | "sort-key": false 77 | }, 78 | { 79 | "type": "ch", 80 | "bytes": 10, 81 | "offset": 46, 82 | "dplaces": 0, 83 | "name": "CLIENT-ED-LVL", 84 | "part-key": false, 85 | "sort-key": false 86 | }, 87 | { 88 | "type": "pd", 89 | "bytes": 5, 90 | "offset": 56, 91 | "dplaces": 2, 92 | "name": "CLIENT-INCOME", 93 | "part-key": false, 94 | "sort-key": false 95 | }, 96 | { 97 | "type": "ch", 98 | "bytes": 439, 99 | "offset": 61, 100 | "dplaces": 0, 101 | "name": "FILLER_1", 102 | "part-key": false, 103 | "sort-key": false 104 | } 105 | ], 106 | "transf1": [ 107 | { 108 | "type": "bi", 109 | "bytes": 4, 110 | "offset": 0, 111 | "dplaces": 0, 112 | "name": "CLIENT-ID", 113 | "part-key": true, 114 | "sort-key": false 115 | }, 116 | { 117 | "type": "bi", 118 | "bytes": 2, 119 | "offset": 4, 120 | "dplaces": 0, 121 | "name": "CLIENT-TYPE", 122 | "part-key": false, 123 | "sort-key": true 124 | }, 125 | { 126 | "type": "bi", 127 | "bytes": 4, 128 | "offset": 6, 129 | "dplaces": 0, 130 | "name": "CLIENT-ADDR-NUMBER", 131 | "part-key": false, 132 | "sort-key": false 133 | }, 134 | { 135 | "type": "ch", 136 | "bytes": 40, 137 | "offset": 10, 138 | "dplaces": 0, 139 | "name": "CLIENT-ADDR-STREET", 140 | "part-key": false, 141 | "sort-key": false 142 | }, 143 | { 144 | "type": "ch", 145 | "bytes": 450, 146 | "offset": 50, 147 | "dplaces": 0, 148 | "name": "FILLER_2", 149 | "part-key": false, 150 | "sort-key": false 151 | } 152 | ], 153 | "transf2": [ 154 | { 155 | "type": "bi", 156 | "bytes": 4, 157 | "offset": 0, 158 | "dplaces": 0, 159 | "name": "CLIENT-ID", 160 | "part-key": true, 161 | "sort-key": false 162 | }, 163 | { 164 | "type": "bi", 165 | "bytes": 2, 166 | 
"offset": 4, 167 | "dplaces": 0, 168 | "name": "CLIENT-TYPE", 169 | "part-key": false, 170 | "sort-key": true 171 | }, 172 | { 173 | "type": "bi", 174 | "bytes": 4, 175 | "offset": 6, 176 | "dplaces": 0, 177 | "name": "CLIENT-RECORD-COUNT", 178 | "part-key": false, 179 | "sort-key": false 180 | }, 181 | { 182 | "type": "ch", 183 | "bytes": 490, 184 | "offset": 10, 185 | "dplaces": 0, 186 | "name": "FILLER_3", 187 | "part-key": false, 188 | "sort-key": false 189 | } 190 | ] 191 | } -------------------------------------------------------------------------------- /sample-json/COBKS05-ddb.json: -------------------------------------------------------------------------------- 1 | { 2 | "function": "parse", 3 | "copybook": "LegacyReference/COBKS05.cpy", 4 | "json": "sample-json/COBKS05-ddb.json", 5 | "json_debug": "", 6 | "input": "sample-data/CLIENT.EBCDIC.txt", 7 | "input_s3": "", 8 | "input_recfm": "fb", 9 | "input_recl": 500, 10 | "output_type": "ddb", 11 | "output": "CLIENT", 12 | "output_s3": "", 13 | "output_separator": "|", 14 | "part_k_len": 4, 15 | "part_k_name": "CLIENT_ID", 16 | "sort_k_len": 2, 17 | "sort_k_name": "CLIENT_R_TYPE", 18 | "req_size": 25, 19 | "print": 20, 20 | "skip": 0, 21 | "max": 0, 22 | "rem_low_values": true, 23 | "rem_spaces": false, 24 | "working_folder": "", 25 | "verbose": false, 26 | "threads": 1, 27 | "transf_rule": [ 28 | { 29 | "offset": 4, 30 | "size": 2, 31 | "hex": "0002", 32 | "transf": "transf1" 33 | }, 34 | { 35 | "offset": 4, 36 | "size": 2, 37 | "hex": "0000", 38 | "transf": "transf2" 39 | } 40 | ], 41 | "transf": [ 42 | { 43 | "type": "bi", 44 | "bytes": 4, 45 | "offset": 0, 46 | "dplaces": 0, 47 | "name": "CLIENT-ID", 48 | "part-key": true, 49 | "sort-key": false 50 | }, 51 | { 52 | "type": "bi", 53 | "bytes": 2, 54 | "offset": 4, 55 | "dplaces": 0, 56 | "name": "CLIENT-TYPE", 57 | "part-key": false, 58 | "sort-key": true 59 | }, 60 | { 61 | "type": "ch", 62 | "bytes": 30, 63 | "offset": 6, 64 | "dplaces": 0, 65 | "name": "CLIENT-NAME", 66 | "part-key": false, 67 | "sort-key": false 68 | }, 69 | { 70 | "type": "ch", 71 | "bytes": 10, 72 | "offset": 36, 73 | "dplaces": 0, 74 | "name": "CLIENT-BDATE", 75 | "part-key": false, 76 | "sort-key": false 77 | }, 78 | { 79 | "type": "ch", 80 | "bytes": 10, 81 | "offset": 46, 82 | "dplaces": 0, 83 | "name": "CLIENT-ED-LVL", 84 | "part-key": false, 85 | "sort-key": false 86 | }, 87 | { 88 | "type": "pd", 89 | "bytes": 5, 90 | "offset": 56, 91 | "dplaces": 2, 92 | "name": "CLIENT-INCOME", 93 | "part-key": false, 94 | "sort-key": false 95 | }, 96 | { 97 | "type": "ch", 98 | "bytes": 439, 99 | "offset": 61, 100 | "dplaces": 0, 101 | "name": "FILLER_1", 102 | "part-key": false, 103 | "sort-key": false 104 | } 105 | ], 106 | "transf1": [ 107 | { 108 | "type": "bi", 109 | "bytes": 4, 110 | "offset": 0, 111 | "dplaces": 0, 112 | "name": "CLIENT-ID", 113 | "part-key": true, 114 | "sort-key": false 115 | }, 116 | { 117 | "type": "bi", 118 | "bytes": 2, 119 | "offset": 4, 120 | "dplaces": 0, 121 | "name": "CLIENT-TYPE", 122 | "part-key": false, 123 | "sort-key": true 124 | }, 125 | { 126 | "type": "bi", 127 | "bytes": 4, 128 | "offset": 6, 129 | "dplaces": 0, 130 | "name": "CLIENT-ADDR-NUMBER", 131 | "part-key": false, 132 | "sort-key": false 133 | }, 134 | { 135 | "type": "ch", 136 | "bytes": 40, 137 | "offset": 10, 138 | "dplaces": 0, 139 | "name": "CLIENT-ADDR-STREET", 140 | "part-key": false, 141 | "sort-key": false 142 | }, 143 | { 144 | "type": "ch", 145 | "bytes": 450, 146 | "offset": 50, 147 | "dplaces": 0, 148 
| "name": "FILLER_2", 149 | "part-key": false, 150 | "sort-key": false 151 | } 152 | ], 153 | "transf2": [ 154 | { 155 | "type": "bi", 156 | "bytes": 4, 157 | "offset": 0, 158 | "dplaces": 0, 159 | "name": "CLIENT-ID", 160 | "part-key": true, 161 | "sort-key": false 162 | }, 163 | { 164 | "type": "bi", 165 | "bytes": 2, 166 | "offset": 4, 167 | "dplaces": 0, 168 | "name": "CLIENT-TYPE", 169 | "part-key": false, 170 | "sort-key": true 171 | }, 172 | { 173 | "type": "bi", 174 | "bytes": 4, 175 | "offset": 6, 176 | "dplaces": 0, 177 | "name": "CLIENT-RECORD-COUNT", 178 | "part-key": false, 179 | "sort-key": false 180 | }, 181 | { 182 | "type": "ch", 183 | "bytes": 490, 184 | "offset": 10, 185 | "dplaces": 0, 186 | "name": "FILLER_3", 187 | "part-key": false, 188 | "sort-key": false 189 | } 190 | ] 191 | } -------------------------------------------------------------------------------- /sample-json/COBKS05-list-rules.json: -------------------------------------------------------------------------------- 1 | { 2 | "function": "parse", 3 | "copybook": "LegacyReference/COBKS05.cpy", 4 | "json": "sample-json/COBKS05-list.json", 5 | "json_debug": "", 6 | "input": "sample-data/CLIENT.EBCDIC.txt", 7 | "input_s3": "", 8 | "input_s3_url": "", 9 | "input_recfm": "fb", 10 | "input_recl": 500, 11 | "output_type": "file", 12 | "output": "sample-data/CLIENT.ASCII.txt", 13 | "output_s3": "", 14 | "threads": 1, 15 | "output_separator": "|", 16 | "part_k_len": 0, 17 | "part_k_name": "", 18 | "sort_k_len": 0, 19 | "sort_k_name": "", 20 | "req_size": 0, 21 | "print": 20, 22 | "skip": 0, 23 | "max": 0, 24 | "rem_low_values": true, 25 | "rem_spaces": false, 26 | "working_folder": "", 27 | "verbose": true, 28 | "transf_rule": [ 29 | { 30 | "offset": 4, 31 | "size": 2, 32 | "hex": "0002", 33 | "transf": "transf1" 34 | }, 35 | { 36 | "offset": 4, 37 | "size": 2, 38 | "hex": "0000", 39 | "transf": "transf2" 40 | } 41 | ], 42 | "transf": [ 43 | { 44 | "type": "bi", 45 | "bytes": 4, 46 | "offset": 0, 47 | "dplaces": 0, 48 | "name": "CLIENT-ID", 49 | "part-key": false, 50 | "sort-key": false 51 | }, 52 | { 53 | "type": "bi", 54 | "bytes": 2, 55 | "offset": 4, 56 | "dplaces": 0, 57 | "name": "CLIENT-TYPE", 58 | "part-key": false, 59 | "sort-key": false 60 | }, 61 | { 62 | "type": "ch", 63 | "bytes": 30, 64 | "offset": 6, 65 | "dplaces": 0, 66 | "name": "CLIENT-NAME", 67 | "part-key": false, 68 | "sort-key": false 69 | }, 70 | { 71 | "type": "ch", 72 | "bytes": 10, 73 | "offset": 36, 74 | "dplaces": 0, 75 | "name": "CLIENT-BDATE", 76 | "part-key": false, 77 | "sort-key": false 78 | }, 79 | { 80 | "type": "ch", 81 | "bytes": 10, 82 | "offset": 46, 83 | "dplaces": 0, 84 | "name": "CLIENT-ED-LVL", 85 | "part-key": false, 86 | "sort-key": false 87 | }, 88 | { 89 | "type": "pd", 90 | "bytes": 5, 91 | "offset": 56, 92 | "dplaces": 2, 93 | "name": "CLIENT-INCOME", 94 | "part-key": false, 95 | "sort-key": false 96 | }, 97 | { 98 | "type": "ch", 99 | "bytes": 439, 100 | "offset": 61, 101 | "dplaces": 0, 102 | "name": "FILLER_1", 103 | "part-key": false, 104 | "sort-key": false 105 | } 106 | ], 107 | "transf1": [ 108 | { 109 | "type": "bi", 110 | "bytes": 4, 111 | "offset": 0, 112 | "dplaces": 0, 113 | "name": "CLIENT-ID", 114 | "part-key": false, 115 | "sort-key": false 116 | }, 117 | { 118 | "type": "bi", 119 | "bytes": 2, 120 | "offset": 4, 121 | "dplaces": 0, 122 | "name": "CLIENT-TYPE", 123 | "part-key": false, 124 | "sort-key": false 125 | }, 126 | { 127 | "type": "bi", 128 | "bytes": 4, 129 | "offset": 6, 130 | "dplaces": 0, 
131 | "name": "CLIENT-ADDR-NUMBER", 132 | "part-key": false, 133 | "sort-key": false 134 | }, 135 | { 136 | "type": "ch", 137 | "bytes": 40, 138 | "offset": 10, 139 | "dplaces": 0, 140 | "name": "CLIENT-ADDR-STREET", 141 | "part-key": false, 142 | "sort-key": false 143 | }, 144 | { 145 | "type": "ch", 146 | "bytes": 450, 147 | "offset": 50, 148 | "dplaces": 0, 149 | "name": "FILLER_2", 150 | "part-key": false, 151 | "sort-key": false 152 | } 153 | ], 154 | "transf2": [ 155 | { 156 | "type": "bi", 157 | "bytes": 4, 158 | "offset": 0, 159 | "dplaces": 0, 160 | "name": "CLIENT-ID", 161 | "part-key": false, 162 | "sort-key": false 163 | }, 164 | { 165 | "type": "bi", 166 | "bytes": 2, 167 | "offset": 4, 168 | "dplaces": 0, 169 | "name": "CLIENT-TYPE", 170 | "part-key": false, 171 | "sort-key": false 172 | }, 173 | { 174 | "type": "bi", 175 | "bytes": 4, 176 | "offset": 6, 177 | "dplaces": 0, 178 | "name": "CLIENT-RECORD-COUNT", 179 | "part-key": false, 180 | "sort-key": false 181 | }, 182 | { 183 | "type": "ch", 184 | "bytes": 490, 185 | "offset": 10, 186 | "dplaces": 0, 187 | "name": "FILLER_3", 188 | "part-key": false, 189 | "sort-key": false 190 | } 191 | ] 192 | } -------------------------------------------------------------------------------- /sample-json/COBKS05-list-s3-out-rules.json: -------------------------------------------------------------------------------- 1 | { 2 | "function": "parse", 3 | "copybook": "LegacyReference/COBKS05.cpy", 4 | "json": "sample-json/COBKS05-list-s3-out.json", 5 | "json_debug": "", 6 | "input": "sample-data/CLIENT.EBCDIC.txt", 7 | "input_s3": "", 8 | "input_s3_url": "", 9 | "input_recfm": "fb", 10 | "input_recl": 500, 11 | "output_type": "file", 12 | "output": "sample-data/CLIENT.ASCII.txt", 13 | "output_s3": "your-bucket-name", 14 | "output_separator": "|", 15 | "part_k_len": 0, 16 | "part_k_name": "", 17 | "sort_k_len": 0, 18 | "sort_k_name": "", 19 | "req_size": 0, 20 | "print": 20, 21 | "skip": 0, 22 | "max": 0, 23 | "rem_low_values": true, 24 | "rem_spaces": false, 25 | "working_folder": "", 26 | "verbose": true, 27 | "threads": 1, 28 | "transf_rule": [ 29 | { 30 | "offset": 4, 31 | "size": 2, 32 | "hex": "0002", 33 | "transf": "transf1" 34 | }, 35 | { 36 | "offset": 4, 37 | "size": 2, 38 | "hex": "0000", 39 | "transf": "transf2" 40 | } 41 | ], 42 | "transf": [ 43 | { 44 | "type": "bi", 45 | "bytes": 4, 46 | "offset": 0, 47 | "dplaces": 0, 48 | "name": "CLIENT-ID", 49 | "part-key": false, 50 | "sort-key": false 51 | }, 52 | { 53 | "type": "bi", 54 | "bytes": 2, 55 | "offset": 4, 56 | "dplaces": 0, 57 | "name": "CLIENT-TYPE", 58 | "part-key": false, 59 | "sort-key": false 60 | }, 61 | { 62 | "type": "ch", 63 | "bytes": 30, 64 | "offset": 6, 65 | "dplaces": 0, 66 | "name": "CLIENT-NAME", 67 | "part-key": false, 68 | "sort-key": false 69 | }, 70 | { 71 | "type": "ch", 72 | "bytes": 10, 73 | "offset": 36, 74 | "dplaces": 0, 75 | "name": "CLIENT-BDATE", 76 | "part-key": false, 77 | "sort-key": false 78 | }, 79 | { 80 | "type": "ch", 81 | "bytes": 10, 82 | "offset": 46, 83 | "dplaces": 0, 84 | "name": "CLIENT-ED-LVL", 85 | "part-key": false, 86 | "sort-key": false 87 | }, 88 | { 89 | "type": "pd", 90 | "bytes": 5, 91 | "offset": 56, 92 | "dplaces": 2, 93 | "name": "CLIENT-INCOME", 94 | "part-key": false, 95 | "sort-key": false 96 | }, 97 | { 98 | "type": "ch", 99 | "bytes": 439, 100 | "offset": 61, 101 | "dplaces": 0, 102 | "name": "FILLER_1", 103 | "part-key": false, 104 | "sort-key": false 105 | } 106 | ], 107 | "transf1": [ 108 | { 109 | "type": "bi", 
110 | "bytes": 4, 111 | "offset": 0, 112 | "dplaces": 0, 113 | "name": "CLIENT-ID", 114 | "part-key": false, 115 | "sort-key": false 116 | }, 117 | { 118 | "type": "bi", 119 | "bytes": 2, 120 | "offset": 4, 121 | "dplaces": 0, 122 | "name": "CLIENT-TYPE", 123 | "part-key": false, 124 | "sort-key": false 125 | }, 126 | { 127 | "type": "bi", 128 | "bytes": 4, 129 | "offset": 6, 130 | "dplaces": 0, 131 | "name": "CLIENT-ADDR-NUMBER", 132 | "part-key": false, 133 | "sort-key": false 134 | }, 135 | { 136 | "type": "ch", 137 | "bytes": 40, 138 | "offset": 10, 139 | "dplaces": 0, 140 | "name": "CLIENT-ADDR-STREET", 141 | "part-key": false, 142 | "sort-key": false 143 | }, 144 | { 145 | "type": "ch", 146 | "bytes": 450, 147 | "offset": 50, 148 | "dplaces": 0, 149 | "name": "FILLER_2", 150 | "part-key": false, 151 | "sort-key": false 152 | } 153 | ], 154 | "transf2": [ 155 | { 156 | "type": "bi", 157 | "bytes": 4, 158 | "offset": 0, 159 | "dplaces": 0, 160 | "name": "CLIENT-ID", 161 | "part-key": false, 162 | "sort-key": false 163 | }, 164 | { 165 | "type": "bi", 166 | "bytes": 2, 167 | "offset": 4, 168 | "dplaces": 0, 169 | "name": "CLIENT-TYPE", 170 | "part-key": false, 171 | "sort-key": false 172 | }, 173 | { 174 | "type": "bi", 175 | "bytes": 4, 176 | "offset": 6, 177 | "dplaces": 0, 178 | "name": "CLIENT-RECORD-COUNT", 179 | "part-key": false, 180 | "sort-key": false 181 | }, 182 | { 183 | "type": "ch", 184 | "bytes": 490, 185 | "offset": 10, 186 | "dplaces": 0, 187 | "name": "FILLER_3", 188 | "part-key": false, 189 | "sort-key": false 190 | } 191 | ] 192 | } -------------------------------------------------------------------------------- /sample-json/COBKS05-list-s3-out.json: -------------------------------------------------------------------------------- 1 | { 2 | "function": "parse", 3 | "copybook": "LegacyReference/COBKS05.cpy", 4 | "json": "sample-json/COBKS05-list-s3-out.json", 5 | "json_debug": "", 6 | "input": "sample-data/CLIENT.EBCDIC.txt", 7 | "input_s3": "", 8 | "input_recfm": "fb", 9 | "input_recl": 500, 10 | "output_type": "file", 11 | "output": "sample-data/CLIENT.ASCII.txt", 12 | "output_s3": "your-bucket-name", 13 | "output_separator": "|", 14 | "part_k_len": 0, 15 | "part_k_name": "", 16 | "sort_k_len": 0, 17 | "sort_k_name": "", 18 | "req_size": 0, 19 | "print": 20, 20 | "skip": 0, 21 | "max": 0, 22 | "rem_low_values": true, 23 | "rem_spaces": false, 24 | "working_folder": "", 25 | "verbose": true, 26 | "threads": 2, 27 | "transf_rule": [ 28 | { 29 | "offset": 4, 30 | "size": 2, 31 | "hex": "0002", 32 | "transf": "transf1" 33 | }, 34 | { 35 | "offset": 4, 36 | "size": 2, 37 | "hex": "0000", 38 | "transf": "transf2" 39 | } 40 | ], 41 | "transf": [ 42 | { 43 | "type": "bi", 44 | "bytes": 4, 45 | "offset": 0, 46 | "dplaces": 0, 47 | "name": "CLIENT-ID", 48 | "part-key": false, 49 | "sort-key": false 50 | }, 51 | { 52 | "type": "bi", 53 | "bytes": 2, 54 | "offset": 4, 55 | "dplaces": 0, 56 | "name": "CLIENT-TYPE", 57 | "part-key": false, 58 | "sort-key": false 59 | }, 60 | { 61 | "type": "ch", 62 | "bytes": 30, 63 | "offset": 6, 64 | "dplaces": 0, 65 | "name": "CLIENT-NAME", 66 | "part-key": false, 67 | "sort-key": false 68 | }, 69 | { 70 | "type": "ch", 71 | "bytes": 10, 72 | "offset": 36, 73 | "dplaces": 0, 74 | "name": "CLIENT-BDATE", 75 | "part-key": false, 76 | "sort-key": false 77 | }, 78 | { 79 | "type": "ch", 80 | "bytes": 10, 81 | "offset": 46, 82 | "dplaces": 0, 83 | "name": "CLIENT-ED-LVL", 84 | "part-key": false, 85 | "sort-key": false 86 | }, 87 | { 88 | "type": 
"pd", 89 | "bytes": 5, 90 | "offset": 56, 91 | "dplaces": 2, 92 | "name": "CLIENT-INCOME", 93 | "part-key": false, 94 | "sort-key": false 95 | }, 96 | { 97 | "type": "ch", 98 | "bytes": 439, 99 | "offset": 61, 100 | "dplaces": 0, 101 | "name": "FILLER_1", 102 | "part-key": false, 103 | "sort-key": false 104 | } 105 | ], 106 | "transf1": [ 107 | { 108 | "type": "bi", 109 | "bytes": 4, 110 | "offset": 0, 111 | "dplaces": 0, 112 | "name": "CLIENT-ID", 113 | "part-key": false, 114 | "sort-key": false 115 | }, 116 | { 117 | "type": "bi", 118 | "bytes": 2, 119 | "offset": 4, 120 | "dplaces": 0, 121 | "name": "CLIENT-TYPE", 122 | "part-key": false, 123 | "sort-key": false 124 | }, 125 | { 126 | "type": "bi", 127 | "bytes": 4, 128 | "offset": 6, 129 | "dplaces": 0, 130 | "name": "CLIENT-ADDR-NUMBER", 131 | "part-key": false, 132 | "sort-key": false 133 | }, 134 | { 135 | "type": "ch", 136 | "bytes": 40, 137 | "offset": 10, 138 | "dplaces": 0, 139 | "name": "CLIENT-ADDR-STREET", 140 | "part-key": false, 141 | "sort-key": false 142 | }, 143 | { 144 | "type": "ch", 145 | "bytes": 450, 146 | "offset": 50, 147 | "dplaces": 0, 148 | "name": "FILLER_2", 149 | "part-key": false, 150 | "sort-key": false 151 | } 152 | ], 153 | "transf2": [ 154 | { 155 | "type": "bi", 156 | "bytes": 4, 157 | "offset": 0, 158 | "dplaces": 0, 159 | "name": "CLIENT-ID", 160 | "part-key": false, 161 | "sort-key": false 162 | }, 163 | { 164 | "type": "bi", 165 | "bytes": 2, 166 | "offset": 4, 167 | "dplaces": 0, 168 | "name": "CLIENT-TYPE", 169 | "part-key": false, 170 | "sort-key": false 171 | }, 172 | { 173 | "type": "bi", 174 | "bytes": 4, 175 | "offset": 6, 176 | "dplaces": 0, 177 | "name": "CLIENT-RECORD-COUNT", 178 | "part-key": false, 179 | "sort-key": false 180 | }, 181 | { 182 | "type": "ch", 183 | "bytes": 490, 184 | "offset": 10, 185 | "dplaces": 0, 186 | "name": "FILLER_3", 187 | "part-key": false, 188 | "sort-key": false 189 | } 190 | ] 191 | } -------------------------------------------------------------------------------- /sample-json/COBKS05-list-s3-rules.json: -------------------------------------------------------------------------------- 1 | { 2 | "function": "parse", 3 | "copybook": "LegacyReference/COBKS05.cpy", 4 | "json": "sample-json/COBKS05-list-s3.json", 5 | "json_debug": "", 6 | "input": "sample-data/CLIENT.EBCDIC.txt", 7 | "input_s3": "your-bucket-name", 8 | "input_s3_url": "", 9 | "input_recfm": "fb", 10 | "input_recl": 500, 11 | "output_type": "file", 12 | "output": "sample-data/CLIENT.ASCII.txt", 13 | "output_s3": "", 14 | "output_separator": "|", 15 | "part_k_len": 0, 16 | "part_k_name": "", 17 | "sort_k_len": 0, 18 | "sort_k_name": "", 19 | "req_size": 0, 20 | "print": 20, 21 | "skip": 0, 22 | "max": 0, 23 | "rem_low_values": true, 24 | "rem_spaces": false, 25 | "working_folder": "", 26 | "verbose": true, 27 | "threads": 1, 28 | "transf_rule": [ 29 | { 30 | "offset": 4, 31 | "size": 2, 32 | "hex": "0002", 33 | "transf": "transf1" 34 | }, 35 | { 36 | "offset": 4, 37 | "size": 2, 38 | "hex": "0000", 39 | "transf": "transf2" 40 | } 41 | ], 42 | "transf": [ 43 | { 44 | "type": "bi", 45 | "bytes": 4, 46 | "offset": 0, 47 | "dplaces": 0, 48 | "name": "CLIENT-ID", 49 | "part-key": false, 50 | "sort-key": false 51 | }, 52 | { 53 | "type": "bi", 54 | "bytes": 2, 55 | "offset": 4, 56 | "dplaces": 0, 57 | "name": "CLIENT-TYPE", 58 | "part-key": false, 59 | "sort-key": false 60 | }, 61 | { 62 | "type": "ch", 63 | "bytes": 30, 64 | "offset": 6, 65 | "dplaces": 0, 66 | "name": "CLIENT-NAME", 67 | "part-key": 
false, 68 | "sort-key": false 69 | }, 70 | { 71 | "type": "ch", 72 | "bytes": 10, 73 | "offset": 36, 74 | "dplaces": 0, 75 | "name": "CLIENT-BDATE", 76 | "part-key": false, 77 | "sort-key": false 78 | }, 79 | { 80 | "type": "ch", 81 | "bytes": 10, 82 | "offset": 46, 83 | "dplaces": 0, 84 | "name": "CLIENT-ED-LVL", 85 | "part-key": false, 86 | "sort-key": false 87 | }, 88 | { 89 | "type": "pd", 90 | "bytes": 5, 91 | "offset": 56, 92 | "dplaces": 2, 93 | "name": "CLIENT-INCOME", 94 | "part-key": false, 95 | "sort-key": false 96 | }, 97 | { 98 | "type": "ch", 99 | "bytes": 439, 100 | "offset": 61, 101 | "dplaces": 0, 102 | "name": "FILLER_1", 103 | "part-key": false, 104 | "sort-key": false 105 | } 106 | ], 107 | "transf1": [ 108 | { 109 | "type": "bi", 110 | "bytes": 4, 111 | "offset": 0, 112 | "dplaces": 0, 113 | "name": "CLIENT-ID", 114 | "part-key": false, 115 | "sort-key": false 116 | }, 117 | { 118 | "type": "bi", 119 | "bytes": 2, 120 | "offset": 4, 121 | "dplaces": 0, 122 | "name": "CLIENT-TYPE", 123 | "part-key": false, 124 | "sort-key": false 125 | }, 126 | { 127 | "type": "bi", 128 | "bytes": 4, 129 | "offset": 6, 130 | "dplaces": 0, 131 | "name": "CLIENT-ADDR-NUMBER", 132 | "part-key": false, 133 | "sort-key": false 134 | }, 135 | { 136 | "type": "ch", 137 | "bytes": 40, 138 | "offset": 10, 139 | "dplaces": 0, 140 | "name": "CLIENT-ADDR-STREET", 141 | "part-key": false, 142 | "sort-key": false 143 | }, 144 | { 145 | "type": "ch", 146 | "bytes": 450, 147 | "offset": 50, 148 | "dplaces": 0, 149 | "name": "FILLER_2", 150 | "part-key": false, 151 | "sort-key": false 152 | } 153 | ], 154 | "transf2": [ 155 | { 156 | "type": "bi", 157 | "bytes": 4, 158 | "offset": 0, 159 | "dplaces": 0, 160 | "name": "CLIENT-ID", 161 | "part-key": false, 162 | "sort-key": false 163 | }, 164 | { 165 | "type": "bi", 166 | "bytes": 2, 167 | "offset": 4, 168 | "dplaces": 0, 169 | "name": "CLIENT-TYPE", 170 | "part-key": false, 171 | "sort-key": false 172 | }, 173 | { 174 | "type": "bi", 175 | "bytes": 4, 176 | "offset": 6, 177 | "dplaces": 0, 178 | "name": "CLIENT-RECORD-COUNT", 179 | "part-key": false, 180 | "sort-key": false 181 | }, 182 | { 183 | "type": "ch", 184 | "bytes": 490, 185 | "offset": 10, 186 | "dplaces": 0, 187 | "name": "FILLER_3", 188 | "part-key": false, 189 | "sort-key": false 190 | } 191 | ] 192 | } -------------------------------------------------------------------------------- /sample-json/COBKS05-list-s3.json: -------------------------------------------------------------------------------- 1 | { 2 | "function": "parse", 3 | "copybook": "LegacyReference/COBKS05.cpy", 4 | "json": "sample-json/COBKS05-list-s3.json", 5 | "json_debug": "", 6 | "input": "sample-data/CLIENT.EBCDIC.txt", 7 | "input_s3": "your-bucket-name", 8 | "input_recfm": "fb", 9 | "input_recl": 500, 10 | "output_type": "file", 11 | "output": "sample-data/CLIENT.ASCII.txt", 12 | "output_s3": "", 13 | "output_separator": "|", 14 | "part_k_len": 0, 15 | "part_k_name": "", 16 | "sort_k_len": 0, 17 | "sort_k_name": "", 18 | "req_size": 0, 19 | "print": 20, 20 | "skip": 0, 21 | "max": 0, 22 | "rem_low_values": true, 23 | "rem_spaces": false, 24 | "working_folder": "", 25 | "verbose": true, 26 | "threads": 1, 27 | "transf_rule": [ 28 | { 29 | "offset": 4, 30 | "size": 2, 31 | "hex": "0002", 32 | "transf": "transf1" 33 | }, 34 | { 35 | "offset": 4, 36 | "size": 2, 37 | "hex": "0000", 38 | "transf": "transf2" 39 | } 40 | ], 41 | "transf": [ 42 | { 43 | "type": "bi", 44 | "bytes": 4, 45 | "offset": 0, 46 | "dplaces": 0, 47 | 
"name": "CLIENT-ID", 48 | "part-key": false, 49 | "sort-key": false 50 | }, 51 | { 52 | "type": "bi", 53 | "bytes": 2, 54 | "offset": 4, 55 | "dplaces": 0, 56 | "name": "CLIENT-TYPE", 57 | "part-key": false, 58 | "sort-key": false 59 | }, 60 | { 61 | "type": "ch", 62 | "bytes": 30, 63 | "offset": 6, 64 | "dplaces": 0, 65 | "name": "CLIENT-NAME", 66 | "part-key": false, 67 | "sort-key": false 68 | }, 69 | { 70 | "type": "ch", 71 | "bytes": 10, 72 | "offset": 36, 73 | "dplaces": 0, 74 | "name": "CLIENT-BDATE", 75 | "part-key": false, 76 | "sort-key": false 77 | }, 78 | { 79 | "type": "ch", 80 | "bytes": 10, 81 | "offset": 46, 82 | "dplaces": 0, 83 | "name": "CLIENT-ED-LVL", 84 | "part-key": false, 85 | "sort-key": false 86 | }, 87 | { 88 | "type": "pd", 89 | "bytes": 5, 90 | "offset": 56, 91 | "dplaces": 2, 92 | "name": "CLIENT-INCOME", 93 | "part-key": false, 94 | "sort-key": false 95 | }, 96 | { 97 | "type": "ch", 98 | "bytes": 439, 99 | "offset": 61, 100 | "dplaces": 0, 101 | "name": "FILLER_1", 102 | "part-key": false, 103 | "sort-key": false 104 | } 105 | ], 106 | "transf1": [ 107 | { 108 | "type": "bi", 109 | "bytes": 4, 110 | "offset": 0, 111 | "dplaces": 0, 112 | "name": "CLIENT-ID", 113 | "part-key": false, 114 | "sort-key": false 115 | }, 116 | { 117 | "type": "bi", 118 | "bytes": 2, 119 | "offset": 4, 120 | "dplaces": 0, 121 | "name": "CLIENT-TYPE", 122 | "part-key": false, 123 | "sort-key": false 124 | }, 125 | { 126 | "type": "bi", 127 | "bytes": 4, 128 | "offset": 6, 129 | "dplaces": 0, 130 | "name": "CLIENT-ADDR-NUMBER", 131 | "part-key": false, 132 | "sort-key": false 133 | }, 134 | { 135 | "type": "ch", 136 | "bytes": 40, 137 | "offset": 10, 138 | "dplaces": 0, 139 | "name": "CLIENT-ADDR-STREET", 140 | "part-key": false, 141 | "sort-key": false 142 | }, 143 | { 144 | "type": "ch", 145 | "bytes": 450, 146 | "offset": 50, 147 | "dplaces": 0, 148 | "name": "FILLER_2", 149 | "part-key": false, 150 | "sort-key": false 151 | } 152 | ], 153 | "transf2": [ 154 | { 155 | "type": "bi", 156 | "bytes": 4, 157 | "offset": 0, 158 | "dplaces": 0, 159 | "name": "CLIENT-ID", 160 | "part-key": false, 161 | "sort-key": false 162 | }, 163 | { 164 | "type": "bi", 165 | "bytes": 2, 166 | "offset": 4, 167 | "dplaces": 0, 168 | "name": "CLIENT-TYPE", 169 | "part-key": false, 170 | "sort-key": false 171 | }, 172 | { 173 | "type": "bi", 174 | "bytes": 4, 175 | "offset": 6, 176 | "dplaces": 0, 177 | "name": "CLIENT-RECORD-COUNT", 178 | "part-key": false, 179 | "sort-key": false 180 | }, 181 | { 182 | "type": "ch", 183 | "bytes": 490, 184 | "offset": 10, 185 | "dplaces": 0, 186 | "name": "FILLER_3", 187 | "part-key": false, 188 | "sort-key": false 189 | } 190 | ] 191 | } -------------------------------------------------------------------------------- /sample-json/COBKS05-list.json: -------------------------------------------------------------------------------- 1 | { 2 | "function": "parse", 3 | "copybook": "LegacyReference/COBKS05.cpy", 4 | "json": "sample-json/COBKS05-list.json", 5 | "json_debug": "", 6 | "input": "sample-data/CLIENT.EBCDIC.txt", 7 | "input_s3": "", 8 | "input_recfm": "fb", 9 | "input_recl": 500, 10 | "output_type": "file", 11 | "output": "sample-data/CLIENT.ASCII.txt", 12 | "output_s3": "", 13 | "output_separator": "|", 14 | "part_k_len": 0, 15 | "part_k_name": "", 16 | "sort_k_len": 0, 17 | "sort_k_name": "", 18 | "req_size": 0, 19 | "print": 20, 20 | "skip": 0, 21 | "max": 0, 22 | "rem_low_values": true, 23 | "rem_spaces": false, 24 | "working_folder": "", 25 | "verbose": true, 26 
| "threads": 1, 27 | "transf_rule": [ 28 | { 29 | "offset": 4, 30 | "size": 2, 31 | "hex": "0002", 32 | "transf": "transf1" 33 | }, 34 | { 35 | "offset": 4, 36 | "size": 2, 37 | "hex": "0000", 38 | "transf": "transf2" 39 | } 40 | ], 41 | "transf": [ 42 | { 43 | "type": "bi", 44 | "bytes": 4, 45 | "offset": 0, 46 | "dplaces": 0, 47 | "name": "CLIENT-ID", 48 | "part-key": false, 49 | "sort-key": false 50 | }, 51 | { 52 | "type": "bi", 53 | "bytes": 2, 54 | "offset": 4, 55 | "dplaces": 0, 56 | "name": "CLIENT-TYPE", 57 | "part-key": false, 58 | "sort-key": false 59 | }, 60 | { 61 | "type": "ch", 62 | "bytes": 30, 63 | "offset": 6, 64 | "dplaces": 0, 65 | "name": "CLIENT-NAME", 66 | "part-key": false, 67 | "sort-key": false 68 | }, 69 | { 70 | "type": "ch", 71 | "bytes": 10, 72 | "offset": 36, 73 | "dplaces": 0, 74 | "name": "CLIENT-BDATE", 75 | "part-key": false, 76 | "sort-key": false 77 | }, 78 | { 79 | "type": "ch", 80 | "bytes": 10, 81 | "offset": 46, 82 | "dplaces": 0, 83 | "name": "CLIENT-ED-LVL", 84 | "part-key": false, 85 | "sort-key": false 86 | }, 87 | { 88 | "type": "pd", 89 | "bytes": 5, 90 | "offset": 56, 91 | "dplaces": 2, 92 | "name": "CLIENT-INCOME", 93 | "part-key": false, 94 | "sort-key": false 95 | }, 96 | { 97 | "type": "ch", 98 | "bytes": 439, 99 | "offset": 61, 100 | "dplaces": 0, 101 | "name": "FILLER_1", 102 | "part-key": false, 103 | "sort-key": false 104 | } 105 | ], 106 | "transf1": [ 107 | { 108 | "type": "bi", 109 | "bytes": 4, 110 | "offset": 0, 111 | "dplaces": 0, 112 | "name": "CLIENT-ID", 113 | "part-key": false, 114 | "sort-key": false 115 | }, 116 | { 117 | "type": "bi", 118 | "bytes": 2, 119 | "offset": 4, 120 | "dplaces": 0, 121 | "name": "CLIENT-TYPE", 122 | "part-key": false, 123 | "sort-key": false 124 | }, 125 | { 126 | "type": "bi", 127 | "bytes": 4, 128 | "offset": 6, 129 | "dplaces": 0, 130 | "name": "CLIENT-ADDR-NUMBER", 131 | "part-key": false, 132 | "sort-key": false 133 | }, 134 | { 135 | "type": "ch", 136 | "bytes": 40, 137 | "offset": 10, 138 | "dplaces": 0, 139 | "name": "CLIENT-ADDR-STREET", 140 | "part-key": false, 141 | "sort-key": false 142 | }, 143 | { 144 | "type": "ch", 145 | "bytes": 450, 146 | "offset": 50, 147 | "dplaces": 0, 148 | "name": "FILLER_2", 149 | "part-key": false, 150 | "sort-key": false 151 | } 152 | ], 153 | "transf2": [ 154 | { 155 | "type": "bi", 156 | "bytes": 4, 157 | "offset": 0, 158 | "dplaces": 0, 159 | "name": "CLIENT-ID", 160 | "part-key": false, 161 | "sort-key": false 162 | }, 163 | { 164 | "type": "bi", 165 | "bytes": 2, 166 | "offset": 4, 167 | "dplaces": 0, 168 | "name": "CLIENT-TYPE", 169 | "part-key": false, 170 | "sort-key": false 171 | }, 172 | { 173 | "type": "bi", 174 | "bytes": 4, 175 | "offset": 6, 176 | "dplaces": 0, 177 | "name": "CLIENT-RECORD-COUNT", 178 | "part-key": false, 179 | "sort-key": false 180 | }, 181 | { 182 | "type": "ch", 183 | "bytes": 490, 184 | "offset": 10, 185 | "dplaces": 0, 186 | "name": "FILLER_3", 187 | "part-key": false, 188 | "sort-key": false 189 | } 190 | ] 191 | } -------------------------------------------------------------------------------- /sample-json/COBKS05-rules.json: -------------------------------------------------------------------------------- 1 | { 2 | "function": "parse", 3 | "copybook": "LegacyReference/COBKS05.cpy", 4 | "json": "sample-json/COBKS05-list.json", 5 | "json_debug": "", 6 | "input": "sample-data/CLIENT.EBCDIC.txt", 7 | "input_s3": "", 8 | "input_recfm": "fb", 9 | "input_recl": 500, 10 | "output_type": "file", 11 | "output": 
"sample-data/CLIENT.ASCII.txt", 12 | "output_s3": "", 13 | "output_separator": "|", 14 | "part_k_len": 0, 15 | "part_k_name": "", 16 | "sort_k_len": 0, 17 | "sort_k_name": "", 18 | "req_size": 0, 19 | "print": 20, 20 | "skip": 0, 21 | "max": 0, 22 | "rem_low_values": true, 23 | "rem_spaces": false, 24 | "working_folder": "", 25 | "verbose": true, 26 | "threads": 1, 27 | "transf_rule": [ 28 | { 29 | "offset": 4, 30 | "size": 2, 31 | "hex": "0002", 32 | "transf": "transf1" 33 | }, 34 | { 35 | "offset": 4, 36 | "size": 2, 37 | "hex": "0000", 38 | "transf": "transf2" 39 | } 40 | ], 41 | "transf": [ 42 | { 43 | "type": "bi", 44 | "bytes": 4, 45 | "offset": 0, 46 | "dplaces": 0, 47 | "name": "CLIENT-ID", 48 | "part-key": false, 49 | "sort-key": false 50 | }, 51 | { 52 | "type": "bi", 53 | "bytes": 2, 54 | "offset": 4, 55 | "dplaces": 0, 56 | "name": "CLIENT-TYPE", 57 | "part-key": false, 58 | "sort-key": false 59 | }, 60 | { 61 | "type": "ch", 62 | "bytes": 30, 63 | "offset": 6, 64 | "dplaces": 0, 65 | "name": "CLIENT-NAME", 66 | "part-key": false, 67 | "sort-key": false 68 | }, 69 | { 70 | "type": "ch", 71 | "bytes": 10, 72 | "offset": 36, 73 | "dplaces": 0, 74 | "name": "CLIENT-BDATE", 75 | "part-key": false, 76 | "sort-key": false 77 | }, 78 | { 79 | "type": "ch", 80 | "bytes": 10, 81 | "offset": 46, 82 | "dplaces": 0, 83 | "name": "CLIENT-ED-LVL", 84 | "part-key": false, 85 | "sort-key": false 86 | }, 87 | { 88 | "type": "pd", 89 | "bytes": 5, 90 | "offset": 56, 91 | "dplaces": 2, 92 | "name": "CLIENT-INCOME", 93 | "part-key": false, 94 | "sort-key": false 95 | }, 96 | { 97 | "type": "ch", 98 | "bytes": 439, 99 | "offset": 61, 100 | "dplaces": 0, 101 | "name": "FILLER_1", 102 | "part-key": false, 103 | "sort-key": false 104 | } 105 | ], 106 | "transf1": [ 107 | { 108 | "type": "bi", 109 | "bytes": 4, 110 | "offset": 0, 111 | "dplaces": 0, 112 | "name": "CLIENT-ID", 113 | "part-key": false, 114 | "sort-key": false 115 | }, 116 | { 117 | "type": "bi", 118 | "bytes": 2, 119 | "offset": 4, 120 | "dplaces": 0, 121 | "name": "CLIENT-TYPE", 122 | "part-key": false, 123 | "sort-key": false 124 | }, 125 | { 126 | "type": "bi", 127 | "bytes": 4, 128 | "offset": 6, 129 | "dplaces": 0, 130 | "name": "CLIENT-ADDR-NUMBER", 131 | "part-key": false, 132 | "sort-key": false 133 | }, 134 | { 135 | "type": "ch", 136 | "bytes": 40, 137 | "offset": 10, 138 | "dplaces": 0, 139 | "name": "CLIENT-ADDR-STREET", 140 | "part-key": false, 141 | "sort-key": false 142 | }, 143 | { 144 | "type": "ch", 145 | "bytes": 450, 146 | "offset": 50, 147 | "dplaces": 0, 148 | "name": "FILLER_2", 149 | "part-key": false, 150 | "sort-key": false 151 | } 152 | ], 153 | "transf2": [ 154 | { 155 | "type": "bi", 156 | "bytes": 4, 157 | "offset": 0, 158 | "dplaces": 0, 159 | "name": "CLIENT-ID", 160 | "part-key": false, 161 | "sort-key": false 162 | }, 163 | { 164 | "type": "bi", 165 | "bytes": 2, 166 | "offset": 4, 167 | "dplaces": 0, 168 | "name": "CLIENT-TYPE", 169 | "part-key": false, 170 | "sort-key": false 171 | }, 172 | { 173 | "type": "bi", 174 | "bytes": 4, 175 | "offset": 6, 176 | "dplaces": 0, 177 | "name": "CLIENT-RECORD-COUNT", 178 | "part-key": false, 179 | "sort-key": false 180 | }, 181 | { 182 | "type": "ch", 183 | "bytes": 490, 184 | "offset": 10, 185 | "dplaces": 0, 186 | "name": "FILLER_3", 187 | "part-key": false, 188 | "sort-key": false 189 | } 190 | ] 191 | } -------------------------------------------------------------------------------- /split_ebcdic_file.py: 
-------------------------------------------------------------------------------- 1 | # Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 2 | # SPDX-License-Identifier: Apache-2.0 3 | # How to run: split_ebcdic_file.py -local-json sample-data/COBKS05-split.json 4 | 5 | import utils as utils, sys, boto3, json 6 | 7 | log = utils.Log() 8 | 9 | def getRDW(b: bytearray): 10 | return int("0x" + b[:2].hex(), 0) - 4 if len(b) > 0 else 0 11 | 12 | def stats(reads, rules, writes): 13 | 14 | log.Write(['Records read', str(reads)]) 15 | 16 | for rule in rules: 17 | log.Write(['Records written', rule['file'], str(writes[rule['file']])]) 18 | 19 | def run(inputfile, lrecl, split_rule, bucket = '', max = 0, skip = 0, print=0, recfm='fb'): 20 | 21 | if bucket != '': 22 | Input = boto3.client('s3').get_object(Bucket=bucket, Key=inputfile)['Body'] 23 | else: 24 | Input=open(inputfile,"rb") 25 | 26 | output = {} 27 | ctWrit = {} 28 | ctRead = 0 29 | 30 | for rule in split_rule: 31 | output[rule['file']] = open(rule['file'],'wb') 32 | ctWrit[rule['file']] = 0 33 | 34 | i=0 35 | while max == 0 or i < max: 36 | 37 | if recfm == 'fb': 38 | rdw = bytearray() 39 | record = Input.read(lrecl) 40 | else: 41 | l = getRDW(rdw:= Input.read(4)) 42 | record = Input.read(l) 43 | 44 | if not record: break 45 | 46 | ctRead += 1 47 | 48 | i+= 1 49 | if i > skip: 50 | 51 | if (print != 0 and i % print == 0): stats(ctRead, split_rule, ctWrit) 52 | 53 | if len(split_rule) == 0: raise Exception('Please define split rules') 54 | 55 | for r in split_rule: 56 | if utils.cond[r['cond']](record[r['offset']:r['offset']+r['size']].hex() , r['hex'].lower()): 57 | output[r['file']].write(rdw + record) 58 | ctWrit[r['file']] += 1 59 | 60 | for rule in split_rule: 61 | if rule['bucket'] != '' and rule['key'] != '': 62 | boto3.client('s3').put_object(Body=open(rule['file'], 'rb'), Bucket=rule['bucket'], Key=rule['key']) 63 | 64 | stats(ctRead, split_rule, ctWrit) 65 | 66 | if __name__ == '__main__': 67 | 68 | arg = dict(zip(sys.argv[1::2], sys.argv[2::2])) 69 | 70 | if not '-local-json' in arg: 71 | raise Exception('Error! 
Syntax must be: python3 ' + sys.argv[0] + ' -local-json path/to/layout.json') 72 | 73 | with open(arg['-local-json']) as json_file: param = json.load(json_file) 74 | 75 | run(param['input-file'],param['lrecl'],param['split-rules'], param['input-bucket'], param['max'], param['skip'],param['print'],param['recfm']) -------------------------------------------------------------------------------- /src/bincompare.json: -------------------------------------------------------------------------------- 1 | { 2 | "max": 0, 3 | "lrecl": 24, 4 | "printeach": 1000000, 5 | "ebcdic": "my.ebcdic.txt", 6 | "ascii": "my.asccii.txt", 7 | "report": "report.txt", 8 | "layout": [ 9 | { 10 | "type": "bin", 11 | "val": 6, 12 | "description": "CPF" 13 | }, 14 | { 15 | "type": "nobin", 16 | "val": 7, 17 | "description": "LIN" 18 | }, 19 | { 20 | "type": "skip", 21 | "val": 11, 22 | "description": "DTIN" 23 | } 24 | ] 25 | } -------------------------------------------------------------------------------- /src/bincompare.py: -------------------------------------------------------------------------------- 1 | import datetime, json, sys 2 | 3 | print (datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S.%f"), ": started") 4 | 5 | with open(sys.argv[1]) as json_file: param = json.load(json_file) 6 | 7 | InpA=open(param["ascii"],"rb") 8 | InpE=open(param["ebcdic"],"rb") 9 | 10 | i=0 11 | 12 | while i < param["max"] or param["max"] == 0: 13 | 14 | liASCII = InpA.read(param["lrecl"]) 15 | liEBCDI = InpE.read(param["lrecl"]) 16 | 17 | biASCII = b'' 18 | biEBCDI = b'' 19 | 20 | if not liEBCDI and not liASCII: break 21 | 22 | i += 1 23 | if(i % param["printeach"] == 0): print(datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S.%f") + " : " + str(i)) 24 | 25 | ini = 0 26 | for layout in param["layout"]: 27 | 28 | fim = layout["val"] 29 | 30 | if layout["type"] == "bin": 31 | biASCII += liASCII[ini:fim] 32 | biEBCDI += liEBCDI[ini:fim] 33 | elif layout["type"] == "nobin": 34 | biASCII += liASCII[ini:fim].decode('utf-8').encode('cp037') 35 | biEBCDI += liEBCDI[ini:fim] 36 | 37 | ini = fim 38 | 39 | biASCII += liASCII[ini:] 40 | biEBCDI += liEBCDI[ini:] 41 | 42 | if biASCII != biEBCDI: 43 | print("ASCII.: ", i) 44 | print(liASCII) 45 | print("EBCDIC: ", i) 46 | print(liEBCDI) 47 | print("==========================================================================================================================================================================") 48 | 49 | print (datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S.%f"), ": ended") -------------------------------------------------------------------------------- /src/core/cli.py: -------------------------------------------------------------------------------- 1 | import argparse 2 | import json 3 | import os 4 | 5 | class CommandLine: 6 | 7 | def __init__(self, _arg=[]): 8 | 9 | parser = {} 10 | 11 | with open(os.path.dirname(os.path.realpath(__file__)) + '//config.json', 'r') as configfile: 12 | config = json.load(configfile) 13 | 14 | main_arg = argparse.ArgumentParser() 15 | 16 | main_function = main_arg.add_subparsers(title="Function", dest="function", required=True) 17 | 18 | for func in config: 19 | 20 | parser[func] = main_function.add_parser(func, help=config[func]['help']) 21 | 22 | for arg in config[func]['argument']: 23 | 24 | parser[func].add_argument( 25 | arg, 26 | default=config[func]['argument'][arg]['default'], 27 | help=config[func]['argument'][arg]['help'], 28 | type=type(config[func]['argument'][arg]['default']) 29 | ) 30 | 31 | if _arg == []: 
self.args = main_arg.parse_args() 33 | else: 34 | self.args = main_arg.parse_args(_arg) 35 | 36 | if self.args.verbose: 37 | self.verbose = True 38 | print('CLI arguments', vars(self.args)) 39 | else: 40 | self.verbose = False -------------------------------------------------------------------------------- /src/core/copybook.py: -------------------------------------------------------------------------------- 1 | # Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 2 | # SPDX-License-Identifier: Apache-2.0 3 | 4 | # FUNCTIONS TO HANDLE THE HIERARCHICAL STACK # 5 | def fGetSetack(): 6 | global stack, output 7 | tmp = output 8 | for k in stack: 9 | tmp = tmp[stack[k]] 10 | return tmp 11 | 12 | def fRemStack(iStack,iLevel): 13 | NewStack = {} 14 | for k in iStack: 15 | if k < iLevel: NewStack[k] = iStack[k] 16 | return NewStack 17 | 18 | #PIC 999, 9(3), XXX, X(3)... 19 | def getPicSize(arg): 20 | if arg.find("(") > 0: 21 | return int(arg[arg.find("(")+1:arg.find(")")]) 22 | else: 23 | return len(arg) 24 | 25 | # TYPE AND LENGTH CALCULATION # 26 | def getLenType(atr, p): 27 | ret = {} 28 | #FirstCh = atr[3][:1].upper() 29 | #Picture = atr[3].upper() 30 | FirstCh = atr[p][:1].upper() 31 | Picture = atr[p].upper() 32 | 33 | #data type 34 | if 'COMP-3'in atr and FirstCh=='S': ret['type'] = "pd+" 35 | elif 'COMP-3'in atr: ret['type'] = "pd" 36 | elif 'COMP' in atr and FirstCh=='S': ret['type'] = "bi+" 37 | elif 'COMP' in atr: ret['type'] = "bi" 38 | elif FirstCh=='S': ret['type'] = "zd+" 39 | elif FirstCh=='9': ret['type'] = "zd" 40 | else: ret['type'] = "ch" 41 | 42 | #Total data length 43 | PicNum = Picture.replace("V"," ").replace("S","").replace("-","").split() 44 | 45 | Lgt = getPicSize(PicNum[0]) 46 | 47 | if len(PicNum) == 1 and FirstCh !='V': 48 | ret['dplaces'] = 0 49 | elif FirstCh !='V': 50 | ret['dplaces'] = getPicSize(PicNum[1]) 51 | Lgt += ret['dplaces'] 52 | else: 53 | ret['dplaces'] = getPicSize(PicNum[0]) 54 | 55 | ret['length'] = Lgt 56 | 57 | #Data size in bytes 58 | if ret['type'][:2] == "pd": ret['bytes'] = int(Lgt/2)+1 59 | elif ret['type'][:2] == "bi": 60 | if Lgt < 5: ret['bytes'] = 2 61 | elif Lgt < 10: ret['bytes'] = 4 62 | else : ret['bytes'] = 8 63 | else: 64 | if FirstCh=='-': Lgt += 1 65 | ret['bytes'] = Lgt 66 | 67 | return ret 68 | 69 | ############# DICTIONARY AND HIERARCHICAL LOGIC ########################### 70 | def add2dict(lvl, grp, itm, stt, id): 71 | 72 | global cur, output, last, stack, FillerCount 73 | 74 | if itm.upper() == "FILLER": 75 | FillerCount += 1 76 | itm = itm + "_" + str(FillerCount) 77 | 78 | if lvl <= cur: stack = fRemStack(stack, lvl) 79 | 80 | stk = fGetSetack() 81 | stk[itm]= {} 82 | stk[itm]['id'] = id 83 | stk[itm]['level'] = lvl 84 | stk[itm]['group'] = grp 85 | 86 | if 'OCCURS' in stt: 87 | if 'TIMES' in stt: 88 | stk[itm]['occurs'] = int(stt[stt.index('TIMES')-1]) 89 | else: 90 | raise Exception('OCCURS WITHOUT TIMES?' 
+ ' '.join(stt)) 91 | 92 | if 'REDEFINES'in stt: stk[itm]['redefines'] = stt[stt.index('REDEFINES')+1] 93 | 94 | if grp == True: 95 | stack[lvl] = itm 96 | cur = lvl 97 | else: 98 | tplen = {} 99 | pic = stt.index('PIC')+1 100 | tplen = getLenType(stt, pic) 101 | #stk[itm]['pict'] = stt[3] 102 | stk[itm]['pict'] = stt[pic] 103 | stk[itm]['type'] = tplen['type'] 104 | stk[itm]['length'] = tplen['length'] 105 | stk[itm]['bytes'] = tplen['bytes'] 106 | stk[itm]['dplaces'] = tplen['dplaces'] 107 | 108 | ############################### MAIN ################################### 109 | # READS, CLEANS AND JOINS LINES # 110 | 111 | FillerCount=0 112 | cur=0 113 | output={} 114 | stack = {} 115 | 116 | def toDict(lines): 117 | 118 | id = 0 119 | stt = "" 120 | for line in lines: 121 | if len(line[6:72].strip()) > 1: 122 | if line[6] in [' ' , '-']: 123 | if not line[6:72].split()[0] in ['SKIP1','SKIP2','SKIP3']: 124 | stt += line[6:72].replace('\t', ' ') 125 | elif line[6] != '*': 126 | print('Unexpected character in column 7:', line) 127 | quit() 128 | 129 | # READS FIELD BY FIELD / SPLITS ATTRIBUTES # 130 | for variable in stt.split("."): 131 | 132 | attribute=variable.split() 133 | 134 | if len(attribute) > 0: 135 | if attribute[0] != '88': 136 | id += 1 137 | add2dict(int(attribute[0]), False if 'PIC'in attribute else True, attribute[1], attribute, id) 138 | 139 | return output -------------------------------------------------------------------------------- /src/core/ebcdic.py: -------------------------------------------------------------------------------- 1 | # Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 2 | # SPDX-License-Identifier: Apache-2.0 3 | 4 | HighestPositive = "7fffffffffffffffffff" 5 | 6 | def unpack(bytes: bytearray, type: str, dec_places: int, rem_lv: bool, rem_spaces: bool = False): 7 | # 8 | # Formats ebcdic text, zoned, big endian binary or decimal data into unpacked/string ascii data. 
9 | # 10 | # Parameters: 11 | # - bytes (bytearray): The content to be extracted 12 | # - type (str)......: 13 | # - ch : text | pic x 14 | # - zd : zoned | pic 9 15 | # - zd+: signed zoned | pic s9 16 | # - bi : binary | pic 9 comp 17 | # - bi+: signed binary | pic s9 comp 18 | # - pd : packed-decimal | pic 9 comp-3 19 | # - pd+: signed packed-decimal| pic s9 comp-3 20 | # 21 | # Returns: 22 | # - ascii string 23 | # 24 | # Test sample: 25 | # import struct 26 | # ori = 9223372036854775807 27 | # print(ori, unpack(struct.pack(">q",ori),"bi+")) 28 | # ori = ori * -1 29 | # print(ori, unpack(struct.pack(">q",ori),"bi+")) 30 | # print(unpack(bytearray.fromhex("f0f0f1c1"), "zd+")) 31 | # 32 | # Input examples: 33 | # - 8 bytes comp-signed struct q: -9,223,372,036,854,775,808 through +9,223,372,036,854,775,807 34 | # - 8 bytes comp-unsigned struct Q: 0 through 18,446,744,073,709,551,615 35 | # - 4 bytes comp-signed struct i: -2147483648 through +2147483647 36 | # - 4 bytes comp-unsigned struct I: 0 through +4294967295 37 | # - 2 bytes comp-signed struct h: -32768 through +32767 38 | # - 2 bytes comp-unsigned struct H: 0 through +65535 39 | 40 | if type.lower() == "ch": 41 | return bytes.decode('cp037').replace('\x00', '').rstrip() if rem_lv == True else bytes.decode('cp037') 42 | 43 | elif type.lower() == "pd" or type.lower() == "pd+": 44 | return AddDecPlaces(("" if bytes.hex()[-1:] != "d" and bytes.hex()[-1:] != "b" else "-") + bytes.hex()[:-1], dec_places) 45 | 46 | elif type.lower() == "bi" or (type.lower() == "bi+" and bytes.hex() <= HighestPositive[:len(bytes)*2]): 47 | return AddDecPlaces(str(int("0x" + bytes.hex(), 0)),dec_places) 48 | 49 | elif type.lower() == "bi+": 50 | return AddDecPlaces(str(int("0x" + bytes.hex(), 0) - int("0x" + len(bytes) * 2 * "f", 0) -1), dec_places) 51 | 52 | if type.lower() == "zd": 53 | return AddDecPlaces(bytes.decode('cp037').replace('\x00', '').rstrip(), dec_places) 54 | 55 | elif type.lower() == "zd+": 56 | return AddDecPlaces(("" if bytes.hex()[-2:-1] != "d" else "-") + bytes[:-1].decode('cp037') + bytes.hex()[-1:], dec_places) 57 | 58 | elif type.lower() == "hex": 59 | return bytes.hex() 60 | 61 | elif type.lower() == "bit": 62 | return ';'.join( bin(bytes[0]).replace("0b", "").zfill(len(bytes)*8)) 63 | 64 | else: 65 | print("---------------------------\nLength & Type not supported\nLength: ",len(bytes),"\nType..: " ,type) 66 | exit() 67 | 68 | def AddDecPlaces(value, dplaces) -> int: 69 | 70 | if dplaces == 0: return value 71 | 72 | return value[:len(value)-dplaces] + '.' + value[len(value)-dplaces:] -------------------------------------------------------------------------------- /src/core/filemeta.py: -------------------------------------------------------------------------------- 1 | # Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 
2 | # SPDX-License-Identifier: Apache-2.0 3 | 4 | import json 5 | import boto3 6 | 7 | class FileMetaData: 8 | 9 | def __init__(self, log, args): 10 | 11 | if args.json_s3 == '': 12 | json_local = args.working_folder + args.json 13 | 14 | with open(json_local) as json_file: 15 | self.general = json.load(json_file) 16 | else: 17 | json_local = args.working_folder + args.json.split("/")[-1] 18 | 19 | log.Write(['Downloading json metadata from s3', args.json_s3, args.json]) 20 | 21 | #pending try / except 22 | self.general = json.load(boto3.client('s3').get_object(Bucket=args.json_s3, Key=args.json)['Body']) 23 | 24 | #override and validate json parameters 25 | if args.input != '': self.general['input'] = args.input 26 | 27 | if args.input_s3 != '': 28 | self.general['input_s3'] = args.input_s3 29 | elif 'input_s3' not in self.general: 30 | self.general['input_s3'] = '' 31 | 32 | if args.output_s3 != '': 33 | self.general['output_s3'] = args.output_s3 34 | elif 'output_s3' not in self.general: 35 | self.general['output_s3'] = '' 36 | 37 | if args.working_folder != '': 38 | self.general['working_folder'] = args.working_folder 39 | if self.general['working_folder'][-1] != '/': self.general['working_folder'] += '/' 40 | elif 'working_folder' not in self.general: 41 | self.general['working_folder'] = '' 42 | 43 | if args.input_s3_url != '': 44 | self.general['input_s3_url'] = args.input_s3_url 45 | self.general['input_s3_route'] = args.input_s3_route 46 | self.general['input_s3_token'] = args.input_s3_token 47 | else: 48 | self.general['input_s3_url'] = '' 49 | 50 | # new parameter to define parallelism 51 | if 'threads' not in self.general: 52 | self.general['threads'] = 1 53 | 54 | #identify the input file source 55 | if self.general['input_s3_url'] != '': 56 | self.inputtype = 's3_url' 57 | elif self.general['input_s3'] != '': 58 | self.inputtype = 's3' 59 | else: 60 | self.inputtype = 'local' 61 | 62 | self.rules = [] 63 | 64 | for paramrule in self.general['transf_rule']: 65 | self.rules.append(TransformationRule( 66 | paramrule['offset'], 67 | paramrule['size'], 68 | paramrule['hex'], 69 | paramrule['transf'], 70 | paramrule['skip'] if 'skip' in paramrule else False 71 | ) 72 | ) 73 | 74 | def GetLayout(self, _data): 75 | if len(self.rules) == 0: return self.general['transf'] 76 | 77 | for r in self.rules: 78 | if _data[r.offset:r.end].hex() == r.hexv.lower(): 79 | return self.general[r.transf] 80 | 81 | return self.general['transf'] 82 | 83 | def Layout(self, _data): 84 | if len(self.rules) == 0: return 'transf' 85 | 86 | for r in self.rules: 87 | if _data[r.offset:r.end].hex() == r.hexv.lower(): 88 | return r.transf 89 | 90 | return 'transf' 91 | 92 | class TransformationRule: 93 | def __init__(self, _offset, _size, _hex, _transf, _skip): 94 | self.offset = _offset 95 | self.size = _size 96 | self.end = _offset + _size 97 | self.hexv = _hex 98 | self.transf = _transf 99 | self.skip = _skip -------------------------------------------------------------------------------- /src/core/log.py: -------------------------------------------------------------------------------- 1 | # Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 
2 | # SPDX-License-Identifier: Apache-2.0 3 | 4 | import datetime 5 | 6 | class Log: 7 | def __init__(self, verbose = False) -> None: 8 | self.start = datetime.datetime.now() 9 | if verbose: self.verbose = True 10 | print(self.start.strftime("%Y-%m-%d %H:%M:%S.%f") ,'|',' | '.join(['started'])) 11 | 12 | def Finish(self): 13 | sec = str((datetime.datetime.now() - self.start).total_seconds()) 14 | print(datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S.%f") ,'Seconds', sec) 15 | 16 | def Write (self, content=[]): 17 | if len(content) > 0: 18 | print(datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S.%f") ,'|',' | '.join(content)) 19 | else: 20 | print("------------------------------------------------------------------") -------------------------------------------------------------------------------- /src/core/parsecp.py: -------------------------------------------------------------------------------- 1 | import json, sys, core.copybook as copybook 2 | 3 | transf = [] 4 | altlay = [] 5 | lrecl = 0 6 | altpos = 0 7 | partklen = 0 8 | sortklen = 0 9 | 10 | ###### Create the extraction parameter file 11 | def CreateExtraction(obj, altstack=[], partklen=0, sortklen=0): 12 | global lrecl 13 | global altpos 14 | for k in obj: 15 | if type(obj[k]) is dict: 16 | t = 1 if 'occurs' not in obj[k] else obj[k]['occurs'] 17 | 18 | iTimes = 0 19 | while iTimes < t: 20 | iTimes +=1 21 | 22 | if 'redefines' not in obj[k]: 23 | if obj[k]['group'] == True: 24 | altstack.append(k) 25 | CreateExtraction(obj[k], altstack, partklen, sortklen) 26 | altstack.remove(k) 27 | else: 28 | item = {} 29 | item['type'] = obj[k]['type'] 30 | item['bytes'] = obj[k]['bytes'] 31 | item['offset'] = lrecl 32 | item['dplaces'] = obj[k]['dplaces'] 33 | item['name'] = k 34 | item['part-key'] = True if (lrecl + obj[k]['bytes']) <= partklen else False 35 | item['sort-key'] = True if (lrecl + obj[k]['bytes']) <= (sortklen + partklen) and (lrecl + obj[k]['bytes']) > partklen else False 36 | transf.append(item) 37 | 38 | lrecl = lrecl + obj[k]['bytes'] 39 | else: 40 | add2alt = True 41 | for x in altlay: 42 | if x[list(x)[0]]['newname'] == k: 43 | add2alt = False 44 | if add2alt: 45 | red = {} 46 | red[obj[k]['redefines']] = obj[k].copy() 47 | red[obj[k]['redefines']]['newname'] = k 48 | red[obj[k]['redefines']]['stack'] = altstack.copy() 49 | altpos+=1 50 | altlay.insert(altpos,red) 51 | 52 | ############################### MAIN ################################### 53 | 54 | def RunParse(log, iparm): 55 | 56 | global transf 57 | global lrecl 58 | 59 | # Open the copybook for reading and creates the dictionary 60 | with open(iparm.copybook, 'r') as finp: output = copybook.toDict(finp.readlines()) 61 | 62 | # Write the dict into a file if requested 63 | if iparm.json_debug != '': 64 | with open(iparm.json_debug,"w") as fout: 65 | fout.write(json.dumps(output,indent=4)) 66 | 67 | # get the default values 68 | param = vars(iparm) 69 | 70 | partklen = iparm.part_k_len 71 | sortklen = iparm.sort_k_len 72 | 73 | CreateExtraction(output, [], partklen, sortklen) 74 | 75 | param['input_recl'] = lrecl 76 | param['transf_rule'] = [] 77 | param['transf'] = transf 78 | 79 | ialt = 0 80 | for r in altlay: 81 | transf = [] 82 | lrecl = 0 83 | redfkey = list(r.keys())[0] 84 | 85 | #POSITIONS ON REDEFINES 86 | newout = output 87 | for s in r[redfkey]['stack']: 88 | newout = newout[s] 89 | 90 | newout[redfkey] = r[redfkey].copy() 91 | newout[redfkey].pop('redefines') 92 | 93 | altpos = ialt 94 | CreateExtraction(output, [], partklen, sortklen) 95 | ialt 
--------------------------------------------------------------------------------
/src/lambda_function.py:
--------------------------------------------------------------------------------
1 | # Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
2 | # SPDX-License-Identifier: Apache-2.0
3 | import json
4 | import os
5 | import core.extract as Extract
6 | from core.log import Log
7 | from core.cli import CommandLine
8 | 
9 | json_s3 = os.environ.get('json_s3')
10 | json_pre = os.environ.get('json_pre')
11 | 
12 | def lambda_handler(event, context):
13 | 
14 |     bkt = event['Records'][0]['s3']['bucket']['name']
15 |     key = event['Records'][0]['s3']['object']['key']
16 | 
17 |     json_file = json_pre + key.split('/')[-1].split('.')[0] + '.json'  # avoid shadowing the json module
18 | 
19 |     cli = CommandLine(
20 |         [
21 |             'extract', json_file,
22 |             '-json-s3', json_s3,
23 |             '-input', key,
24 |             '-input-s3', bkt
25 |         ]
26 |     )
27 | 
28 |     # -output and -output-s3 can be added to override the JSON content.
29 | 
30 |     log = Log(cli.verbose)
31 | 
32 |     Extract.FileProcess(log, cli.args)
33 | 
34 |     return {'statusCode': 200}
35 | 
36 | def s3_obj_lambda_handler(event, context):
37 | 
38 |     request_route = event["getObjectContext"]["outputRoute"]
39 |     request_token = event["getObjectContext"]["outputToken"]
40 |     s3_url = event["getObjectContext"]["inputS3Url"]
41 |     key = event['userRequest']['url']
42 | 
43 |     json_file = json_pre + key.split('/')[-1].split('.')[0] + '.json'
44 | 
45 |     cli = CommandLine(
46 |         [
47 |             'extract', json_file,
48 |             '-json-s3', json_s3,
49 |             '-input-s3-url', s3_url,
50 |             '-input-s3-route', request_route,
51 |             '-input-s3-token', request_token
52 |         ]
53 |     )
54 | 
55 |     log = Log(cli.verbose)
56 | 
57 |     Extract.FileProcess(log, cli.args)
58 |     return {'statusCode': 200}
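
lambda_handler is driven by an S3 put notification; a minimal sketch of the event fields it actually reads looks like the following, with placeholder bucket and key names. The json_s3 and json_pre environment variables point at the bucket and key prefix of the pre-generated parameter JSON, so with json_pre set to, say, param/, this event makes the handler load param/CLIENT.json.

    {
        "Records": [
            {
                "s3": {
                    "bucket": { "name": "my-ebcdic-input-bucket" },
                    "object": { "key": "incoming/CLIENT.EBCDIC.txt" }
                }
            }
        ]
    }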
--------------------------------------------------------------------------------
/src/mdu.py:
--------------------------------------------------------------------------------
1 | #!/usr/local/bin/python3
2 | # Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
3 | # SPDX-License-Identifier: Apache-2.0
4 | 
5 | from core.log import Log
6 | from core.cli import CommandLine
7 | import core.extract as Extract
8 | import core.parsecp as Parsecp
9 | 
10 | import sys
11 | 
12 | def main(arg):
13 | 
14 |     cli = CommandLine()
15 | 
16 |     log = Log(cli.verbose)
17 | 
18 |     if cli.args.function == 'extract':
19 |         Extract.FileProcess(log, cli.args)
20 |     elif cli.args.function == 'parse':
21 |         Parsecp.RunParse(log, cli.args)
22 |     else:
23 |         log.Write(['not implemented yet'])
24 | 
25 |     log.Finish()
26 | 
27 | if __name__ == '__main__':
28 |     main(sys.argv)
--------------------------------------------------------------------------------
/src/split.py:
--------------------------------------------------------------------------------
1 | # Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
2 | # SPDX-License-Identifier: Apache-2.0
3 | # How to run: python3 split.py -local-json sample-data/COBKS05-split.json
4 | 
5 | import core.utils as utils, sys, boto3, json
6 | 
7 | log = utils.Log()
8 | 
9 | def getRDW(b: bytearray):
10 |     return int("0x" + b[:2].hex(), 0) - 4 if len(b) > 0 else 0
11 | 
12 | def stats(reads, rules, writes):
13 | 
14 |     log.Write(['Records read', str(reads)])
15 | 
16 |     for rule in rules:
17 |         log.Write(['Records written', rule['file'], str(writes[rule['file']])])
18 | 
19 | def run(inputfile, lrecl, split_rule, bucket = '', max = 0, skip = 0, print=0, recfm='fb'):
20 | 
21 |     if bucket != '':
22 |         Input = boto3.client('s3').get_object(Bucket=bucket, Key=inputfile)['Body']
23 |     else:
24 |         Input=open(inputfile,"rb")
25 | 
26 |     output = {}
27 |     ctWrit = {}
28 |     ctRead = 0
29 | 
30 |     for rule in split_rule:
31 |         output[rule['file']] = open(rule['file'],'wb')
32 |         ctWrit[rule['file']] = 0
33 | 
34 |     i=0
35 |     while max == 0 or i < max:
36 | 
37 |         if recfm == 'fb':
38 |             rdw = bytearray()
39 |             record = Input.read(lrecl)
40 |         else:
41 |             l = getRDW(rdw:= Input.read(4))
42 |             record = Input.read(l)
43 | 
44 |         if not record: break
45 | 
46 |         ctRead += 1
47 | 
48 |         i+= 1
49 |         if i > skip:
50 | 
51 |             if (print != 0 and i % print == 0): stats(ctRead, split_rule, ctWrit)
52 | 
53 |             if len(split_rule) == 0: raise Exception('Please define split rules')
54 | 
55 |             for r in split_rule:
56 |                 if utils.cond[r['cond']](record[r['offset']:r['offset']+r['size']].hex() , r['hex'].lower()):
57 |                     output[r['file']].write(rdw + record)
58 |                     ctWrit[r['file']] += 1
59 |     for f in output.values(): f.close()  # flush local output files before any S3 upload
60 |     for rule in split_rule:
61 |         if rule['bucket'] != '' and rule['key'] != '':
62 |             boto3.client('s3').put_object(Body=open(rule['file'], 'rb'), Bucket=rule['bucket'], Key=rule['key'])
63 | 
64 |     stats(ctRead, split_rule, ctWrit)
65 | 
66 | if __name__ == '__main__':
67 | 
68 |     arg = dict(zip(sys.argv[1::2], sys.argv[2::2]))
69 | 
70 |     if not '-local-json' in arg:
71 |         raise Exception('Error! Syntax must be: python3 ' + sys.argv[0] + ' -local-json path/to/layout.json')
72 | 
73 |     with open(arg['-local-json']) as json_file: param = json.load(json_file)
74 | 
75 |     run(param['input-file'],param['lrecl'],param['split-rules'], param['input-bucket'], param['max'], param['skip'],param['print'],param['recfm'])
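
split.py takes its whole configuration from the -local-json file. A minimal sketch of the keys consumed above, with placeholder offsets, lengths and hex values (sample-data/COBKS05-split.json is the working example shipped with the repository):

    {
        "input-file": "sample-data/CLIENT.EBCDIC.txt",
        "input-bucket": "",
        "lrecl": 200,
        "recfm": "fb",
        "max": 0,
        "skip": 0,
        "print": 0,
        "split-rules": [
            {
                "cond": "eq",
                "offset": 0,
                "size": 1,
                "hex": "f1",
                "file": "CLIENT.EBCDIC-1.txt",
                "bucket": "",
                "key": ""
            }
        ]
    }

Each rule compares the record bytes at offset/size, as hex, against hex using the operator named in cond (lt, le, eq, ne, ge, gt from utils.cond); matching records are appended to file and, when bucket and key are non-empty, the finished file is uploaded to S3.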
--------------------------------------------------------------------------------
/utils.py:
--------------------------------------------------------------------------------
1 | import datetime, json, boto3, operator
2 | 
3 | cond = {
4 |     "lt" : operator.lt,
5 |     "le" : operator.le,
6 |     "eq" : operator.eq,
7 |     "ne" : operator.ne,
8 |     "ge" : operator.ge,
9 |     "gt" : operator.gt
10 | }
11 | 
12 | class ParamReader:
13 |     def __init__(self, sysargv):
14 |         l = Log()
15 |         l.Write()
16 | 
17 |         desc = {
18 |             'unknown' : 'Unknown argument ',
19 |             '-local-json' : 'Local Json file ',
20 |             '-s3-json' : 'S3 Json file ',
21 |             '-s3-input' : 'S3 input file ',
22 |             '-s3-output' : 'S3 output file '
23 |         }
24 | 
25 |         arg = dict(zip(sysargv[1::2], sysargv[2::2]))
26 | 
27 |         for a in arg:
28 |             if a in desc:
29 |                 l.Write([desc[a],a,arg[a]])
30 |             else:
31 |                 l.Write([desc['unknown'],a,arg[a]])
32 | 
33 |         if '-s3-json' in arg:
34 |             slash = arg['-s3-json'].find('/',5)
35 |             bucket = arg['-s3-json'][5:slash]
36 |             s3obje = arg['-s3-json'][slash+1:]
37 |             self.general = json.load(boto3.client('s3').get_object(Bucket=bucket, Key=s3obje)['Body'])
38 |         elif '-local-json' in arg:
39 |             with open(arg['-local-json']) as json_file: self.general = json.load(json_file)
40 |         else:
41 |             l.Write(['Error! Syntax must be: python3 ' + sysargv[0] + ' (-local-json path/to/layout.json | -s3-json s3://bucketname/filename)'])
42 |             l.Write()
43 |             quit()
44 | 
45 |         if '-s3-input' in arg: self.general['input'] = arg['-s3-input']
46 |         if '-s3-output' in arg: self.general['output'] = arg['-s3-output']
47 | 
48 |         self.rules = []
49 |         for paramrule in self.general["transf-rule"]:
50 |             self.rules.append(TransformationRule(paramrule["offset"],paramrule["size"],paramrule["hex"],paramrule["transf"]))
51 | 
52 |     def GetLayout(self, _data):
53 |         if len(self.rules) == 0: return self.general['transf']
54 | 
55 |         for r in self.rules:
56 |             if _data[r.offset:r.end].hex() == r.hexv.lower():
57 |                 return self.general[r.transf]
58 | 
59 |         return self.general['transf']
60 | 
61 |     def AddDecPlaces(self, value, dplaces):
62 |         if dplaces == 0: return value
63 | 
64 |         return value[:len(value)-dplaces] + '.' + value[len(value)-dplaces:]
65 | 
66 | class TransformationRule:
67 |     def __init__(self, _offset, _size, _hex, _transf):
68 |         self.offset = _offset
69 |         self.size = _size
70 |         self.end = _offset + _size
71 |         self.hexv = _hex
72 |         self.transf = _transf
73 | 
74 | class S3File:
75 |     def __init__(self, url) -> None:
76 |         slash = url.find('/',5)
77 |         self.bucket = url[5:slash]
78 |         self.s3obje = url[slash+1:]
79 | 
80 | class Log:
81 |     def __init__(self) -> None:
82 |         self.start = datetime.datetime.now()
83 |     def Finish(self):
84 |         sec = str((datetime.datetime.now() - self.start).total_seconds())
85 |         print(datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S.%f") ,'Seconds', sec)
86 |     def Write (self, content=[]):
87 |         if len(content) > 0:
88 |             print(datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S.%f") ,'|',' | '.join(content))
89 |         else:
90 |             print("------------------------------------------------------------------")
--------------------------------------------------------------------------------
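
Finally, a small usage sketch for the root-level helpers above, presumably driven by a script such as extract_ebcdic_to_ascii.py; the record bytes and JSON file are illustrative, and only the flags ParamReader actually handles (-local-json, -s3-json, -s3-input, -s3-output) are assumed.

    import sys
    import utils

    # Invoked as, for example: python3 driver.py -local-json sample-data/CLIENT.json
    # (the JSON is expected to contain at least the "transf" and "transf-rule" keys)
    params = utils.ParamReader(sys.argv)

    record = b'\xf1\xf2\xf3'                     # illustrative EBCDIC record bytes
    layout = params.GetLayout(record)            # 'transf', or an alternative layout selected by a transf-rule
    value = params.AddDecPlaces('0012345', 2)    # -> '00123.45'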