├── .gitignore ├── .travis.yml ├── LICENSE.txt ├── README.md ├── RELEASE.md ├── pytdtest.sh ├── setup.cfg ├── setup.py ├── teradata ├── __init__.py ├── api.py ├── datatypes.py ├── pulljson.py ├── tdodbc.py ├── tdrest.py ├── udaexec.py ├── util.py └── version.py └── test ├── testBteqScript.sql ├── testClobSp.sql ├── testScript.sql ├── testScript2.sql ├── test_pulljson.py ├── test_tdodbc.py ├── test_tdrest.py ├── test_udaexec_config.py ├── test_udaexec_datatypes.py ├── test_udaexec_execute.py ├── testlargeview.sql ├── udaexec.ini └── udaexec2.ini /.gitignore: -------------------------------------------------------------------------------- 1 | *.pyc 2 | /teradata.egg-info/ 3 | -------------------------------------------------------------------------------- /.travis.yml: -------------------------------------------------------------------------------- 1 | language: python 2 | python: 3 | - "2.7" 4 | - "3.4" 5 | install: 6 | - pip install flake8 7 | script: 8 | - "flake8 --show-source teradata" 9 | -------------------------------------------------------------------------------- /LICENSE.txt: -------------------------------------------------------------------------------- 1 | # The MIT License (MIT) 2 | # 3 | # Copyright (c) 2015 by Teradata 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights 8 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | # copies of the Software, and to permit persons to whom the Software is 10 | # furnished to do so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | 2 | [![No Maintenance Intended](http://unmaintained.tech/badge.svg)](http://unmaintained.tech/) 3 | 4 | Deprecation notice 5 | ==== 6 | The driver/package has been deprecated. For new Python driver implementations, consider using https://github.com/Teradata/python-driver. 7 | 8 | 9 | Teradata Python DevOps Module [![Build Status](https://travis-ci.org/Teradata/PyTd.svg?branch=master)](https://travis-ci.org/Teradata/PyTd) 10 | ============================= 11 | 12 | The Teradata Python Module is a freely available, open-source library for the Python programming language, whose aim is to make it easy to script powerful interactions with Teradata Database. It adopts the philosophy of udaSQL, providing a DevOps-focused SQL Execution Engine that allows developers to focus on their SQL and procedural logic without worrying about operational requirements such as external configuration, query banding, and logging.
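As a quick illustration of that philosophy, here is a minimal usage sketch (the system name and credentials below are placeholders, and the Teradata ODBC driver is assumed to be installed):

    import teradata

    # UdaExec handles configuration, logging, and query banding for the app.
    udaExec = teradata.UdaExec(appName="HelloWorld", version="1.0",
                               logConsole=False)

    # Open an ODBC connection and run a simple query.
    with udaExec.connect(method="odbc", system="tdprod",
                         username="dbc", password="dbc") as session:
        for row in session.execute("SELECT GetQueryBand()"):
            print(row)

Connection settings and application metadata can also be supplied through external configuration files (see the udaexec.ini files under test/) instead of being hard-coded.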
13 | 14 | INSTALLATION 15 | ------------ 16 | 17 | [sudo] pip install teradata 18 | 19 | The module is hosted on PyPI: https://pypi.python.org/pypi/teradata 20 | 21 | DOCUMENTATION 22 | ------------- 23 | 24 | Documentation for the Teradata Python Module is available on the Teradata Developer Exchange. 25 | 26 | UNIT TESTS 27 | ---------- 28 | 29 | To execute the unit tests, you can run the following command from the root of the project checkout. 30 | 31 | python -m unittest discover -s test 32 | 33 | The unit tests use the connection information specified in test/udaexec.ini. The unit tests depend on Teradata ODBC being installed and also on access to Teradata REST Services. 34 | -------------------------------------------------------------------------------- /RELEASE.md: -------------------------------------------------------------------------------- 1 | Steps to release: 2 | 3 | 1) Commit and push all changes to GitHub: https://github.com/teradata/PyTd 4 | 5 | 2) Tag the release, e.g. 6 | 7 | git tag -a v15.10.00.03 -m 'Release version 15.10.00.03' 8 | git push origin --tags 9 | 10 | 3) Release to PyPI: 11 | 12 | python setup.py register -r pypi 13 | python setup.py sdist upload -r pypi 14 | 15 | 4) Increment version in teradata/version.py to next release version. 16 | -------------------------------------------------------------------------------- /pytdtest.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | # Runs the pytd unit tests on a remote server that supports ssh. 3 | # The MIT License (MIT) 4 | # 5 | # Copyright (c) 2015 by Teradata 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a copy 8 | # of this software and associated documentation files (the "Software"), to deal 9 | # in the Software without restriction, including without limitation the rights 10 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 11 | # copies of the Software, and to permit persons to whom the Software is 12 | # furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included in all 15 | # copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 18 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 20 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 22 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 23 | # SOFTWARE. 24 | 25 | ################################################################################################################################# 26 | # TEST SERVER SETUP 27 | # --------------------------- 28 | # 29 | # The pytdtest.sh bash script can be run to execute the unit tests on a remote host that has the following prerequisites: 30 | # 31 | # 1) SSH Daemon running 32 | # 2) BASH shell 33 | # 3) A pytd user that has the testidentity.pub file as an authorized_key for passwordless authentication. 34 | # 4) Python installed (e.g. multiple versions of Python can be installed for testing multiple versions.)
35 | # 36 | # To install by source on Linux, download source distribution and run: 37 | # 38 | # ./configure --prefix=/usr/local 39 | # make 40 | # make altinstall 41 | # 42 | # 4) Install pip if python version is before 3.4: curl https://raw.githubusercontent.com/pypa/pip/master/contrib/get-pip.py | python2.7 - 43 | # 5) Install virtualenv: python3.4 -m pip install virtualenv 44 | # 6) Install Teradata ODBC Driver 45 | ################################################################################################################################### 46 | 47 | function printUsage 48 | { 49 | echo "" 50 | echo "Usage: $0 [host] [pythonVersion]" 51 | echo "" 52 | echo "Example: $0 sdlc0000.labs.teradata.com python3.4" 53 | echo "" 54 | exit 1 55 | } 56 | 57 | # Asserts that the previous command succeeded. 58 | function assert { 59 | # Gets the return code from the previous command. 60 | local returnCode=$? 61 | 62 | # Checks if the return code is not zero, signaling a failure. 63 | if [ $returnCode -ne 0 ] ; 64 | then 65 | error "$1 failed (Error Code: $returnCode)." 66 | exit 1 67 | else 68 | echo "$1 successful." 69 | fi 70 | } 71 | 72 | # Writes error information to the console. 73 | # Param 1 - The error message. 74 | function error { 75 | echo "ERROR: $1" 76 | } 77 | 78 | if [ -z "$1" ]; then 79 | error "Missing host argument." 80 | printUsage 81 | exit 1 82 | fi 83 | 84 | if [ -z "$2" ]; then 85 | error "Missing python version argument." 86 | printUsage 87 | exit 1 88 | fi 89 | 90 | ENV_NAME=pytd_$2 91 | TEST_DIR="~/$ENV_NAME/pytd" 92 | SSH_COMMAND="ssh -i testidentity -oBatchMode=yes -oStrictHostKeyChecking=no pytd@$1" 93 | 94 | chmod 600 testidentity 95 | assert "Setup test identity" 96 | 97 | echo "Checking login to $1..." 98 | $SSH_COMMAND "echo \"Connected to $1\"" 99 | assert "Login check to $1" 100 | 101 | $SSH_COMMAND "rm -rf ~/$ENV_NAME" 102 | assert "Cleanup of previous virtual environment" 103 | 104 | $SSH_COMMAND "virtualenv -p $2 $ENV_NAME" 105 | assert "Creation of test virtual environment" 106 | 107 | $SSH_COMMAND "~/$ENV_NAME/bin/pip install teamcity-messages" 108 | assert "Install teamcity integration module" 109 | 110 | $SSH_COMMAND "mkdir $TEST_DIR" 111 | assert "Creation of test directory" 112 | 113 | scp -i testidentity -r * pytd@$1:$TEST_DIR/ 114 | assert "Copy of source files to test machine" 115 | 116 | $SSH_COMMAND "cd $TEST_DIR;../bin/$2 -m teamcity.unittestpy discover -s test" 117 | -------------------------------------------------------------------------------- /setup.cfg: -------------------------------------------------------------------------------- 1 | [metadata] 2 | description-file = README.md -------------------------------------------------------------------------------- /setup.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/python 2 | # The MIT License (MIT) 3 | # 4 | # Copyright (c) 2015 by Teradata 5 | # 6 | # Permission is hereby granted, free of charge, to any person obtaining a copy 7 | # of this software and associated documentation files (the "Software"), to deal 8 | # in the Software without restriction, including without limitation the rights 9 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 10 | # copies of the Software, and to permit persons to whom the Software is 11 | # furnished to do so, subject to the following conditions: 12 | # 13 | # The above copyright notice and this permission notice shall be included in all 14 | # copies or substantial portions of 
the Software. 15 | # 16 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 17 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 18 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 19 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 20 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 21 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 22 | # SOFTWARE. 23 | import sys 24 | from setuptools import setup 25 | 26 | # Make sure correct version of python is being used. 27 | if sys.version_info[0] < 2 or (sys.version_info[0] == 2 and sys.version_info[1] < 7): 28 | print("The teradata module does not support this version of Python, the version must be 2.7 or later.") 29 | sys.exit(1) 30 | 31 | with open('teradata/version.py') as f: 32 | exec(f.read()) 33 | 34 | setup(name='teradata', 35 | version=__version__, # @UndefinedVariable 36 | description='The Teradata python module for DevOps enabled SQL scripting for Teradata UDA.', 37 | url='http://github.com/teradata/PyTd', 38 | author='Teradata Corporation', 39 | author_email='eric.scheie@teradata.com', 40 | license='MIT', 41 | packages=['teradata'], 42 | zip_safe=True) 43 | -------------------------------------------------------------------------------- /teradata/__init__.py: -------------------------------------------------------------------------------- 1 | # The MIT License (MIT) 2 | # 3 | # Copyright (c) 2015 by Teradata 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights 8 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | # copies of the Software, and to permit persons to whom the Software is 10 | # furnished to do so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in 13 | # all copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 
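# Re-export the package's public API: UdaExec and UdaExecCheckpointManager,
# the SqlScript/BteqScript helpers, and the DB API 2.0 globals and exceptions
# from the api module.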
22 | from .udaexec import UdaExec, UdaExecCheckpointManager # noqa 23 | from .util import SqlScript, BteqScript # noqa 24 | from .api import * # noqa 25 | -------------------------------------------------------------------------------- /teradata/api.py: -------------------------------------------------------------------------------- 1 | """Defines global variables, helper classes, and exceptions classes 2 | for implementations of the Python Database API Specification v2.0""" 3 | 4 | # The MIT License (MIT) 5 | # 6 | # Copyright (c) 2015 by Teradata 7 | # 8 | # Permission is hereby granted, free of charge, to any person obtaining a copy 9 | # of this software and associated documentation files (the "Software"), to deal 10 | # in the Software without restriction, including without limitation the rights 11 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 12 | # copies of the Software, and to permit persons to whom the Software is 13 | # furnished to do so, subject to the following conditions: 14 | # 15 | # The above copyright notice and this permission notice shall be included in 16 | # all copies or substantial portions of the Software. 17 | # 18 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 19 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 20 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 21 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 22 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 23 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 24 | # SOFTWARE. 25 | from .version import * # noqa 26 | import datetime 27 | import decimal 28 | import logging 29 | 30 | # DB API 2.0 globals 31 | apilevel = "2.0" 32 | threadsafety = 1 33 | paramstyle = 'qmark' 34 | 35 | CONFIG_ERROR = "CONFIG_ERROR" 36 | 37 | 38 | class NullHandler(logging.Handler): 39 | 40 | def emit(self, record): 41 | pass 42 | 43 | logging.getLogger("teradata").addHandler(NullHandler()) 44 | 45 | 46 | class OutParam (): 47 | 48 | """Represents an output parameter from a Stored Procedure""" 49 | 50 | def __init__(self, name=None, dataType=None, size=None): 51 | self.name = name 52 | self.dataType = dataType 53 | self.size = size 54 | self.valueFunc = None 55 | 56 | def setValueFunc(self, valueFunc): 57 | self.valueFunc = valueFunc 58 | 59 | def value(self): 60 | return None if self.valueFunc is None else self.valueFunc() 61 | 62 | def __repr__(self): 63 | return "OutParam(name={}, size={})".format(self.name, self.size) 64 | 65 | 66 | class InOutParam (OutParam): 67 | 68 | """Represents an input and output parameter from a Stored Procedure""" 69 | 70 | def __init__(self, value, name=None, dataType=None, size=None): 71 | OutParam.__init__(self, name, dataType, size) 72 | self.inValue = value 73 | 74 | def __repr__(self): 75 | return "InOutParam(value={}, name={}, dataType={}, size={})".format( 76 | self.inValue, self.name, self.dataType, self.size) 77 | 78 | # Define exceptions 79 | 80 | 81 | class Warning(Exception): # @ReservedAssignment 82 | 83 | def __init__(self, msg): 84 | self.args = (msg,) 85 | self.msg = msg 86 | 87 | 88 | class Error(Exception): 89 | 90 | def __init__(self, msg): 91 | self.args = (msg,) 92 | self.msg = msg 93 | 94 | 95 | class InterfaceError(Error): 96 | 97 | """Represents an error in using Teradata's implementation of the Python 98 | Database API Specification v2.0""" 99 | 100 | def 
__init__(self, code, msg): 101 | self.args = (code, msg) 102 | self.code = code 103 | self.msg = msg 104 | 105 | 106 | class DatabaseError(Error): 107 | 108 | """Represents an error returned by the Database""" 109 | 110 | def __init__(self, code, msg, sqlState=None): 111 | self.args = (code, msg) 112 | self.code = code 113 | self.msg = msg 114 | self.sqlState = sqlState 115 | 116 | 117 | class InternalError(DatabaseError): 118 | 119 | def __init__(self, code, msg): 120 | self.value = (code, msg) 121 | self.args = (code, msg) 122 | 123 | 124 | class ProgrammingError(DatabaseError): 125 | 126 | def __init__(self, code, msg): 127 | self.value = (code, msg) 128 | self.args = (code, msg) 129 | 130 | 131 | class DataError(DatabaseError): 132 | 133 | def __init__(self, code, msg): 134 | self.value = (code, msg) 135 | self.args = (code, msg) 136 | 137 | 138 | class IntegrityError(DatabaseError): 139 | 140 | def __init__(self, code, msg): 141 | self.value = (code, msg) 142 | self.args = (code, msg) 143 | 144 | 145 | class NotSupportedError(Error): 146 | 147 | def __init__(self, code, msg): 148 | self.value = (code, msg) 149 | self.args = (code, msg) 150 | 151 | 152 | class OperationalError(DatabaseError): 153 | 154 | def __init__(self, code, msg): 155 | self.value = (code, msg) 156 | self.args = (code, msg) 157 | 158 | # Definitions for types 159 | BINARY = bytearray 160 | Binary = bytearray 161 | DATETIME = datetime.datetime 162 | Date = datetime.date 163 | Time = datetime.time 164 | Timestamp = datetime.datetime 165 | STRING = str 166 | NUMBER = decimal.Decimal 167 | ROWID = int 168 | DateFromTicks = datetime.date.fromtimestamp 169 | 170 | 171 | def TimeFromTicks(x): 172 | return datetime.datetime.fromtimestamp(x).time() 173 | 174 | TimestampFromTicks = datetime.datetime.fromtimestamp 175 | -------------------------------------------------------------------------------- /teradata/datatypes.py: -------------------------------------------------------------------------------- 1 | """Data types and converters.""" 2 | 3 | # The MIT License (MIT) 4 | # 5 | # Copyright (c) 2015 by Teradata 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a copy 8 | # of this software and associated documentation files (the "Software"), to deal 9 | # in the Software without restriction, including without limitation the rights 10 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 11 | # copies of the Software, and to permit persons to whom the Software is 12 | # furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included in 15 | # all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 18 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 20 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 22 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 23 | # SOFTWARE. 24 | 25 | import re 26 | import json 27 | from . 
import util 28 | from .api import * # @UnusedWildImport # noqa 29 | 30 | logger = logging.getLogger(__name__) 31 | 32 | SECS_IN_MILLISECS = MILLISECS_IN_MICROSECS = 1000 33 | dateRegExStr = r"(\d{4})-(\d{2})-(\d{2})" 34 | timeRegExStr = r"(\d{2}):(\d{2}):(\d{2})(\.(\d{1,6}))?(([-+])(\d{2}):(\d{2}))?" 35 | dateRegEx = re.compile("^{}$".format(dateRegExStr)) 36 | timeRegEx = re.compile("^{}$".format(timeRegExStr)) 37 | timestampRegEx = re.compile("^{} {}$".format(dateRegExStr, timeRegExStr)) 38 | scalarIntervalRegEx = re.compile("^(-?)(\d+)$") 39 | yearToMonthIntervalRegEx = re.compile("^(-?)(\d+)-(\d+)$") 40 | dayToHourIntervalRegEx = re.compile("^(-?)(\d+) (\d+)$") 41 | dayToMinuteIntervalRegEx = re.compile("^(-?)(\d+) (\d+):(\d+)$") 42 | dayToSecondIntervalRegEx = re.compile("^(-?)(\d+) (\d+):(\d+):(\d+\.?\d*)$") 43 | hourToMinuteIntervalRegEx = re.compile("^(-?)(\d+):(\d+)$") 44 | hourToSecondIntervalRegEx = re.compile("^(-?)(\d+):(\d+):(\d+\.?\d*)$") 45 | minuteToSecondIntervalRegEx = re.compile("^(-?)(\d+):(\d+\.?\d*)$") 46 | secondIntervalRegEx = re.compile("^(-?)(\d+\.?\d*)$") 47 | periodRegEx1 = re.compile("\('(.*)',\s*'(.*)'\)") 48 | periodRegEx2 = re.compile("ResultStruct:PERIOD\(.*\)\[(.*),\s*(.*)\]") 49 | 50 | NUMBER_TYPES = {"BYTEINT", "BIGINT", "DECIMAL", "DOUBLE", "DOUBLE PRECISION", 51 | "INTEGER", "NUMBER", "SMALLINT", "FLOAT", "INT", "NUMERIC", 52 | "REAL"} 53 | 54 | INT_TYPES = {"BYTEINT", "BIGINT", "INTEGER", "SMALLINT", "INT"} 55 | 56 | FLOAT_TYPES = {"FLOAT", "DOUBLE", "DOUBLE PRECISION", "REAL"} 57 | 58 | BINARY_TYPES = {"BLOB", "BYTE", "VARBYTE"} 59 | 60 | 61 | def _getMs(m, num): 62 | ms = m.group(num) 63 | if ms: 64 | ms = int(ms.ljust(6, "0")) 65 | else: 66 | ms = 0 67 | return ms 68 | 69 | 70 | def _getInt(m, num): 71 | return int(m.group(num)) 72 | 73 | 74 | def _getFloat(m, num): 75 | return float(m.group(num)) 76 | 77 | 78 | def convertDate(value): 79 | m = dateRegEx.match(value) 80 | if m: 81 | return datetime.date(_getInt(m, 1), _getInt(m, 2), _getInt(m, 3)) 82 | else: 83 | raise InterfaceError( 84 | "INVALID_DATE", "Date format invalid: {}".format(value)) 85 | 86 | 87 | def convertTime(value): 88 | m = timeRegEx.match(value) 89 | if m: 90 | tz = None 91 | if m.group(7): 92 | tz = TimeZone(m.group(7), _getInt(m, 8), _getInt(m, 9)) 93 | return datetime.time(_getInt(m, 1), _getInt(m, 2), _getInt(m, 3), 94 | _getMs(m, 5), tz) 95 | else: 96 | raise InterfaceError( 97 | "INVALID_TIME", "Time format invalid: {}".format(value)) 98 | 99 | 100 | def convertTimestamp(value): 101 | m = timestampRegEx.match(value) 102 | if m: 103 | tz = None 104 | if m.group(10): 105 | tz = TimeZone(m.group(10), _getInt(m, 11), _getInt(m, 12)) 106 | return datetime.datetime(_getInt(m, 1), _getInt(m, 2), _getInt(m, 3), 107 | _getInt(m, 4), _getInt(m, 5), _getInt(m, 6), 108 | _getMs(m, 8), tz) 109 | else: 110 | raise InterfaceError( 111 | "INVALID_TIMESTAMP", "Timestamp format invalid: {}".format(value)) 112 | 113 | 114 | def _convertScalarInterval(dataType, value, *args): 115 | return _convertInterval(dataType, value, scalarIntervalRegEx, *args) 116 | 117 | 118 | def _convertInterval(dataType, value, regEx, *args): 119 | m = regEx.match(value) 120 | if m: 121 | kwargs = {} 122 | index = 2 123 | for field in args: 124 | if field != "seconds": 125 | kwargs[field] = _getInt(m, index) 126 | else: 127 | kwargs[field] = _getFloat(m, index) 128 | index += 1 129 | return Interval(negative=True if m.group(1) else False, **kwargs) 130 | else: 131 | raise InterfaceError("INVALID_INTERVAL", 132 | 
"{} format invalid: {}".format(dataType, value)) 133 | 134 | 135 | def convertInterval(dataType, value): 136 | value = value.strip() 137 | if dataType == "INTERVAL YEAR": 138 | return _convertScalarInterval(dataType, value, "years") 139 | elif dataType == "INTERVAL YEAR TO MONTH": 140 | return _convertInterval(dataType, value, yearToMonthIntervalRegEx, 141 | "years", "months") 142 | elif dataType == "INTERVAL MONTH": 143 | return _convertScalarInterval(dataType, value, "months") 144 | elif dataType == "INTERVAL DAY": 145 | return _convertScalarInterval(dataType, value, "days") 146 | elif dataType == "INTERVAL DAY TO HOUR": 147 | return _convertInterval(dataType, value, dayToHourIntervalRegEx, 148 | "days", "hours") 149 | elif dataType == "INTERVAL DAY TO MINUTE": 150 | return _convertInterval(dataType, value, dayToMinuteIntervalRegEx, 151 | "days", "hours", "minutes") 152 | elif dataType == "INTERVAL DAY TO SECOND": 153 | return _convertInterval(dataType, value, dayToSecondIntervalRegEx, 154 | "days", "hours", "minutes", "seconds") 155 | elif dataType == "INTERVAL HOUR": 156 | return _convertScalarInterval(dataType, value, "hours") 157 | elif dataType == "INTERVAL HOUR TO MINUTE": 158 | return _convertInterval(dataType, value, hourToMinuteIntervalRegEx, 159 | "hours", "minutes") 160 | elif dataType == "INTERVAL HOUR TO SECOND": 161 | return _convertInterval(dataType, value, hourToSecondIntervalRegEx, 162 | "hours", "minutes", "seconds") 163 | elif dataType == "INTERVAL MINUTE": 164 | return _convertScalarInterval(dataType, value, "minutes") 165 | elif dataType == "INTERVAL MINUTE TO SECOND": 166 | return _convertInterval(dataType, value, minuteToSecondIntervalRegEx, 167 | "minutes", "seconds") 168 | elif dataType == "INTERVAL SECOND": 169 | return _convertInterval(dataType, value, secondIntervalRegEx, 170 | "seconds") 171 | return value 172 | 173 | 174 | def convertPeriod(dataType, value): 175 | m = periodRegEx1.match(value) 176 | if not m: 177 | m = periodRegEx2.match(value) 178 | if m: 179 | if "TIMESTAMP" in dataType: 180 | start = convertTimestamp(m.group(1)) 181 | end = convertTimestamp(m.group(2)) 182 | elif "TIME" in dataType: 183 | start = convertTime(m.group(1)) 184 | end = convertTime(m.group(2)) 185 | elif "DATE" in dataType: 186 | start = convertDate(m.group(1)) 187 | end = convertDate(m.group(2)) 188 | else: 189 | raise InterfaceError("INVALID_PERIOD", 190 | "Unknown PERIOD data type: {}".format( 191 | dataType, value)) 192 | else: 193 | raise InterfaceError( 194 | "INVALID_PERIOD", "{} format invalid: {}".format(dataType, value)) 195 | return Period(start, end) 196 | 197 | 198 | def zeroIfNone(value): 199 | if value is None: 200 | value = 0 201 | return value 202 | 203 | 204 | class DataTypeConverter: 205 | 206 | """Handles conversion of result set data types into python objects.""" 207 | 208 | def convertValue(self, dbType, dataType, typeCode, value): 209 | """Converts the value returned by the database into the desired 210 | python object.""" 211 | raise NotImplementedError( 212 | "convertValue must be implemented by sub-class") 213 | 214 | def convertType(self, dbType, dataType): 215 | """Converts the data type to a python type code.""" 216 | raise NotImplementedError( 217 | "convertType must be implemented by sub-class") 218 | 219 | 220 | class DefaultDataTypeConverter (DataTypeConverter): 221 | 222 | """Handles conversion of result set data types into python objects.""" 223 | 224 | def __init__(self, useFloat=False): 225 | self.useFloat = useFloat 226 | 227 | def 
convertValue(self, dbType, dataType, typeCode, value): 228 | """Converts the value returned by the database into the desired 229 | python object.""" 230 | logger.trace( 231 | "Converting \"%s\" to (%s, %s).", value, dataType, typeCode) 232 | if value is not None: 233 | if typeCode == NUMBER: 234 | try: 235 | return NUMBER(value) 236 | except: 237 | # Handle infinity and NaN for older ODBC drivers. 238 | if value == "1.#INF": 239 | return NUMBER('Infinity') 240 | elif value == "-1.#INF": 241 | return NUMBER('-Infinity') 242 | else: 243 | return NUMBER('NaN') 244 | elif typeCode == float: 245 | return value if not util.isString else float(value) 246 | elif typeCode == Timestamp: 247 | if util.isString(value): 248 | return convertTimestamp(value) 249 | else: 250 | return datetime.datetime.fromtimestamp( 251 | value // SECS_IN_MILLISECS).replace( 252 | microsecond=value % SECS_IN_MILLISECS * 253 | MILLISECS_IN_MICROSECS) 254 | elif typeCode == Time: 255 | if util.isString(value): 256 | return convertTime(value) 257 | else: 258 | return datetime.datetime.fromtimestamp( 259 | value // SECS_IN_MILLISECS).replace( 260 | microsecond=value % SECS_IN_MILLISECS * 261 | MILLISECS_IN_MICROSECS).time() 262 | elif typeCode == Date: 263 | if util.isString(value): 264 | return convertDate(value) 265 | else: 266 | return datetime.datetime.fromtimestamp( 267 | value // SECS_IN_MILLISECS).replace( 268 | microsecond=value % SECS_IN_MILLISECS * 269 | MILLISECS_IN_MICROSECS).date() 270 | elif typeCode == BINARY: 271 | if util.isString(value): 272 | return bytearray.fromhex(value) 273 | elif dataType.startswith("INTERVAL"): 274 | return convertInterval(dataType, value) 275 | elif dataType.startswith("JSON") and util.isString(value): 276 | return json.loads(value, parse_int=decimal.Decimal, 277 | parse_float=decimal.Decimal) 278 | elif dataType.startswith("PERIOD"): 279 | return convertPeriod(dataType, value) 280 | return value 281 | 282 | def convertType(self, dbType, dataType): 283 | """Converts the data type to a python type code.""" 284 | typeCode = STRING 285 | if dataType in NUMBER_TYPES: 286 | typeCode = NUMBER 287 | if self.useFloat and dataType in FLOAT_TYPES: 288 | typeCode = float 289 | elif dataType in BINARY_TYPES: 290 | typeCode = BINARY 291 | elif dataType.startswith("DATE"): 292 | typeCode = Date 293 | elif dataType.startswith("TIMESTAMP"): 294 | typeCode = Timestamp 295 | elif dataType.startswith("TIME"): 296 | typeCode = Time 297 | return typeCode 298 | 299 | 300 | class TimeZone (datetime.tzinfo): 301 | 302 | """Represents a Fixed Time Zone offset from UTC.""" 303 | 304 | def __init__(self, sign, hours, minutes): 305 | self.offset = datetime.timedelta(hours=hours, minutes=minutes) 306 | if sign == "-": 307 | self.offset = -self.offset 308 | 309 | def utcoffset(self, dt): 310 | return self.offset 311 | 312 | def tzname(self, dt): 313 | return "TeradataTimestamp" 314 | 315 | def dst(self, dt): 316 | return 0 317 | 318 | 319 | def _appendInterval(arr, value, padding=2, separator=" "): 320 | if value is not None: 321 | if arr and separator: 322 | arr.append(separator) 323 | s = (("%0" + str(padding + 7) + ".6f") % value).rstrip("0").rstrip(".") 324 | arr.append(s) 325 | 326 | 327 | class Interval: 328 | 329 | """Represents a SQL date/time interval.""" 330 | 331 | def __init__(self, negative=False, years=None, months=None, days=None, 332 | hours=None, minutes=None, seconds=None): 333 | self.negative = negative 334 | self.years = years 335 | self.months = months 336 | self.days = days 337 | self.hours 
= hours 338 | self.minutes = minutes 339 | self.seconds = seconds 340 | self.type = None 341 | if years is not None: 342 | if months is not None: 343 | self.type = "YEAR TO MONTH" 344 | else: 345 | self.type = "YEAR" 346 | if days or hours or minutes or seconds: 347 | raise InterfaceError( 348 | "INVALID INTERVAL", 349 | "A year/month interval cannot be " 350 | "shared with a day/hour/minute/second interval.") 351 | elif months is not None: 352 | self.type = "MONTH" 353 | if days or hours or minutes or seconds: 354 | raise InterfaceError( 355 | "INVALID INTERVAL", 356 | "A year/month interval cannot be shared " 357 | "with a day/hour/minute/second interval.") 358 | elif days is not None: 359 | if seconds is not None: 360 | self.type = "DAY TO SECOND" 361 | self.hours = zeroIfNone(hours) 362 | self.minutes = zeroIfNone(minutes) 363 | elif minutes is not None: 364 | self.type = "DAY TO MINUTE" 365 | self.hours = zeroIfNone(hours) 366 | elif hours is not None: 367 | self.type = "DAY TO HOUR" 368 | else: 369 | self.type = "DAY" 370 | elif hours is not None: 371 | if seconds is not None: 372 | self.type = "HOUR TO SECOND" 373 | self.minutes = zeroIfNone(minutes) 374 | elif minutes is not None: 375 | self.type = "HOUR TO MINUTE" 376 | elif hours is not None: 377 | self.type = "DAY TO HOUR" 378 | else: 379 | self.type = "HOUR" 380 | elif minutes is not None: 381 | if seconds is not None: 382 | self.type = "MINUTE TO SECOND" 383 | else: 384 | self.type = "MINUTE" 385 | elif seconds is not None: 386 | self.type = "SECOND" 387 | else: 388 | raise InterfaceError( 389 | "INVALID INTERVAL", 390 | "One of years, months, days, hours, minutes, " 391 | "seconds must not be None.") 392 | 393 | def timedelta(self): 394 | if self.years or self.months: 395 | raise InterfaceError( 396 | "UNSUPPORTED_INTERVAL", 397 | "timedelta() is not supported for Year " 398 | "and Month interval types. %s" % repr(self)) 399 | delta = datetime.timedelta(days=zeroIfNone(self.days), 400 | hours=zeroIfNone(self.hours), 401 | minutes=zeroIfNone(self.minutes), 402 | seconds=zeroIfNone(self.seconds)) 403 | if self.negative: 404 | delta = -delta 405 | return delta 406 | 407 | def __str__(self): 408 | s = [] 409 | _appendInterval(s, self.years, padding=1) 410 | _appendInterval(s, self.months, separator="-") 411 | _appendInterval(s, self.days, padding=1) 412 | _appendInterval(s, self.hours) 413 | _appendInterval(s, self.minutes, separator=":") 414 | _appendInterval(s, self.seconds, separator=":") 415 | if self.negative: 416 | s.insert(0, "-") 417 | return "".join(s) 418 | 419 | def __repr__(self): 420 | return str(self.__dict__) 421 | 422 | def __eq__(self, other): 423 | try: 424 | return self.__dict__ == other.__dict__ 425 | except AttributeError: 426 | return False 427 | 428 | def __ne__(self, other): 429 | return not self == other 430 | 431 | 432 | class Period: 433 | 434 | """ Represents a PERIOD data type. 
""" 435 | 436 | def __init__(self, start, end): 437 | self.start = start 438 | self.end = end 439 | 440 | def __str__(self): 441 | return "('" + str(self.start) + "', '" + str(self.end) + "')" 442 | 443 | def __eq__(self, other): 444 | try: 445 | return self.__dict__ == other.__dict__ 446 | except AttributeError: 447 | return False 448 | 449 | def __ne__(self, other): 450 | return not self == other 451 | -------------------------------------------------------------------------------- /teradata/pulljson.py: -------------------------------------------------------------------------------- 1 | """A pull parser for parsing JSON streams""" 2 | 3 | # The MIT License (MIT) 4 | # 5 | # Copyright (c) 2015 by Teradata 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a copy 8 | # of this software and associated documentation files (the "Software"), to deal 9 | # in the Software without restriction, including without limitation the rights 10 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 11 | # copies of the Software, and to permit persons to whom the Software is 12 | # furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included in 15 | # all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 18 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 20 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 22 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 23 | # SOFTWARE. 24 | 25 | import sys 26 | import decimal 27 | import re 28 | import json 29 | import logging 30 | from . import util # @UnusedImport # noqa 31 | if sys.version_info[0] == 2: 32 | from StringIO import StringIO # @UnresolvedImport #@UnusedImport 33 | else: 34 | from io import StringIO # @UnresolvedImport @UnusedImport @Reimport # noqa 35 | 36 | logger = logging.getLogger(__name__) 37 | 38 | # JSONNode and value types. 39 | OBJECT = "OBJECT" 40 | ARRAY = "ARRAY" 41 | FIELD = "FIELD" 42 | STRING = "STRING" 43 | NUMBER = "NUMBER" 44 | BOOLEAN = "BOOLEAN" 45 | NULL = "null" 46 | TRUE = "true" 47 | FALSE = "false" 48 | 49 | # JSONEvent types. 
50 | START_OBJECT = "START_OBJECT" 51 | START_ARRAY = "START_ARRAY" 52 | FIELD_NAME = "FIELD_NAME" 53 | FIELD_VALUE = "FIELD_VALUE" 54 | ARRAY_VALUE = "ARRAY_VALUE" 55 | END_OBJECT = "END_OBJECT" 56 | END_ARRAY = "END_ARRAY" 57 | 58 | # JSONParseError codes 59 | JSON_SYNTAX_ERROR = "JSON_SYNTAX_ERROR" 60 | JSON_INCOMPLETE_ERROR = "JSON_INCOMPLETE_ERROR" 61 | JSON_UNEXPECTED_ELEMENT_ERROR = "JSON_UNEXPECTED_ELEMENT_ERROR" 62 | 63 | 64 | class JSONPullParser (object): 65 | 66 | def __init__(self, stream, size=2 ** 16): 67 | """Initialize pull parser with a JSON stream.""" 68 | self.stream = stream 69 | self.size = size 70 | self.node = None 71 | self.value = "" 72 | self.valueType = None 73 | self.tokens = [] 74 | self.tokenIndex = 0 75 | self.halfToken = "" 76 | self.pattern = re.compile('([\[\]{}:\\\\",])') 77 | 78 | def expectObject(self): 79 | """Raise JSONParseError if next event is not the start of an object.""" 80 | event = self.nextEvent() 81 | if event.type != START_OBJECT: 82 | raise JSONParseError( 83 | JSON_UNEXPECTED_ELEMENT_ERROR, 84 | "Expected START_OBJECT but got: " + str(event)) 85 | 86 | def expectArray(self): 87 | """Raise JSONParseError if next event is not the start of an array.""" 88 | event = self.nextEvent() 89 | if event.type != START_ARRAY: 90 | raise JSONParseError( 91 | JSON_UNEXPECTED_ELEMENT_ERROR, 92 | "Expected START_ARRAY but got: " + str(event)) 93 | return JSONArrayIterator(self) 94 | 95 | def expectField(self, expectedName, expectedType=None, allowNull=False, 96 | readAll=False): 97 | """Raise JSONParseError if next event is not the expected field with 98 | expected type else return the field value. If the next field is 99 | an OBJECT or ARRAY, only return whole object or array if 100 | readAll=True.""" 101 | event = self.nextEvent() 102 | if event.type != FIELD_NAME: 103 | raise JSONParseError( 104 | JSON_UNEXPECTED_ELEMENT_ERROR, 105 | "Expected FIELD_NAME but got: " + str(event)) 106 | if event.value != expectedName: 107 | raise JSONParseError(JSON_UNEXPECTED_ELEMENT_ERROR, "Expected " + 108 | expectedName + " field but got " + 109 | event.value + " instead.") 110 | return self._expectValue(FIELD_VALUE, expectedType, allowNull, readAll) 111 | 112 | def expectArrayValue(self, expectedType=None, allowNull=False, 113 | readAll=False): 114 | """Raise JSONParseError if next event is not an array element with 115 | the expected type else return the field value. 
If the next value 116 | is an OBJECT or ARRAY, only return whole object or array if 117 | readAll=True.""" 118 | return self._expectValue(ARRAY_VALUE, expectedType, allowNull, readAll) 119 | 120 | def _expectValue(self, eventType, expectedType, allowNull, readAll): 121 | event = self.nextEvent() 122 | if event.type == eventType: 123 | if allowNull and event.valueType == NULL: 124 | return None 125 | elif expectedType is not None and event.valueType != expectedType: 126 | raise JSONParseError( 127 | JSON_UNEXPECTED_ELEMENT_ERROR, "Expected " + expectedType + 128 | " but got " + event.valueType + " instead.") 129 | else: 130 | return event.value 131 | else: 132 | if eventType == ARRAY_VALUE: 133 | if event.node.parent is None or event.node.parent != ARRAY: 134 | raise JSONParseError( 135 | JSON_UNEXPECTED_ELEMENT_ERROR, 136 | "Expected array element but not in an array.") 137 | if event.type == START_OBJECT: 138 | if expectedType is not None and expectedType != OBJECT: 139 | raise JSONParseError( 140 | JSON_UNEXPECTED_ELEMENT_ERROR, "Expected " + 141 | expectedType + " but got an object instead.") 142 | elif expectedType is None or readAll: 143 | return self.readObject(event) 144 | elif event.type == START_ARRAY: 145 | if expectedType is not None and expectedType != ARRAY: 146 | raise JSONParseError( 147 | JSON_UNEXPECTED_ELEMENT_ERROR, "Expected " + 148 | expectedType + " but got array instead.") 149 | if expectedType is None or readAll: 150 | return self.readArray(event) 151 | else: 152 | return JSONArrayIterator(self) 153 | else: 154 | raise JSONParseError( 155 | JSON_UNEXPECTED_ELEMENT_ERROR, 156 | "Unexpected event: " + str(event)) 157 | 158 | def readObject(self, event=None): 159 | """Read and return a JSON object.""" 160 | if event is None: 161 | event = self.nextEvent() 162 | popRequired = False 163 | else: 164 | popRequired = True 165 | if event is None: 166 | return None 167 | if event.type != START_OBJECT: 168 | raise JSONParseError( 169 | JSON_UNEXPECTED_ELEMENT_ERROR, 170 | "Expected START_OBJECT but got " + event.type + " instead.") 171 | obj = self._load(event) 172 | if popRequired: 173 | self._pop() 174 | return obj 175 | 176 | def readArray(self, event=None): 177 | """Read and return a JSON array.""" 178 | if event is None: 179 | event = self.nextEvent() 180 | popRequired = False 181 | else: 182 | popRequired = True 183 | if event is None: 184 | return None 185 | if event.type != START_ARRAY: 186 | raise JSONParseError( 187 | JSON_UNEXPECTED_ELEMENT_ERROR, 188 | "Expected START_ARRAY but got " + event.type + " instead.") 189 | arr = self._load(event) 190 | if popRequired: 191 | self._pop() 192 | return arr 193 | 194 | def nextEvent(self): 195 | """Iterator method, return next JSON event from the stream, raises 196 | StopIteration() when complete.""" 197 | try: 198 | return self.__next__() 199 | except StopIteration: 200 | return None 201 | 202 | def next(self): 203 | """Iterator method, return next JSON event from the stream, raises 204 | StopIteration() when complete.""" 205 | return self.__next__() 206 | 207 | def __next__(self): 208 | """Iterator method, return next JSON event from the stream, raises 209 | StopIteration() when complete.""" 210 | while True: 211 | try: 212 | token = self.tokens[self.tokenIndex] 213 | self.tokenIndex += 1 214 | if token == "" or token.isspace(): 215 | pass 216 | elif token == '{': 217 | return self._push(OBJECT) 218 | elif token == '}': 219 | if self.node.type == FIELD: 220 | self.tokenIndex -= 1 221 | event = self._pop() 222 | if event is 
not None: 223 | return event 224 | elif self.node.type == OBJECT: 225 | return self._pop() 226 | else: 227 | raise JSONParseError( 228 | JSON_SYNTAX_ERROR, 229 | "A closing curly brace ('}') is only expected " 230 | "at the end of an object.") 231 | elif token == '[': 232 | if self.node is not None and self.node.type == OBJECT: 233 | raise JSONParseError( 234 | JSON_SYNTAX_ERROR, "An array in an object must " 235 | "be preceded by a field name.") 236 | return self._push(ARRAY) 237 | elif token == ']': 238 | if self.valueType is not None: 239 | self.tokenIndex -= 1 240 | event = self._arrayValue() 241 | if event is not None: 242 | return event 243 | elif self.node.type == ARRAY: 244 | if self.node.lastIndex == self.node.arrayLength: 245 | self.node.arrayLength += 1 246 | return self._pop() 247 | else: 248 | raise JSONParseError( 249 | JSON_SYNTAX_ERROR, "A closing bracket (']') " 250 | "is only expected at the end of an array.") 251 | elif token == ':': 252 | if self.node.type == OBJECT: 253 | if self.value != "" and self.valueType == STRING: 254 | event = self._push(FIELD, self.value) 255 | self.value = "" 256 | self.valueType = None 257 | return event 258 | else: 259 | raise JSONParseError( 260 | JSON_SYNTAX_ERROR, 261 | "Name for name/value pairs cannot be empty.") 262 | else: 263 | raise JSONParseError( 264 | JSON_SYNTAX_ERROR, 265 | "A colon (':') can only following a field " 266 | "name within an object.") 267 | elif token == ',': 268 | if self.node.type == ARRAY: 269 | event = self._arrayValue() 270 | self.node.arrayLength += 1 271 | elif self.node.type == FIELD: 272 | event = self._pop() 273 | else: 274 | raise JSONParseError( 275 | JSON_SYNTAX_ERROR, 276 | "A comma (',') is only expected between fields " 277 | "in objects or elements of an array.") 278 | if event is not None: 279 | return event 280 | else: 281 | if self.valueType is not None: 282 | raise JSONParseError( 283 | JSON_SYNTAX_ERROR, "Extra name or value found " 284 | "following: " + str(self.value)) 285 | elif self.node is None: 286 | raise JSONParseError( 287 | JSON_SYNTAX_ERROR, 288 | "Input must start with either an " 289 | "OBJECT ('{') or ARRAY ('['), got '" + token + 290 | "' instead.") 291 | elif token == '"': 292 | escape = False 293 | while True: 294 | try: 295 | token = self.tokens[self.tokenIndex] 296 | self.tokenIndex += 1 297 | if token == "": 298 | pass 299 | elif escape: 300 | escape = False 301 | self.value += token 302 | elif token == '"': 303 | break 304 | elif token == '\\': 305 | escape = True 306 | else: 307 | self.value += token 308 | except IndexError: 309 | data = self.stream.read(self.size) 310 | if data == "": 311 | raise JSONParseError( 312 | JSON_INCOMPLETE_ERROR, 313 | "Reached end of input before " + 314 | "reaching end of string.") 315 | self.tokens = self.pattern.split(data) 316 | self.tokenIndex = 0 317 | self.valueType = STRING 318 | else: 319 | token = token.strip() 320 | if self.tokenIndex == len(self.tokens): 321 | self.halfToken = token 322 | raise IndexError 323 | elif token[0].isdigit() or token[0] == '-': 324 | self.value = decimal.Decimal(token) 325 | self.valueType = NUMBER 326 | elif token == "null": 327 | self.value = None 328 | self.valueType = NULL 329 | elif token == "true": 330 | self.value = True 331 | self.valueType = BOOLEAN 332 | elif token == "false": 333 | self.value = False 334 | self.valueType = BOOLEAN 335 | else: 336 | raise JSONParseError( 337 | JSON_SYNTAX_ERROR, 338 | "Unexpected token: " + token) 339 | except IndexError: 340 | data = 
self.stream.read(self.size) 341 | if data == "": 342 | if self.node is not None: 343 | raise JSONParseError( 344 | JSON_INCOMPLETE_ERROR, "Reached end of input " 345 | "before reaching end of JSON structures.") 346 | else: 347 | raise StopIteration() 348 | return None 349 | logger.trace(data) 350 | self.tokens = self.pattern.split(data) 351 | self.tokenIndex = 0 352 | if self.halfToken is not None: 353 | self.tokens[0] = self.halfToken + self.tokens[0] 354 | self.halfToken = None 355 | 356 | def _load(self, event): 357 | if event.type == START_OBJECT: 358 | value = start = "{" 359 | end = "}" 360 | elif event.type == START_ARRAY: 361 | value = start = "[" 362 | end = "]" 363 | else: 364 | raise JSONParseError( 365 | JSON_UNEXPECTED_ELEMENT_ERROR, 366 | "Unexpected event: " + event.type) 367 | count = 1 368 | tokens = self.tokens 369 | tokenIndex = self.tokenIndex 370 | inString = False 371 | inEscape = False 372 | try: 373 | while True: 374 | startIndex = tokenIndex 375 | for token in tokens[startIndex:]: 376 | tokenIndex += 1 377 | if token == "": 378 | pass 379 | elif inString: 380 | if inEscape: 381 | inEscape = False 382 | elif token == '"': 383 | inString = False 384 | elif token == '\\': 385 | inEscape = True 386 | elif token == '"': 387 | inString = True 388 | elif token == start: 389 | count += 1 390 | elif token == end: 391 | count -= 1 392 | if count == 0: 393 | value += "".join(tokens[startIndex:tokenIndex]) 394 | raise StopIteration() 395 | value += "".join(tokens[startIndex:]) 396 | data = self.stream.read(self.size) 397 | if data == "": 398 | raise JSONParseError( 399 | JSON_INCOMPLETE_ERROR, "Reached end of input before " 400 | "reaching end of JSON structures.") 401 | tokens = self.pattern.split(data) 402 | tokenIndex = 0 403 | except StopIteration: 404 | pass 405 | self.tokens = tokens 406 | self.tokenIndex = tokenIndex 407 | try: 408 | return json.loads(value, parse_float=decimal.Decimal, 409 | parse_int=decimal.Decimal) 410 | except ValueError as e: 411 | raise JSONParseError(JSON_SYNTAX_ERROR, "".join(e.args)) 412 | 413 | def _push(self, nodeType, value=None): 414 | if self.node is not None and self.node.type == FIELD: 415 | self.node.valueType = nodeType 416 | self.node = JSONNode(self.node, nodeType, value) 417 | if self.node.parent is not None and self.node.parent.type == ARRAY: 418 | self.node.arrayIndex = self.node.parent.arrayLength 419 | if self.node.parent.lastIndex == self.node.parent.arrayLength: 420 | raise JSONParseError( 421 | JSON_SYNTAX_ERROR, 422 | "Missing comma separating array elements.") 423 | self.node.parent.lastIndex = self.node.parent.arrayLength 424 | return self.node.startEvent() 425 | 426 | def _pop(self): 427 | # Pop the current node from the stack. 428 | node = self.node 429 | self.node = self.node.parent 430 | # Set the value and value type on the node. 431 | if node.valueType is None: 432 | node.valueType = self.valueType 433 | node.value = self.value 434 | # Reset value and valueType 435 | self.value = "" 436 | self.valueType = None 437 | if node.type == FIELD and node.valueType is None: 438 | raise JSONParseError( 439 | JSON_SYNTAX_ERROR, "Expected value for field: " + node.name) 440 | # Return the end event for the node. 
441 | return node.endEvent() 442 | 443 | def _arrayValue(self): 444 | endOfArray = self.node.lastIndex == self.node.arrayLength 445 | if self.valueType is None and endOfArray: 446 | pass 447 | elif self.valueType is None: 448 | raise JSONParseError( 449 | JSON_SYNTAX_ERROR, 450 | "Expected value for array element at index: " + 451 | str(self.node.arrayLength)) 452 | else: 453 | event = JSONEvent( 454 | self.node, ARRAY_VALUE, self.value, self.valueType, 455 | self.node.arrayLength) 456 | self.node.lastIndex = self.node.arrayLength 457 | # Reset value and valueType 458 | self.value = "" 459 | self.valueType = None 460 | # Return the end event for the node. 461 | return event 462 | 463 | def __iter__(self): 464 | return self 465 | 466 | # Define exceptions 467 | 468 | 469 | class JSONParseError(Exception): 470 | 471 | def __init__(self, code, msg): 472 | self.args = (code, msg) 473 | self.code = code 474 | self.msg = msg 475 | 476 | 477 | class JSONNode (object): 478 | 479 | def __init__(self, parent, nodeType, name=None, value=None, 480 | valueType=None): 481 | self.parent = parent 482 | self.type = nodeType 483 | self.name = name 484 | self.value = value 485 | self.valueType = valueType 486 | self.arrayIndex = None 487 | self.arrayLength = None 488 | self.lastIndex = -1 489 | if nodeType == ARRAY: 490 | self.arrayLength = 0 491 | 492 | def startEvent(self): 493 | if self.type == ARRAY: 494 | return JSONEvent(self, START_ARRAY, arrayIndex=self.arrayIndex) 495 | elif self.type == OBJECT: 496 | return JSONEvent(self, START_OBJECT, arrayIndex=self.arrayIndex) 497 | elif self.type == FIELD: 498 | return JSONEvent(self, FIELD_NAME, self.name) 499 | 500 | def endEvent(self): 501 | if self.type == ARRAY: 502 | return JSONEvent(self, END_ARRAY, arrayIndex=self.arrayIndex, 503 | arrayLength=self.arrayLength) 504 | elif self.type == OBJECT: 505 | return JSONEvent(self, END_OBJECT, arrayIndex=self.arrayIndex) 506 | elif self.type == FIELD and self.valueType not in (OBJECT, ARRAY): 507 | return JSONEvent(self, FIELD_VALUE, self.value, self.valueType) 508 | 509 | 510 | class JSONEvent (object): 511 | 512 | def __init__(self, node, eventType, value=None, valueType=None, 513 | arrayIndex=None, arrayLength=None): 514 | self.node = node 515 | self.type = eventType 516 | self.value = value 517 | self.valueType = valueType 518 | self.arrayIndex = arrayIndex 519 | self.arrayLength = arrayLength 520 | 521 | def __repr__(self): 522 | text = "JSONEvent (type=" + self.type 523 | if self.value is not None: 524 | text += ", value=" + str(self.value) 525 | if self.valueType is not None: 526 | text += ", valueType=" + str(self.valueType) 527 | if self.arrayIndex is not None: 528 | text += ", arrayIndex=" + str(self.arrayIndex) 529 | if self.arrayLength is not None: 530 | text += ", arrayLength=" + str(self.arrayLength) 531 | text += ")" 532 | return text 533 | 534 | 535 | class JSONArrayIterator (object): 536 | 537 | def __init__(self, parser): 538 | self.parser = parser 539 | self.complete = False 540 | 541 | def __iter__(self): 542 | return self 543 | 544 | def __next__(self): 545 | if self.complete: 546 | raise StopIteration() 547 | else: 548 | event = self.parser.nextEvent() 549 | if event.type == START_OBJECT: 550 | return self.parser.readObject(event) 551 | elif event.type == START_ARRAY: 552 | return self.parser.readArray(event) 553 | elif event.type == ARRAY_VALUE: 554 | return event.value 555 | elif event.type == END_ARRAY: 556 | self.complete = True 557 | raise StopIteration() 558 | else: 559 | raise 
JSONParseError( 560 | JSON_UNEXPECTED_ELEMENT_ERROR, 561 | "Unexpected event: " + str(event)) 562 | 563 | def next(self): 564 | return self.__next__() 565 | -------------------------------------------------------------------------------- /teradata/tdrest.py: -------------------------------------------------------------------------------- 1 | """An implementation of the Python Database API Specification v2.0 using 2 | Teradata REST.""" 3 | 4 | # The MIT License (MIT) 5 | # 6 | # Copyright (c) 2015 by Teradata 7 | # 8 | # Permission is hereby granted, free of charge, to any person obtaining a copy 9 | # of this software and associated documentation files (the "Software"), to deal 10 | # in the Software without restriction, including without limitation the rights 11 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 12 | # copies of the Software, and to permit persons to whom the Software is 13 | # furnished to do so, subject to the following conditions: 14 | # 15 | # The above copyright notice and this permission notice shall be included in 16 | # all copies or substantial portions of the Software. 17 | # 18 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 19 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 20 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 21 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 22 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 23 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 24 | # SOFTWARE. 25 | 26 | import atexit 27 | import base64 28 | import json 29 | import ssl 30 | import sys 31 | import time 32 | import io 33 | 34 | from . import pulljson, util, datatypes 35 | from .api import * # @UnusedWildImport # noqa 36 | 37 | 38 | if sys.version_info[0] == 2: 39 | import httplib as httplib # @UnresolvedImport #@UnusedImport 40 | else: 41 | import http.client as httplib # @UnresolvedImport @UnusedImport @Reimport 42 | unicode = str 43 | 44 | logger = logging.getLogger(__name__) 45 | 46 | REST_ERROR = "REST_ERROR" 47 | HTTP_STATUS_DATABASE_ERROR = 420 48 | ERROR_USER_GENERATED_TRANSACTION_ABORT = 3514 49 | MAX_CONNECT_RETRIES = 5 50 | 51 | connections = [] 52 | 53 | 54 | def cleanup(): 55 | for conn in connections: 56 | conn.close() 57 | atexit.register(cleanup) 58 | 59 | 60 | class RestConnection: 61 | 62 | """ Represents a Connection to Teradata using the REST API for 63 | Teradata Database """ 64 | 65 | def __init__(self, dbType="Teradata", host=None, system=None, 66 | username=None, password=None, protocol='http', port=None, 67 | webContext='/tdrest', autoCommit=False, implicit=False, 68 | transactionMode='TERA', queryBands=None, charset=None, 69 | verifyCerts=True, sslContext=None, database=None, 70 | authentication=None, 71 | dataTypeConverter=datatypes.DefaultDataTypeConverter()): 72 | self.dbType = dbType 73 | self.system = system 74 | self.sessionId = None 75 | self.implicit = implicit 76 | self.transactionMode = transactionMode 77 | self.dataTypeConverter = dataTypeConverter 78 | self.cursors = [] 79 | # Support TERA and Teradata as transaction mode to be consistent with 80 | # ODBC. 
81 | if transactionMode == "Teradata": 82 | self.transactionMode = "TERA" 83 | self.autoCommit = False 84 | if port is None: 85 | if protocol == 'http': 86 | port = 1080 87 | elif protocol == 'https': 88 | port = 1443 89 | else: 90 | raise InterfaceError( 91 | CONFIG_ERROR, "Unsupported protocol: {}".format(protocol)) 92 | if host is None: 93 | raise InterfaceError(0, 94 | "\"host\" is a required field, " 95 | "set to location of " 96 | "TDREST server.") 97 | self.template = RestTemplate( 98 | protocol, host, int(port), webContext, username, password, 99 | accept='application/vnd.com.teradata.rest-v1.0+json', 100 | verifyCerts=util.booleanValue(verifyCerts), sslContext=sslContext) 101 | with self.template.connect() as conn: 102 | if not self.implicit: 103 | options = {} 104 | options['autoCommit'] = autoCommit 105 | options['transactionMode'] = transactionMode 106 | if queryBands: 107 | options['queryBands'] = queryBands 108 | if charset: 109 | options['charSet'] = charset 110 | if database: 111 | options['defaultDatabase'] = database 112 | if authentication: 113 | options['logMech'] = authentication 114 | try: 115 | session = conn.post( 116 | '/systems/{0}/sessions'.format(self.system), 117 | options).readObject() 118 | self.sessionId = session['sessionId'] 119 | connections.append(self) 120 | logger.info("Created explicit session: %s", session) 121 | except (pulljson.JSONParseError) as e: 122 | raise InterfaceError( 123 | e.code, "Error reading JSON response: " + e.msg) 124 | 125 | def close(self): 126 | """ Closes an Explicit Session using the REST API for Teradata 127 | Database """ 128 | if hasattr(self, 'sessionId') and self.sessionId is not None: 129 | with self.template.connect() as conn: 130 | try: 131 | conn.delete( 132 | '/systems/{0}/sessions/{1}'.format( 133 | self.system, self.sessionId)) 134 | except InterfaceError as e: 135 | # Ignore if the session is already closed. 
136 | if e.code != 404: 137 | raise 138 | logger.info("Closing session: %s", self.sessionId) 139 | self.sessionId = None 140 | connections.remove(self) 141 | for cursor in list(self.cursors): 142 | cursor.close() 143 | 144 | def commit(self): 145 | with self.cursor() as cursor: 146 | if self.transactionMode == 'ANSI': 147 | cursor.execute("COMMIT") 148 | else: 149 | cursor.execute("ET") 150 | 151 | def rollback(self): 152 | with self.cursor() as cursor: 153 | try: 154 | cursor.execute("ROLLBACK") 155 | except DatabaseError as e: 156 | if e.code != ERROR_USER_GENERATED_TRANSACTION_ABORT: 157 | raise 158 | 159 | def cursor(self): 160 | return RestCursor(self) 161 | 162 | def __del__(self): 163 | self.close() 164 | 165 | def __enter__(self): 166 | return self 167 | 168 | def __exit__(self, t, value, traceback): 169 | self.close() 170 | 171 | connect = RestConnection 172 | 173 | 174 | class RestCursor (util.Cursor): 175 | 176 | def __init__(self, connection): 177 | self.conn = None 178 | util.Cursor.__init__( 179 | self, connection, connection.dbType, connection.dataTypeConverter) 180 | self.conn = connection.template.connect() 181 | connection.cursors.append(self) 182 | 183 | def callproc(self, procname, params, queryTimeout=None): 184 | inparams = None 185 | outparams = None 186 | count = 0 187 | query = "CALL {} (".format(procname) 188 | if params is not None: 189 | inparams = [[]] 190 | outparams = [] 191 | for p in params: 192 | if count > 0: 193 | query += ", " 194 | if isinstance(p, InOutParam): 195 | inparams[0].append(p.inValue) 196 | outparams.append(p.inValue) 197 | elif isinstance(p, OutParam): 198 | outparams.append(None) 199 | else: 200 | inparams[0].append(p) 201 | count += 1 202 | query += "?" 203 | query += ")" 204 | outparams = self._handleResults(self._execute( 205 | query, inparams, outparams, queryTimeout=queryTimeout), 206 | len(outparams) > 0) 207 | return util.OutParams(params, self.dbType, self.converter, outparams) 208 | 209 | def close(self): 210 | if self.conn: 211 | self.conn.close() 212 | 213 | def execute(self, query, params=None, queryTimeout=None): 214 | if params is not None: 215 | params = [params] 216 | self._handleResults( 217 | self._execute(query, params, queryTimeout=queryTimeout)) 218 | return self 219 | 220 | def executemany(self, query, params, batch=False, queryTimeout=None): 221 | self._handleResults( 222 | self._execute(query, params, batch=batch, 223 | queryTimeout=queryTimeout)) 224 | return self 225 | 226 | def _handleResults(self, results, hasOutParams=False): 227 | self.results = results 228 | try: 229 | results.expectObject() 230 | self.queueDuration = results.expectField( 231 | "queueDuration", pulljson.NUMBER) 232 | self.queryDuration = results.expectField( 233 | "queryDuration", pulljson.NUMBER) 234 | logger.debug("Durations reported by REST service: Queue Duration: " 235 | "%s, Query Duration: %s", self.queueDuration, 236 | self.queryDuration) 237 | results.expectField("results", pulljson.ARRAY) 238 | results.expectObject() 239 | return self._handleResultSet(results, hasOutParams) 240 | except (pulljson.JSONParseError) as e: 241 | raise InterfaceError( 242 | e.code, "Error reading JSON response: " + e.msg) 243 | 244 | def _execute(self, query, params=None, outParams=None, batch=False, 245 | queryTimeout=None): 246 | options = {} 247 | options['query'] = query 248 | options['format'] = 'array' 249 | options['includeColumns'] = 'true' 250 | options['rowLimit'] = 0 251 | if params is not None: 252 | options['params'] = list( 253 | 
list(_convertParam(p) for p in paramSet) 254 | for paramSet in params) 255 | options['batch'] = batch 256 | if outParams is not None: 257 | options['outParams'] = outParams 258 | if not self.connection.implicit: 259 | options['session'] = str(self.connection.sessionId) 260 | if queryTimeout is not None: 261 | options['queryTimeout'] = queryTimeout 262 | options['queueTimeout'] = queryTimeout 263 | return self.conn.post('/systems/{0}/queries'.format( 264 | self.connection.system), options) 265 | 266 | def _handleResultSet(self, results, hasOutParams=False): 267 | outParams = None 268 | if hasOutParams: 269 | outParams = results.expectField( 270 | "outParams", pulljson.ARRAY, readAll=True) 271 | self.resultSet = None 272 | else: 273 | try: 274 | self.resultSet = results.expectField( 275 | "resultSet", pulljson.BOOLEAN) 276 | except pulljson.JSONParseError: 277 | # Workaround for Batch mode and Stored procedures which doens't 278 | # include a resultSet. 279 | self.resultSet = None 280 | if self.resultSet: 281 | index = 0 282 | self.columns = {} 283 | self.description = [] 284 | self.types = [] 285 | self.rowcount = -1 286 | self.rownumber = None 287 | for column in results.expectField("columns", pulljson.ARRAY): 288 | self.columns[column["name"].lower()] = index 289 | type_code = self.converter.convertType( 290 | self.dbType, column["type"]) 291 | self.types.append((column["type"], type_code)) 292 | self.description.append( 293 | (column["name"], type_code, None, None, None, None, None)) 294 | index += 1 295 | self.iterator = results.expectField("data", pulljson.ARRAY) 296 | else: 297 | self.columns = None 298 | self.description = None 299 | self.rownumber = None 300 | self.rowcount = -1 301 | if self.resultSet is not None: 302 | self.rowcount = results.expectField("count") 303 | return outParams 304 | 305 | def nextset(self): 306 | for row in self: # @UnusedVariable 307 | pass 308 | for event in self.results: 309 | if event.type == pulljson.START_OBJECT: 310 | self._handleResultSet(self.results) 311 | return True 312 | 313 | 314 | def _convertParam(p): 315 | if util.isString(p) or p is None: 316 | return p 317 | elif isinstance(p, bytearray): 318 | return ''.join('{:02x}'.format(x) for x in p) 319 | else: 320 | return unicode(p) 321 | 322 | 323 | class RestTemplate: 324 | 325 | def __init__(self, protocol, host, port, webContext, username, password, 326 | sslContext=None, verifyCerts=True, accept=None): 327 | self.protocol = protocol 328 | self.host = host 329 | self.port = port 330 | self.webContext = webContext 331 | self.headers = {} 332 | self.headers['Content-Type'] = 'application/json' 333 | if accept is not None: 334 | self.headers['Accept'] = accept 335 | self.headers['Authorization'] = 'Basic ' + \ 336 | base64.b64encode( 337 | (username + ":" + password).encode('utf_8')).decode('ascii') 338 | self.sslContext = sslContext 339 | if sslContext is None and not verifyCerts: 340 | self.sslContext = ssl.create_default_context() 341 | self.sslContext.check_hostname = False 342 | self.sslContext.verify_mode = ssl.CERT_NONE 343 | 344 | def connect(self): 345 | return HttpConnection(self) 346 | 347 | 348 | class HttpConnection: 349 | 350 | def __init__(self, template): 351 | self.template = template 352 | if template.protocol.lower() == "http": 353 | self.conn = httplib.HTTPConnection(template.host, template.port) 354 | elif template.protocol.lower() == "https": 355 | self.conn = httplib.HTTPSConnection( 356 | template.host, template.port, context=template.sslContext) 357 | else: 358 | raise 
InterfaceError( 359 | REST_ERROR, "Unknown protocol: %s" % template.protocol) 360 | failureCount = 0 361 | while True: 362 | try: 363 | self.conn.connect() 364 | break 365 | except Exception as e: 366 | eofError = "EOF occurred in violation of protocol" in str(e) 367 | failureCount += 1 368 | if not eofError or failureCount > MAX_CONNECT_RETRIES: 369 | raise InterfaceError( 370 | REST_ERROR, 371 | "Error accessing {}:{}. ERROR: {}".format( 372 | template.host, template.port, e)) 373 | else: 374 | logger.debug( 375 | "Received an \"EOF occurred in violation of " 376 | "protocol\" error, retrying connection.") 377 | 378 | def close(self): 379 | if self.conn: 380 | self.conn.close() 381 | 382 | def post(self, uri, data={}): 383 | return self.send(uri, 'POST', data) 384 | 385 | def delete(self, uri): 386 | self.send(uri, 'DELETE', None) 387 | 388 | def get(self, uri): 389 | return self.send(uri, 'GET', None) 390 | 391 | def __enter__(self): 392 | return self 393 | 394 | def __exit__(self, t, value, traceback): 395 | self.close() 396 | 397 | def send(self, uri, method, data): 398 | response = None 399 | url = self.template.webContext + uri 400 | try: 401 | start = time.time() 402 | payload = json.dumps(data).encode('utf8') if data else None 403 | logger.trace("%s: %s, %s", method, url, payload) 404 | self.conn.request(method, url, payload, self.template.headers) 405 | response = self.conn.getresponse() 406 | duration = time.time() - start 407 | logger.debug("Roundtrip Duration: %.3f seconds", duration) 408 | except Exception as e: 409 | raise InterfaceError( 410 | REST_ERROR, 'Error accessing {}. ERROR: {}'.format(url, e)) 411 | if response.status < 300: 412 | return pulljson.JSONPullParser( 413 | HttpResponseAsUnicodeStream(response)) 414 | if response.status < 400: 415 | raise InterfaceError( 416 | response.status, 417 | "HTTP Status: {}. ERROR: Redirection not supported.") 418 | else: 419 | msg = response.read().decode("utf8") 420 | try: 421 | errorDetails = json.loads(msg) 422 | except Exception: 423 | raise InterfaceError( 424 | response.status, "HTTP Status: " + str(response.status) + 425 | ", URL: " + url + ", Details: " + str(msg)) 426 | if response.status == HTTP_STATUS_DATABASE_ERROR: 427 | raise DatabaseError( 428 | int(errorDetails['error']), errorDetails['message']) 429 | else: 430 | raise InterfaceError(response.status, "HTTP Status: " + str( 431 | response.status) + ", URL: " + url + 432 | ", Details: " + str(errorDetails)) 433 | 434 | 435 | class HttpResponseAsUnicodeStream: 436 | 437 | def __init__(self, buf): 438 | self.stream = io.TextIOWrapper( 439 | HttpResponseIOWrapper(buf), encoding="utf8") 440 | 441 | def read(self, size): 442 | data = "" 443 | if not self.stream.closed: 444 | data = self.stream.read(size) 445 | return data 446 | 447 | 448 | class HttpResponseIOWrapper: 449 | 450 | def __init__(self, buf): 451 | self.buf = buf 452 | self.closed = False 453 | 454 | def readable(self): 455 | return True 456 | 457 | def writable(self): 458 | return False 459 | 460 | def seekable(self): 461 | return False 462 | 463 | def read1(self, n=-1): 464 | return self.read(n) 465 | 466 | def read(self, size): 467 | return self.buf.read(size) 468 | -------------------------------------------------------------------------------- /teradata/util.py: -------------------------------------------------------------------------------- 1 | """ Helper classes and interfaces for Teradata Python module. 
""" 2 | 3 | # The MIT License (MIT) 4 | # 5 | # Copyright (c) 2015 by Teradata 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a copy 8 | # of this software and associated documentation files (the "Software"), to deal 9 | # in the Software without restriction, including without limitation the rights 10 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 11 | # copies of the Software, and to permit persons to whom the Software is 12 | # furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included in 15 | # all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 18 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 20 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 22 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 23 | # SOFTWARE. 24 | 25 | import sys 26 | import re 27 | import codecs 28 | import argparse 29 | import inspect 30 | import copy 31 | import getpass 32 | from .api import * # @UnusedWildImport # noqa 33 | 34 | INVALID_ARGUMENT = "INVALID_ARGUMENT" 35 | 36 | # Create new trace log level 37 | TRACE = 5 38 | logging.addLevelName(TRACE, "TRACE") 39 | 40 | 41 | def trace(self, message, *args, **kws): 42 | # Yes, logger takes its '*args' as 'args'. 43 | if self.isEnabledFor(TRACE): 44 | self._log(TRACE, message, args, **kws) 45 | logging.TRACE = TRACE 46 | logging.Logger.trace = trace 47 | 48 | logger = logging.getLogger(__name__) 49 | 50 | if sys.version_info[0] == 2: 51 | openfile = codecs.open 52 | else: 53 | openfile = open 54 | 55 | 56 | def isString(value): 57 | # Implement python version specific setup. 58 | if sys.version_info[0] == 2: 59 | return isinstance(value, basestring) # @UndefinedVariable 60 | else: 61 | return isinstance(value, str) # @UndefinedVariable 62 | 63 | 64 | def toUnicode(string): 65 | if not isString(string): 66 | string = str(string) 67 | if sys.version_info[0] == 2: 68 | if isinstance(string, str): 69 | string = string.decode("utf8") 70 | return string 71 | 72 | 73 | def raiseIfNone(name, value): 74 | if not value: 75 | raise InterfaceError( 76 | INVALID_ARGUMENT, "Missing value for \"{}\".".format(name)) 77 | 78 | 79 | def booleanValue(value): 80 | retval = value 81 | if isString(value): 82 | retval = value.lower() in ["1", "on", "true", "yes"] 83 | return retval 84 | 85 | 86 | class Cursor: 87 | 88 | """An abstract cursor for encapsulating shared functionality of connection 89 | specific implementations (e.g. 
ODBC, REST)""" 90 | 91 | def __init__(self, connection, dbType, dataTypeConverter): 92 | self.connection = connection 93 | self.converter = dataTypeConverter 94 | self.dbType = dbType 95 | self.results = None 96 | self.arraysize = 1 97 | self.fetchSize = None 98 | self.rowcount = -1 99 | self.description = None 100 | self.types = None 101 | self.iterator = None 102 | self.rownumber = None 103 | 104 | def callproc(self, procname, params): 105 | # Abstract method, defined by convention only 106 | raise NotImplementedError("Subclass must implement abstract method") 107 | 108 | def close(self): 109 | pass 110 | 111 | def execute(self, query, params=None): 112 | # Abstract method, defined by convention only 113 | raise NotImplementedError("Subclass must implement abstract method") 114 | 115 | def executemany(self, query, params, batch=False): 116 | # Abstract method, defined by convention only 117 | raise NotImplementedError("Subclass must implement abstract method") 118 | 119 | def fetchone(self): 120 | self.fetchSize = 1 121 | return next(self, None) 122 | 123 | def fetchmany(self, size=None): 124 | if size is None: 125 | size = self.arraysize 126 | self.fetchSize = size 127 | rows = [] 128 | count = 0 129 | for row in self: 130 | rows.append(row) 131 | count += 1 132 | if count == size: 133 | break 134 | return rows 135 | 136 | def fetchall(self): 137 | self.fetchSize = self.arraysize 138 | rows = [] 139 | for row in self: 140 | rows.append(row) 141 | return rows 142 | 143 | def nextset(self): 144 | # Abstract method, defined by convention only 145 | raise NotImplementedError("Subclass must implement abstract method") 146 | 147 | def setinputsizes(self, sizes): 148 | pass 149 | 150 | def setoutputsize(self, size, column=None): 151 | pass 152 | 153 | def __iter__(self): 154 | return self 155 | 156 | def __next__(self): 157 | self.fetchSize = self.arraysize 158 | if self.iterator: 159 | if self.rownumber is None: 160 | self.rownumber = 0 161 | else: 162 | self.rownumber += 1 163 | values = next(self.iterator) 164 | for i in range(0, len(values)): 165 | values[i] = self.converter.convertValue( 166 | self.dbType, self.types[i][0], self.types[i][1], values[i]) 167 | row = Row(self.columns, values, self.rownumber + 1) 168 | # logger.debug("%s", row) 169 | return row 170 | raise StopIteration() 171 | 172 | def next(self): 173 | return self.__next__() 174 | 175 | def __enter__(self): 176 | return self 177 | 178 | def __exit__(self, t, value, traceback): 179 | self.close() 180 | 181 | 182 | class Row (object): 183 | 184 | """Represents a table row.""" 185 | 186 | def __init__(self, columns, values, rowNum): 187 | super(Row, self).__setattr__("columns", columns) 188 | super(Row, self).__setattr__("values", values) 189 | super(Row, self).__setattr__("rowNum", rowNum) 190 | 191 | def __getattr__(self, name): 192 | try: 193 | index = self.columns[name.lower()] 194 | return self.values[index] 195 | except KeyError: 196 | raise AttributeError("No such attribute: " + name) 197 | 198 | def __setattr__(self, name, value): 199 | try: 200 | self.values[self.columns[name.lower()]] = value 201 | except KeyError: 202 | raise AttributeError("No such attribute: " + name) 203 | 204 | def __setitem__(self, key, value): 205 | try: 206 | self.values[key] = value 207 | except TypeError: 208 | self.values[self.columns[key.lower()]] = value 209 | 210 | def __getitem__(self, key): 211 | try: 212 | return self.values[key] 213 | except TypeError: 214 | index = self.columns[key.lower()] 215 | return self.values[index] 216 | 
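    # Illustrative usage sketch (comment only, not part of the original class):
    # a Row returned by a cursor supports positional, name-based, and attribute
    # access, with column names matched case-insensitively, e.g.
    #
    #     row = cursor.fetchone()
    #     row[0]                   # by position
    #     row["InfoKey"]           # by column name
    #     row.infokey              # as an attribute
    #     row.infokey = "updated"  # values can also be assigned in place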
217 | def __len__(self): 218 | return len(self.values) 219 | 220 | def __str__(self): 221 | return "Row " + str(self.rowNum) + ": [" + \ 222 | ", ".join(map(str, self.values)) + "]" 223 | 224 | def __iter__(self): 225 | return self.values.__iter__() 226 | 227 | 228 | class OutParams (object): 229 | 230 | """ Represents a set of Output parameters. """ 231 | 232 | def __init__(self, params, dbType, dataTypeConverter, outparams=None): 233 | names = {} 234 | copy = [] 235 | for p in params: 236 | if isinstance(p, OutParam): 237 | if outparams: 238 | value = outparams.pop(0) 239 | else: 240 | value = p.value() 241 | if p.dataType is not None: 242 | typeCode = dataTypeConverter.convertType( 243 | dbType, p.dataType) 244 | value = dataTypeConverter.convertValue( 245 | dbType, p.dataType, typeCode, value) 246 | copy.append(value) 247 | if p.name is not None: 248 | names[p.name] = value 249 | else: 250 | copy.append(p) 251 | super(OutParams, self).__setattr__("config", copy) 252 | super(OutParams, self).__setattr__("names", names) 253 | 254 | def __getattr__(self, name): 255 | try: 256 | return self.names[name] 257 | except KeyError: 258 | raise AttributeError("No such attribute: " + name) 259 | 260 | def __setattr__(self, name, value): 261 | raise AttributeError("Output parameters are read only.") 262 | 263 | def __setitem__(self, key, value): 264 | raise AttributeError("Output parameters are read only.") 265 | 266 | def __getitem__(self, key): 267 | try: 268 | return self.config[key] 269 | except TypeError: 270 | return self.names[key] 271 | 272 | def __len__(self): 273 | return len(self.config) 274 | 275 | def __str__(self): 276 | return str(self.config) 277 | 278 | def __iter__(self): 279 | return self.config.__iter__() 280 | 281 | 282 | class SqlScript: 283 | 284 | """An iterator for iterating through the queries in a SQL script.""" 285 | 286 | def __init__(self, filename, delimiter=";", encoding=None): 287 | self.delimiter = delimiter 288 | with openfile(filename, mode='r', encoding=encoding) as f: 289 | self.sql = f.read() 290 | 291 | def __iter__(self): 292 | return sqlsplit(self.sql, self.delimiter) 293 | 294 | 295 | class BteqScript: 296 | 297 | """An iterator for iterating through the queries in a BTEQ script.""" 298 | 299 | def __init__(self, filename, encoding=None): 300 | self.file = filename 301 | with openfile(self.file, mode='r', encoding=encoding) as f: 302 | self.lines = f.readlines() 303 | 304 | def __iter__(self): 305 | return bteqsplit(self.lines) 306 | 307 | 308 | def sqlsplit(sql, delimiter=";"): 309 | """A generator function for splitting out SQL statements according to the 310 | specified delimiter. 
Ignores delimiter when in strings or comments.""" 311 | tokens = re.split("(--|'|\n|" + re.escape(delimiter) + "|\"|/\*|\*/)", 312 | sql if isString(sql) else delimiter.join(sql)) 313 | statement = [] 314 | inComment = False 315 | inLineComment = False 316 | inString = False 317 | inQuote = False 318 | for t in tokens: 319 | if not t: 320 | continue 321 | if inComment: 322 | if t == "*/": 323 | inComment = False 324 | elif inLineComment: 325 | if t == "\n": 326 | inLineComment = False 327 | elif inString: 328 | if t == '"': 329 | inString = False 330 | elif inQuote: 331 | if t == "'": 332 | inQuote = False 333 | elif t == delimiter: 334 | sql = "".join(statement).strip() 335 | if sql: 336 | yield sql 337 | statement = [] 338 | continue 339 | elif t == "'": 340 | inQuote = True 341 | elif t == '"': 342 | inString = True 343 | elif t == "/*": 344 | inComment = True 345 | elif t == "--": 346 | inLineComment = True 347 | statement.append(t) 348 | sql = "".join(statement).strip() 349 | if sql: 350 | yield sql 351 | 352 | 353 | def linesplit(sql, newline="\n"): 354 | """A generator function for splitting out SQL statements according to the 355 | specified delimiter. Ignores delimiter when in strings or comments.""" 356 | tokens = re.split("(--|'|" + re.escape(newline) + "|\"|/\*|\*/)", 357 | sql if isString(sql) else newline.join(sql)) 358 | statement = [] 359 | inComment = False 360 | inLineComment = False 361 | inString = False 362 | inQuote = False 363 | for t in tokens: 364 | if inComment: 365 | if t == "*/": 366 | inComment = False 367 | if t == newline: 368 | sql = "".join(statement) 369 | yield sql 370 | statement = [] 371 | continue 372 | elif inLineComment: 373 | if t == "\n": 374 | inLineComment = False 375 | if t == newline: 376 | sql = "".join(statement) 377 | yield sql 378 | statement = [] 379 | continue 380 | elif inString: 381 | if t == '"': 382 | inString = False 383 | elif inQuote: 384 | if t == "'": 385 | inQuote = False 386 | elif t == newline: 387 | sql = "".join(statement) 388 | yield sql 389 | statement = [] 390 | continue 391 | elif t == "'": 392 | inQuote = True 393 | elif t == '"': 394 | inString = True 395 | elif t == "/*": 396 | inComment = True 397 | elif t == "--": 398 | inLineComment = True 399 | statement.append(t) 400 | sql = "".join(statement) 401 | if sql: 402 | yield sql 403 | 404 | 405 | def bteqsplit(lines): 406 | """A generator function for splitting out SQL statements according 407 | BTEQ rule.""" 408 | statement = [] 409 | inStatement = False 410 | inComment = False 411 | for originalLine in lines: 412 | line = originalLine.strip() 413 | if not inStatement: 414 | if inComment: 415 | if "*/" in line: 416 | inComment = False 417 | line = line.split("*/", 1)[1].strip() 418 | originalLine = originalLine.split("*/", 1)[1] 419 | else: 420 | continue 421 | if not line: 422 | continue 423 | # Else if BTEQ command. 424 | elif line.startswith("."): 425 | continue 426 | # Else if BTEQ comment. 
427 | elif line.startswith("*"): 428 | continue 429 | elif line.startswith("/*"): 430 | if not line.endswith("*/"): 431 | inComment = True 432 | continue 433 | else: 434 | inStatement = True 435 | statement.append(originalLine) 436 | if line.endswith(";"): 437 | sql = "".join(statement).strip() 438 | if sql: 439 | yield sql 440 | statement = [] 441 | inStatement = False 442 | if inStatement: 443 | sql = "".join(statement).strip() 444 | if sql: 445 | yield sql 446 | 447 | 448 | def createTestCasePerDSN(testCase, baseCls, dataSourceNames): 449 | """A method for duplicating test cases, once for each named data source.""" 450 | for dsn in dataSourceNames: 451 | attr = dict(testCase.__dict__) 452 | attr['dsn'] = dsn 453 | newTestCase = type( 454 | testCase.__name__ + "_" + dsn, (testCase, baseCls), attr) 455 | setattr(sys.modules[testCase.__module__], 456 | newTestCase.__name__, newTestCase) 457 | 458 | 459 | def setupTestUser(udaExec, dsn, user=None, passwd=None, perm=100000000): 460 | """A utility method for creating a test user to be use by unittests.""" 461 | if user is None: 462 | user = "py%s_%std_%s_test" % ( 463 | sys.version_info[0], sys.version_info[1], getpass.getuser()) 464 | if passwd is None: 465 | passwd = user 466 | with udaExec.connect(dsn) as conn: 467 | try: 468 | conn.execute("DELETE DATABASE " + user) 469 | conn.execute("MODIFY USER " + user + " AS PERM = %s" % perm) 470 | except DatabaseError as e: 471 | if e.code == 3802: 472 | conn.execute( 473 | "CREATE USER " + user + 474 | " FROM DBC AS PERM = %s, PASSWORD = %s" % (perm, passwd)) 475 | conn.execute("GRANT UDTTYPE ON SYSUDTLIB to %s" % user) 476 | conn.execute( 477 | "GRANT CREATE PROCEDURE ON %s to %s" % (user, user)) 478 | return user 479 | 480 | 481 | class CommandLineArgumentParser: 482 | 483 | """ Command Line Argument Parser that matches command line arguments to 484 | the functions in a module.""" 485 | 486 | def __init__(self, moduleName, optionalArgs=None, positionalArgs=None): 487 | module = sys.modules[moduleName] 488 | preparser = argparse.ArgumentParser(add_help=False) 489 | if optionalArgs: 490 | for argument in optionalArgs: 491 | preparser.add_argument(*argument.args, **argument.kwargs) 492 | glob, extra = preparser.parse_known_args() 493 | parser = argparse.ArgumentParser( 494 | description=module.__doc__, parents=[preparser]) 495 | targetparser = parser.add_subparsers( 496 | metavar="targets", help="one or more targets to execute") 497 | targetparser.required = True 498 | for name, func in inspect.getmembers(module, inspect.isfunction): 499 | if not name.startswith("_") and func.__doc__: 500 | p = targetparser.add_parser(name, help=func.__doc__) 501 | if optionalArgs: 502 | for argument in optionalArgs: 503 | if argument.targets is None or func \ 504 | in argument.targets: 505 | p.add_argument(*argument.args, **argument.kwargs) 506 | if positionalArgs: 507 | for argument in positionalArgs: 508 | if argument.targets is None or func \ 509 | in argument.targets: 510 | p.add_argument(*argument.args, **argument.kwargs) 511 | p.set_defaults(func=func, name=name) 512 | self.arguments = [] 513 | while True: 514 | args, extra = parser.parse_known_args( 515 | extra, namespace=copy.copy(glob)) 516 | self.arguments.append(args) 517 | if sum((0 if arg.startswith("-") else 1 for arg in extra)) == 0: 518 | break 519 | 520 | def __iter__(self): 521 | return iter(self.arguments) 522 | 523 | 524 | class CommandLineArgument: 525 | 526 | """ Represents a command line argument.""" 527 | 528 | def __init__(self, *args, 
**kwargs): 529 | self.targets = None 530 | if "targets" in kwargs: 531 | self.targets = kwargs.pop("targets") 532 | self.args = args 533 | self.kwargs = kwargs 534 | -------------------------------------------------------------------------------- /teradata/version.py: -------------------------------------------------------------------------------- 1 | """Maintains the module version.""" 2 | 3 | # The MIT License (MIT) 4 | # 5 | # Copyright (c) 2015 by Teradata 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a copy 8 | # of this software and associated documentation files (the "Software"), to deal 9 | # in the Software without restriction, including without limitation the rights 10 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 11 | # copies of the Software, and to permit persons to whom the Software is 12 | # furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included in 15 | # all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 18 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 20 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 22 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 23 | # SOFTWARE. 24 | __version__ = "15.10.0.22" 25 | -------------------------------------------------------------------------------- /test/testBteqScript.sql: -------------------------------------------------------------------------------- 1 | /******************************************************************************** 2 | * Project: Project_name 3 | * Describtion: Details 4 | *********************************************************s***********************/ 5 | .Logon Server/Userid,PWD; 6 | 7 | ** Initiate remark for start of script 8 | .REMARK "<<<< Processing Initiated >>>>" 9 | 10 | SELECT DATE,TIME; 11 | .SET WIDTH 132; 12 | 13 | /* This is a mult-line comment that starts and ends on the same line */ 14 | 15 | CREATE TABLE Sou_EMP_Tab 16 | ( EMP_ID Integer, 17 | EMP_Name Char(10) 18 | )Primary Index (EMP_ID); 19 | 20 | INSERT INTO Sou_EMP_Tab 21 | (1, 'bala'); 22 | 23 | .export report file=c:\p\hi.txt 24 | .set retlimit 4 25 | select 'test;' 26 | ; 27 | select 'test' as "test;" 28 | ; 29 | .export reset; 30 | 31 | .import vartext ',' file = c:\p\var.txt 32 | .quiet on; 33 | .repeat *; 34 | /*using i_eid(integer), 35 | i_ename(varchar(30)), 36 | i_sal(dec(6,2)), 37 | i_grade(varchar(30)), 38 | i_dept(varchar(30)) 39 | insert into tab_name(eid,ename,sal,grade,dept) 40 | values(:i_eid,:i_ename,:i_sal,:i_grade,:i_dept);*/ 41 | 42 | INSERT INTO Sou_EMP_Tab 43 | (2, 'nawab') 44 | -------------------------------------------------------------------------------- /test/testClobSp.sql: -------------------------------------------------------------------------------- 1 | CREATE SET TABLE GCFR_Execution_Log ,NO FALLBACK , 2 | NO BEFORE JOURNAL, 3 | NO AFTER JOURNAL, 4 | CHECKSUM = DEFAULT, 5 | DEFAULT MERGEBLOCKRATIO 6 | ( 7 | Logger_Name VARCHAR(128) CHARACTER SET LATIN NOT CASESPECIFIC NOT NULL, 8 | Logger_Id INTEGER NOT NULL, 9 | Message_Type INTEGER NOT NULL, 10 | Stream_Key SMALLINT, 11 | Stream_Id INTEGER, 12 | Process_Id 
INTEGER, 13 | TD_Session_Id INTEGER NOT NULL, 14 | Execution_Text VARCHAR(1024) CHARACTER SET UNICODE NOT CASESPECIFIC, 15 | Calling_API VARCHAR(128) CHARACTER SET UNICODE NOT CASESPECIFIC NOT NULL, 16 | Calling_API_Step CHAR(2) CHARACTER SET UNICODE NOT CASESPECIFIC, 17 | Sql_Activity_Count INTEGER, 18 | Sql_Text CLOB(1000000000) CHARACTER SET UNICODE, 19 | Update_Date DATE FORMAT 'YYYY-MM-DD' NOT NULL DEFAULT DATE , 20 | Update_User VARCHAR(128) CHARACTER SET UNICODE NOT CASESPECIFIC NOT NULL DEFAULT USER , 21 | Update_Ts TIMESTAMP(6) NOT NULL DEFAULT CURRENT_TIMESTAMP(6)) 22 | PRIMARY INDEX ( Logger_Name ,Logger_Id );; 23 | 24 | REPLACE PROCEDURE GCFR_BB_ExecutionLog_Set 25 | /* Stored Procedure Parameters */ 26 | ( 27 | IN iLogger_Name VARCHAR(128) 28 | , IN iLogger_Id INTEGER 29 | , IN iMessage_Type INTEGER 30 | , IN iStream_Key SMALLINT 31 | , IN iStream_Id INTEGER 32 | , IN iProcess_Id INTEGER 33 | , IN iExecution_Text VARCHAR(1024) 34 | , IN iCalling_API VARCHAR(128) 35 | , IN iCalling_API_Step CHAR(4) 36 | , IN iSql_Activity_Count INTEGER 37 | , IN iSql_Text CLOB(1000000000) 38 | ) 39 | BEGIN 40 | /* Inserting a new execution log row using the given inputs */ 41 | INSERT INTO GCFR_Execution_Log 42 | ( 43 | Logger_Name 44 | ,Logger_Id 45 | ,Message_Type 46 | ,Stream_Key 47 | ,Stream_Id 48 | ,Process_Id 49 | ,Execution_Text 50 | ,TD_Session_Id 51 | ,Calling_API 52 | ,Calling_API_Step 53 | ,Sql_Activity_Count 54 | ,Sql_Text 55 | ,Update_Date 56 | ,Update_User 57 | ,Update_Ts 58 | ) 59 | VALUES 60 | ( 61 | :iLogger_Name 62 | ,:iLogger_Id 63 | ,:iMessage_Type 64 | ,:iStream_Key 65 | ,:iStream_Id 66 | ,:iProcess_Id 67 | ,:iExecution_Text 68 | ,SESSION /* Teradata Session Id */ 69 | ,:iCalling_API 70 | ,:iCalling_API_Step 71 | ,:iSql_Activity_Count 72 | ,:iSql_Text 73 | ,CURRENT_DATE 74 | ,USER 75 | ,CURRENT_TIMESTAMP(6) 76 | ) 77 | ; 78 | 79 | END 80 | ;; 81 | -------------------------------------------------------------------------------- /test/testScript.sql: -------------------------------------------------------------------------------- 1 | 2 | 3 | -- This is a comment; with a semi-colon 4 | /** This comment's open quote should not break multi-statement parsing **/ 5 | 6 | -- CREATING TABLE 7 | CREATE TABLE ${sampleTable} (a INTEGER, 8 | /* THis is a comment */ 9 | b VARCHAR(100), 10 | c TIMESTAMP WITH TIME ZONE, 11 | e NUMERIC(20,10), 12 | f NUMBER(18,2) NULL FORMAT '-----------------.99' TITLE 'f' 13 | ); 14 | 15 | -- THIS IS ALSO A TEST 16 | INSERT INTO ${sampleTable} VALUES (23, 'This ----- is a test;Making sure semi-colons 17 | in statements work.$$', '2015-05-30 12:00:00-GMT', 1.23456, 789); 18 | 19 | -- AND THIS 20 | SELECT COUNT(*) AS ";count" FROM ${sampleTable}; 21 | 22 | ; 23 | 24 | -- MORE COMMENTS 25 | SELECT * FROM ${sampleTable} 26 | 27 | 28 | 29 | -- EVEN MORE 30 | -------------------------------------------------------------------------------- /test/testScript2.sql: -------------------------------------------------------------------------------- 1 | 2 | -- CREATING TABLE 3 | CREATE TABLE ${sampleTable} (a INTEGER, 4 | b VARCHAR(100), 5 | c TIMESTAMP WITH TIME ZONE, 6 | e NUMERIC(20,10), 7 | f NUMERIC 8 | )| 9 | 10 | -- THIS IS ALSO A TEST 11 | INSERT INTO ${sampleTable} VALUES (23, 'This is a test|Making sure pipes in statements work.', '2015-05-30 12:00:00-GMT', 1.23456, 789)| 12 | 13 | -- AND THIS 14 | SELECT COUNT(*) AS "count" FROM ${sampleTable}| 15 | 16 | | 17 | 18 | -- MORE COMMENTS 19 | SELECT * FROM ${sampleTable} 20 | 21 | 22 | 23 | -- EVEN MORE 24 | 
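-- Note (illustrative, not part of the original fixture): this script uses '|'
-- rather than ';' as the statement delimiter, so it is expected to be split
-- with a non-default delimiter, e.g. roughly (Python, using teradata.util):
--
--     for stmt in SqlScript("test/testScript2.sql", delimiter="|"):
--         ...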
-------------------------------------------------------------------------------- /test/test_pulljson.py: -------------------------------------------------------------------------------- 1 | # The MIT License (MIT) 2 | # 3 | # Copyright (c) 2015 by Teradata 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights 8 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | # copies of the Software, and to permit persons to whom the Software is 10 | # furnished to do so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in 13 | # all copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 22 | from teradata import pulljson 23 | import unittest 24 | import sys 25 | 26 | if sys.version_info[0] == 2: 27 | from StringIO import StringIO # @UnresolvedImport #@UnusedImport 28 | else: 29 | from io import StringIO # @UnresolvedImport @UnusedImport @Reimport 30 | 31 | 32 | class TestJSONPullParser (unittest.TestCase): 33 | 34 | def testNextEvent(self): 35 | stream = StringIO("""{"key1":"value", "key2":100, "key3":null, 36 | "key4": true, "key5":false, "key6":-201.50E1, "key7":{"key8":"value2", 37 | "key9":null}, "key10":["value3", 10101010101010101010101, null, 38 | {} ] }""") 39 | reader = pulljson.JSONPullParser(stream) 40 | 41 | # Start of object 42 | event = reader.nextEvent() 43 | self.assertEqual(event.type, pulljson.START_OBJECT) 44 | 45 | # Key1 - "value" 46 | event = reader.nextEvent() 47 | self.assertEqual(event.type, pulljson.FIELD_NAME) 48 | self.assertEqual(event.value, "key1") 49 | event = reader.nextEvent() 50 | self.assertEqual(event.type, pulljson.FIELD_VALUE) 51 | self.assertEqual(event.value, "value") 52 | self.assertEqual(event.valueType, pulljson.STRING) 53 | 54 | # Key2 - 100 55 | event = reader.nextEvent() 56 | self.assertEqual(event.type, pulljson.FIELD_NAME) 57 | self.assertEqual(event.value, "key2") 58 | event = reader.nextEvent() 59 | self.assertEqual(event.type, pulljson.FIELD_VALUE) 60 | self.assertEqual(event.value, 100) 61 | self.assertEqual(event.valueType, pulljson.NUMBER) 62 | 63 | # Key3 - null 64 | event = reader.nextEvent() 65 | self.assertEqual(event.type, pulljson.FIELD_NAME) 66 | self.assertEqual(event.value, "key3") 67 | event = reader.nextEvent() 68 | self.assertEqual(event.type, pulljson.FIELD_VALUE) 69 | self.assertIsNone(event.value) 70 | self.assertEqual(event.valueType, pulljson.NULL) 71 | 72 | # Key4 - true 73 | event = reader.nextEvent() 74 | self.assertEqual(event.type, pulljson.FIELD_NAME) 75 | self.assertEqual(event.value, "key4") 76 | event = reader.nextEvent() 77 | self.assertEqual(event.type, pulljson.FIELD_VALUE) 78 | self.assertTrue(event.value) 79 | self.assertEqual(event.valueType, pulljson.BOOLEAN) 80 | 81 | # Key5 - false 82 | event = reader.nextEvent() 
83 | self.assertEqual(event.type, pulljson.FIELD_NAME) 84 | self.assertEqual(event.value, "key5") 85 | event = reader.nextEvent() 86 | self.assertEqual(event.type, pulljson.FIELD_VALUE) 87 | self.assertFalse(event.value) 88 | self.assertEqual(event.valueType, pulljson.BOOLEAN) 89 | 90 | # Key6 91 | event = reader.nextEvent() 92 | self.assertEqual(event.type, pulljson.FIELD_NAME) 93 | self.assertEqual(event.value, "key6") 94 | event = reader.nextEvent() 95 | self.assertEqual(event.type, pulljson.FIELD_VALUE) 96 | self.assertEqual(event.value, -2015) 97 | self.assertEqual(event.valueType, pulljson.NUMBER) 98 | 99 | # Key7 100 | event = reader.nextEvent() 101 | self.assertEqual(event.type, pulljson.FIELD_NAME) 102 | self.assertEqual(event.value, "key7") 103 | 104 | # Start of key7 object 105 | event = reader.nextEvent() 106 | self.assertEqual(event.type, pulljson.START_OBJECT) 107 | 108 | # Key8 - value2 109 | event = reader.nextEvent() 110 | self.assertEqual(event.type, pulljson.FIELD_NAME) 111 | self.assertEqual(event.value, "key8") 112 | event = reader.nextEvent() 113 | self.assertEqual(event.type, pulljson.FIELD_VALUE) 114 | self.assertEqual(event.value, "value2") 115 | self.assertEqual(event.valueType, pulljson.STRING) 116 | 117 | # Key9 - null 118 | event = reader.nextEvent() 119 | self.assertEqual(event.type, pulljson.FIELD_NAME) 120 | self.assertEqual(event.value, "key9") 121 | event = reader.nextEvent() 122 | self.assertEqual(event.type, pulljson.FIELD_VALUE) 123 | self.assertIsNone(event.value) 124 | 125 | # End of key7 object 126 | event = reader.nextEvent() 127 | self.assertEqual(event.type, pulljson.END_OBJECT) 128 | 129 | # Key10 - array[0] - value3 130 | event = reader.nextEvent() 131 | self.assertEqual(event.type, pulljson.FIELD_NAME) 132 | self.assertEqual(event.value, "key10") 133 | event = reader.nextEvent() 134 | self.assertEqual(event.type, pulljson.START_ARRAY) 135 | 136 | # Key10 - array[0] - value3 137 | event = reader.nextEvent() 138 | self.assertEqual(event.type, pulljson.ARRAY_VALUE) 139 | self.assertEqual(event.value, "value3") 140 | self.assertEqual(event.valueType, pulljson.STRING) 141 | self.assertEqual(event.arrayIndex, 0) 142 | 143 | # Key10 - array[1] - 10101010101010101010101 144 | event = reader.nextEvent() 145 | self.assertEqual(event.type, pulljson.ARRAY_VALUE) 146 | self.assertEqual(event.value, 10101010101010101010101) 147 | self.assertEqual(event.valueType, pulljson.NUMBER) 148 | self.assertEqual(event.arrayIndex, 1) 149 | 150 | # Key10 - array[2] - null 151 | event = reader.nextEvent() 152 | self.assertEqual(event.type, pulljson.ARRAY_VALUE) 153 | self.assertIsNone(event.value) 154 | self.assertEqual(event.valueType, pulljson.NULL) 155 | self.assertEqual(event.arrayIndex, 2) 156 | 157 | # Key10 - array[3] - object 158 | event = reader.nextEvent() 159 | self.assertEqual(event.type, pulljson.START_OBJECT) 160 | self.assertEqual(event.arrayIndex, 3) 161 | 162 | # Key10 - array[3] - object 163 | event = reader.nextEvent() 164 | self.assertEqual(event.type, pulljson.END_OBJECT) 165 | self.assertEqual(event.arrayIndex, 3) 166 | 167 | # End of key 10 array. 
168 | event = reader.nextEvent() 169 | self.assertEqual(event.type, pulljson.END_ARRAY) 170 | 171 | # End of object 172 | event = reader.nextEvent() 173 | self.assertEqual(event.type, pulljson.END_OBJECT) 174 | event = reader.nextEvent() 175 | self.assertIsNone(event) 176 | 177 | def testDocumentIncomplete(self): 178 | stream = StringIO('{"key":"value"') 179 | reader = pulljson.JSONPullParser(stream) 180 | event = reader.nextEvent() 181 | self.assertEqual(event.type, pulljson.START_OBJECT) 182 | event = reader.nextEvent() 183 | self.assertEqual(event.type, pulljson.FIELD_NAME) 184 | self.assertEqual(event.value, "key") 185 | 186 | with self.assertRaises(pulljson.JSONParseError) as cm: 187 | event = reader.nextEvent() 188 | self.assertEqual( 189 | cm.exception.code, pulljson.JSON_INCOMPLETE_ERROR, 190 | cm.exception.msg) 191 | 192 | def testEmptyName(self): 193 | stream = StringIO('{:"value"}') 194 | reader = pulljson.JSONPullParser(stream) 195 | event = reader.nextEvent() 196 | self.assertEqual(event.type, pulljson.START_OBJECT) 197 | with self.assertRaises(pulljson.JSONParseError) as cm: 198 | event = reader.nextEvent() 199 | self.assertEqual( 200 | cm.exception.code, pulljson.JSON_SYNTAX_ERROR, cm.exception.msg) 201 | 202 | def testExtraWhiteSpace(self): 203 | stream = StringIO('{\n\t "key"\n\t\t: "\t value\n"} ') 204 | reader = pulljson.JSONPullParser(stream) 205 | event = reader.nextEvent() 206 | self.assertEqual(event.type, pulljson.START_OBJECT) 207 | event = reader.nextEvent() 208 | self.assertEqual(event.type, pulljson.FIELD_NAME) 209 | self.assertEqual(event.value, "key") 210 | event = reader.nextEvent() 211 | self.assertEqual(event.type, pulljson.FIELD_VALUE) 212 | self.assertEqual(event.value, "\t value\n") 213 | event = reader.nextEvent() 214 | self.assertEqual(event.type, pulljson.END_OBJECT) 215 | event = reader.nextEvent() 216 | self.assertIsNone(event) 217 | 218 | def testEscapeCharacter(self): 219 | stream = StringIO('{"\\"ke\\"y\\\\" : "va\\"l\\"ue"} ') 220 | reader = pulljson.JSONPullParser(stream) 221 | event = reader.nextEvent() 222 | self.assertEqual(event.type, pulljson.START_OBJECT) 223 | event = reader.nextEvent() 224 | self.assertEqual(event.type, pulljson.FIELD_NAME) 225 | self.assertEqual(event.value, '"ke"y\\') 226 | event = reader.nextEvent() 227 | self.assertEqual(event.type, pulljson.FIELD_VALUE) 228 | self.assertEqual(event.value, 'va"l"ue') 229 | event = reader.nextEvent() 230 | self.assertEqual(event.type, pulljson.END_OBJECT) 231 | event = reader.nextEvent() 232 | self.assertIsNone(event) 233 | 234 | def testEmptyArray(self): 235 | stream = StringIO('[]') 236 | reader = pulljson.JSONPullParser(stream) 237 | event = reader.nextEvent() 238 | self.assertEqual(event.type, pulljson.START_ARRAY) 239 | event = reader.nextEvent() 240 | self.assertEqual(event.type, pulljson.END_ARRAY) 241 | event = reader.nextEvent() 242 | self.assertIsNone(event) 243 | 244 | def testMissingColon(self): 245 | stream = StringIO('{"key" "value"}') 246 | reader = pulljson.JSONPullParser(stream) 247 | event = reader.nextEvent() 248 | self.assertEqual(event.type, pulljson.START_OBJECT) 249 | with self.assertRaises(pulljson.JSONParseError) as cm: 250 | event = reader.nextEvent() 251 | self.assertEqual( 252 | cm.exception.code, pulljson.JSON_SYNTAX_ERROR, cm.exception.msg) 253 | 254 | def testCommaInsteadOfColon(self): 255 | stream = StringIO('{"key","value"}') 256 | reader = pulljson.JSONPullParser(stream) 257 | event = reader.nextEvent() 258 | self.assertEqual(event.type, 
pulljson.START_OBJECT) 259 | with self.assertRaises(pulljson.JSONParseError) as cm: 260 | event = reader.nextEvent() 261 | self.assertEqual( 262 | cm.exception.code, pulljson.JSON_SYNTAX_ERROR, cm.exception.msg) 263 | 264 | def testColonInsteadOfComma(self): 265 | stream = StringIO('["key":"value"]') 266 | reader = pulljson.JSONPullParser(stream) 267 | event = reader.nextEvent() 268 | self.assertEqual(event.type, pulljson.START_ARRAY) 269 | with self.assertRaises(pulljson.JSONParseError) as cm: 270 | event = reader.nextEvent() 271 | self.assertEqual( 272 | cm.exception.code, pulljson.JSON_SYNTAX_ERROR, cm.exception.msg) 273 | 274 | def testNumberLiteral(self): 275 | stream = StringIO('1') 276 | reader = pulljson.JSONPullParser(stream) 277 | with self.assertRaises(pulljson.JSONParseError) as cm: 278 | reader.nextEvent() 279 | self.assertEqual( 280 | cm.exception.code, pulljson.JSON_SYNTAX_ERROR, cm.exception.msg) 281 | 282 | def testStringLiteral(self): 283 | stream = StringIO('"This is a test"') 284 | reader = pulljson.JSONPullParser(stream) 285 | with self.assertRaises(pulljson.JSONParseError) as cm: 286 | reader.nextEvent() 287 | self.assertEqual( 288 | cm.exception.code, pulljson.JSON_SYNTAX_ERROR, cm.exception.msg) 289 | 290 | def testObjectMissingValue(self): 291 | stream = StringIO('{"key":}') 292 | reader = pulljson.JSONPullParser(stream) 293 | event = reader.nextEvent() 294 | self.assertEqual(event.type, pulljson.START_OBJECT) 295 | event = reader.nextEvent() 296 | self.assertEqual(event.type, pulljson.FIELD_NAME) 297 | with self.assertRaises(pulljson.JSONParseError) as cm: 298 | event = reader.nextEvent() 299 | self.assertEqual( 300 | cm.exception.code, pulljson.JSON_SYNTAX_ERROR, cm.exception.msg) 301 | 302 | def testArrayMissingValue(self): 303 | stream = StringIO('[1, ,2}') 304 | reader = pulljson.JSONPullParser(stream) 305 | event = reader.nextEvent() 306 | self.assertEqual(event.type, pulljson.START_ARRAY) 307 | event = reader.nextEvent() 308 | self.assertEqual(event.type, pulljson.ARRAY_VALUE) 309 | with self.assertRaises(pulljson.JSONParseError) as cm: 310 | event = reader.nextEvent() 311 | self.assertEqual( 312 | cm.exception.code, pulljson.JSON_SYNTAX_ERROR, cm.exception.msg) 313 | 314 | def testArrayInObject(self): 315 | stream = StringIO('{[]}') 316 | reader = pulljson.JSONPullParser(stream) 317 | event = reader.nextEvent() 318 | self.assertEqual(event.type, pulljson.START_OBJECT) 319 | with self.assertRaises(pulljson.JSONParseError) as cm: 320 | event = reader.nextEvent() 321 | self.assertEqual( 322 | cm.exception.code, pulljson.JSON_SYNTAX_ERROR, cm.exception.msg) 323 | 324 | def testReadObject(self): 325 | stream = StringIO( 326 | '{"key1":[0,1,2,3,4,{"value":"5"}], "key2":\ 327 | {"key1":[0,1,2,3,4,{"value":"5"}]}}') 328 | reader = pulljson.JSONPullParser(stream) 329 | obj = reader.readObject() 330 | self.assertEqual(len(obj), 2) 331 | for i in range(0, 2): 332 | self.assertEqual(len(obj["key1"]), 6) 333 | for i in range(0, 5): 334 | self.assertEqual(obj["key1"][i], i) 335 | self.assertEqual(obj["key1"][5]["value"], "5") 336 | if i == 1: 337 | obj = obj["key2"] 338 | self.assertEqual(len(obj), 1) 339 | 340 | def testReadArray(self): 341 | stream = StringIO('[0,1,2,3,4,[0,1,2,3,4,[0,1,2,3,4]],[0,1,2,3,4]]') 342 | reader = pulljson.JSONPullParser(stream) 343 | arr = reader.readArray() 344 | self.assertEqual(len(arr), 7) 345 | for i in range(0, 5): 346 | self.assertEqual(arr[i], i) 347 | for i in range(0, 5): 348 | self.assertEqual(arr[5][i], i) 349 | for i in range(0, 
5): 350 | self.assertEqual(arr[5][5][i], i) 351 | for i in range(0, 5): 352 | self.assertEqual(arr[6][i], i) 353 | 354 | def testArraySyntaxError(self): 355 | stream = StringIO('[[0,1][0,1]]') 356 | reader = pulljson.JSONPullParser(stream) 357 | with self.assertRaises(pulljson.JSONParseError) as cm: 358 | reader.readArray() 359 | self.assertEqual( 360 | cm.exception.code, pulljson.JSON_SYNTAX_ERROR, cm.exception.msg) 361 | 362 | def testIterateArray(self): 363 | stream = StringIO( 364 | '[{"key0}":["}\\"","\\"}","}"]}, {"key1}":["}","\\"}","}"]}, ' 365 | '{"key2}":["}","}","\\"}"]}]') 366 | reader = pulljson.JSONPullParser(stream) 367 | i = 0 368 | for x in reader.expectArray(): 369 | self.assertEqual(len(x["key" + str(i) + "}"]), 3) 370 | i += 1 371 | 372 | 373 | if __name__ == '__main__': 374 | unittest.main() 375 | -------------------------------------------------------------------------------- /test/test_tdodbc.py: -------------------------------------------------------------------------------- 1 | # The MIT License (MIT) 2 | # 3 | # Copyright (c) 2015 by Teradata 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights 8 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | # copies of the Software, and to permit persons to whom the Software is 10 | # furnished to do so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in 13 | # all copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 
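# Note (illustrative, not part of the original module): these tests require the
# Teradata ODBC driver and the "ODBC" data source section in test/udaexec.ini.
# They can be run on their own with the standard unittest runner, e.g.
#
#     python -m unittest test.test_tdodbc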
22 | import unittest 23 | import os 24 | import teradata 25 | from teradata import tdodbc, util 26 | 27 | 28 | class TdOdbcTest (unittest.TestCase): 29 | 30 | @classmethod 31 | def setUpClass(cls): 32 | cls.username = cls.password = util.setupTestUser(udaExec, dsn) 33 | 34 | def testGlobals(self): 35 | self.assertEqual(tdodbc.apilevel, "2.0") 36 | self.assertEqual(tdodbc.threadsafety, 1) 37 | self.assertEqual(tdodbc.paramstyle, "qmark") 38 | 39 | def testSystemNotFound(self): 40 | with self.assertRaises(tdodbc.DatabaseError) as cm: 41 | tdodbc.connect(system="continuum.td.teradata.com", 42 | username=self.username, password=self.password) 43 | self.assertTrue("08004" in cm.exception.msg, cm.exception) 44 | 45 | def testBadCredentials(self): 46 | with self.assertRaises(tdodbc.DatabaseError) as cm: 47 | tdodbc.connect(system=system, username="bad", password="bad") 48 | self.assertEqual(cm.exception.code, 8017, cm.exception.msg) 49 | 50 | def testConnect(self): 51 | conn = tdodbc.connect( 52 | system=system, username=self.username, password=self.password) 53 | self.assertIsNotNone(conn) 54 | conn.close() 55 | 56 | def testConnectBadDriver(self): 57 | with self.assertRaises(tdodbc.InterfaceError) as cm: 58 | tdodbc.connect( 59 | system=system, username=self.username, 60 | password=self.password, 61 | driver="BadDriver") 62 | self.assertEqual(cm.exception.code, "DRIVER_NOT_FOUND") 63 | 64 | def testCursorBasics(self): 65 | with tdodbc.connect(system=system, username=self.username, 66 | password=self.password, autoCommit=True) as conn: 67 | self.assertIsNotNone(conn) 68 | with conn.cursor() as cursor: 69 | count = 0 70 | for row in cursor.execute("SELECT * FROM DBC.DBCInfo"): 71 | self.assertEqual(len(row), 2) 72 | self.assertIsNotNone(row[0]) 73 | self.assertIsNotNone(row['InfoKey']) 74 | self.assertIsNotNone(row['infokey']) 75 | self.assertIsNotNone(row.InfoKey) 76 | self.assertIsNotNone(row.infokey) 77 | self.assertIsNotNone(row[1]) 78 | self.assertIsNotNone(row['InfoData']) 79 | self.assertIsNotNone(row['infodata']) 80 | self.assertIsNotNone(row.infodata) 81 | self.assertIsNotNone(row.InfoData) 82 | 83 | row[0] = "test1" 84 | self.assertEqual(row[0], "test1") 85 | self.assertEqual(row['InfoKey'], "test1") 86 | self.assertEqual(row.infokey, "test1") 87 | 88 | row['infokey'] = "test2" 89 | self.assertEqual(row[0], "test2") 90 | self.assertEqual(row['InfoKey'], "test2") 91 | self.assertEqual(row.infokey, "test2") 92 | 93 | row.infokey = "test3" 94 | self.assertEqual(row[0], "test3") 95 | self.assertEqual(row['InfoKey'], "test3") 96 | self.assertEqual(row.InfoKey, "test3") 97 | count += 1 98 | 99 | self.assertEqual(cursor.description[0][0], "InfoKey") 100 | self.assertEqual(cursor.description[0][1], tdodbc.STRING) 101 | self.assertEqual(cursor.description[1][0], "InfoData") 102 | self.assertEqual(cursor.description[1][1], tdodbc.STRING) 103 | self.assertEqual(count, 3) 104 | 105 | def testExecuteWithParamsMismatch(self): 106 | with self.assertRaises(teradata.InterfaceError) as cm: 107 | with tdodbc.connect(system=system, username=self.username, 108 | password=self.password, 109 | autoCommit=True) as conn: 110 | self.assertIsNotNone(conn) 111 | with conn.cursor() as cursor: 112 | cursor.execute( 113 | "CREATE TABLE testExecuteWithParamsMismatch (id INT, " 114 | "name VARCHAR(128), dob TIMESTAMP)") 115 | cursor.execute( 116 | "INSERT INTO testExecuteWithParamsMismatch " 117 | "VALUES (?, ?, ?)", (1, "TEST", )) 118 | self.assertEqual( 119 | cm.exception.code, "PARAMS_MISMATCH", cm.exception.msg) 
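# Illustrative note (not part of the original tests): setUpClass provisions a
# scratch database user via util.setupTestUser(), whose default name is derived
# from the Python version and the OS user, roughly:
#
#     "py%s_%std_%s_test" % (sys.version_info[0], sys.version_info[1],
#                            getpass.getuser())
#
# with the password defaulting to the user name.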
120 | 121 | configFiles = [os.path.join(os.path.dirname(__file__), 'udaexec.ini')] 122 | udaExec = teradata.UdaExec(configFiles=configFiles, configureLogging=False) 123 | dsn = 'ODBC' 124 | odbcConfig = udaExec.config.section(dsn) 125 | system = odbcConfig['system'] 126 | super_username = odbcConfig['username'] 127 | super_password = odbcConfig['password'] 128 | 129 | if __name__ == '__main__': 130 | unittest.main() 131 | -------------------------------------------------------------------------------- /test/test_tdrest.py: -------------------------------------------------------------------------------- 1 | # The MIT License (MIT) 2 | # 3 | # Copyright (c) 2015 by Teradata 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights 8 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | # copies of the Software, and to permit persons to whom the Software is 10 | # furnished to do so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in 13 | # all copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 
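# Note (illustrative, not part of the original module): these tests exercise the
# REST backend and need Teradata REST Services reachable via the "HTTP" data
# source section in test/udaexec.ini. By default tdrest.connect() opens an
# explicit session; a sketch with hypothetical values:
#
#     conn = tdrest.connect(host="resthost", system="tdsystem",
#                           username="user", password="pass")
#
# Passing implicit=True would skip session creation and send each query without
# a session id.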
22 | import unittest 23 | import os 24 | import teradata 25 | from teradata import tdrest, util 26 | 27 | 28 | class TdRestTest (unittest.TestCase): 29 | 30 | @classmethod 31 | def setUpClass(cls): 32 | cls.username = cls.password = util.setupTestUser(udaExec, dsn) 33 | 34 | def testGlobals(self): 35 | self.assertEqual(tdrest.apilevel, "2.0") 36 | self.assertEqual(tdrest.threadsafety, 1) 37 | self.assertEqual(tdrest.paramstyle, "qmark") 38 | 39 | def testBadHost(self): 40 | badHost = "badhostname" 41 | with self.assertRaises(tdrest.InterfaceError) as cm: 42 | tdrest.connect( 43 | host=badHost, system=system, username=self.username, 44 | password=self.password) 45 | self.assertEqual(cm.exception.code, tdrest.REST_ERROR) 46 | self.assertTrue(badHost in cm.exception.msg, 47 | '{} not found in "{}"'.format( 48 | badHost, cm.exception.msg)) 49 | 50 | def testSystemNotFound(self): 51 | with self.assertRaises(tdrest.InterfaceError) as cm: 52 | tdrest.connect( 53 | host=host, system="unknown", username=self.username, 54 | password=self.password) 55 | self.assertEqual(cm.exception.code, 404) 56 | # print(cm.exception) 57 | self.assertTrue( 58 | "404" in cm.exception.msg, 59 | '404 not found in "{}"'.format(cm.exception.msg)) 60 | 61 | def testBadCredentials(self): 62 | with self.assertRaises(tdrest.DatabaseError) as cm: 63 | tdrest.connect( 64 | host=host, system=system, username="bad", password="bad") 65 | # print(cm.exception) 66 | self.assertEqual(cm.exception.code, 8017, cm.exception.msg) 67 | 68 | def testConnect(self): 69 | conn = tdrest.connect( 70 | host=host, system=system, username=self.username, 71 | password=self.password) 72 | self.assertIsNotNone(conn) 73 | conn.close() 74 | 75 | def testCursorBasics(self): 76 | with tdrest.connect(host=host, system=system, username=self.username, 77 | password=self.password) as conn: 78 | self.assertIsNotNone(conn) 79 | cursor = conn.cursor() 80 | count = 0 81 | for row in cursor.execute("SELECT * FROM DBC.DBCInfo"): 82 | self.assertEqual(len(row), 2) 83 | self.assertIsNotNone(row[0]) 84 | self.assertIsNotNone(row['InfoKey']) 85 | self.assertIsNotNone(row['infokey']) 86 | self.assertIsNotNone(row.InfoKey) 87 | self.assertIsNotNone(row.infokey) 88 | self.assertIsNotNone(row[1]) 89 | self.assertIsNotNone(row['InfoData']) 90 | self.assertIsNotNone(row['infodata']) 91 | self.assertIsNotNone(row.infodata) 92 | self.assertIsNotNone(row.InfoData) 93 | 94 | row[0] = "test1" 95 | self.assertEqual(row[0], "test1") 96 | self.assertEqual(row['InfoKey'], "test1") 97 | self.assertEqual(row.infokey, "test1") 98 | 99 | row['infokey'] = "test2" 100 | self.assertEqual(row[0], "test2") 101 | self.assertEqual(row['InfoKey'], "test2") 102 | self.assertEqual(row.infokey, "test2") 103 | 104 | row.infokey = "test3" 105 | self.assertEqual(row[0], "test3") 106 | self.assertEqual(row['InfoKey'], "test3") 107 | self.assertEqual(row.InfoKey, "test3") 108 | count += 1 109 | 110 | self.assertEqual(cursor.description[0][0], "InfoKey") 111 | self.assertEqual(cursor.description[0][1], tdrest.STRING) 112 | self.assertEqual(cursor.description[1][0], "InfoData") 113 | self.assertEqual(cursor.description[1][1], tdrest.STRING) 114 | self.assertEqual(count, 3) 115 | 116 | def testExecuteWithParamsMismatch(self): 117 | with self.assertRaises(teradata.InterfaceError) as cm: 118 | with tdrest.connect(host=host, system=system, 119 | username=self.username, 120 | password=self.password, 121 | autoCommit=True) as conn: 122 | self.assertIsNotNone(conn) 123 | with conn.cursor() as cursor: 124 | 
cursor.execute( 125 | "CREATE TABLE testExecuteWithParamsMismatch (id INT, " 126 | "name VARCHAR(128), dob TIMESTAMP)") 127 | cursor.execute( 128 | "INSERT INTO testExecuteWithParamsMismatch " 129 | "VALUES (?, ?, ?)", (1, "TEST", )) 130 | self.assertEqual(cm.exception.code, 400, cm.exception.msg) 131 | 132 | def testSessionAlreadyClosed(self): 133 | with tdrest.connect(host=host, system=system, username=self.username, 134 | password=self.password, autoCommit=True) as conn: 135 | self.assertIsNotNone(conn) 136 | with conn.template.connect() as http: 137 | http.delete( 138 | "/systems/{}/sessions/{}".format(conn.system, 139 | conn.sessionId)) 140 | 141 | configFiles = [os.path.join(os.path.dirname(__file__), 'udaexec.ini')] 142 | udaExec = teradata.UdaExec(configFiles=configFiles, configureLogging=False) 143 | dsn = 'HTTP' 144 | restConfig = udaExec.config.section(dsn) 145 | host = restConfig['host'] 146 | system = restConfig['system'] 147 | super_username = restConfig['username'] 148 | super_password = restConfig['password'] 149 | 150 | if __name__ == '__main__': 151 | unittest.main() 152 | -------------------------------------------------------------------------------- /test/test_udaexec_config.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # The MIT License (MIT) 3 | # 4 | # Copyright (c) 2015 by Teradata 5 | # 6 | # Permission is hereby granted, free of charge, to any person obtaining a copy 7 | # of this software and associated documentation files (the "Software"), to deal 8 | # in the Software without restriction, including without limitation the rights 9 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 10 | # copies of the Software, and to permit persons to whom the Software is 11 | # furnished to do so, subject to the following conditions: 12 | # 13 | # The above copyright notice and this permission notice shall be included in 14 | # all copies or substantial portions of the Software. 15 | # 16 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 17 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 18 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 19 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 20 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 21 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 22 | # SOFTWARE. 
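The REST tests above exercise the tdrest DB API implementation end to end: module globals, error codes, connections and cursor iteration. A minimal usage sketch, with placeholder host, system and credentials standing in for the values the tests read from udaexec.ini:

    from teradata import tdrest

    # Placeholders only; substitute the REST services host, the target
    # Teradata system name and real credentials.
    with tdrest.connect(host="resthost", system="tdsystem",
                        username="user", password="pass") as conn:
        with conn.cursor() as cursor:
            for row in cursor.execute("SELECT * FROM DBC.DBCInfo"):
                # As asserted in testCursorBasics, the same column is
                # reachable by position, by case-insensitive name, or as
                # an attribute.
                print(row[0], row['InfoKey'], row.infodata)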
23 | import unittest 24 | import teradata 25 | import os 26 | import sys 27 | import logging 28 | 29 | configFiles = [os.path.join(os.path.dirname(__file__), file) 30 | for file in ('udaexec.ini', 'udaexec2.ini')] 31 | 32 | 33 | class UdaExecConfigTest (unittest.TestCase): 34 | 35 | """Test UdaExec DevOps features.""" 36 | 37 | def setUp(self): 38 | self.udaExec = teradata.UdaExec( 39 | configFiles=configFiles, configureLogging=False) 40 | self.udaExec.checkpoint() 41 | self.assertIsNotNone(self.udaExec) 42 | 43 | def testGlobals(self): 44 | self.assertEqual(teradata.apilevel, "2.0") 45 | self.assertEqual(teradata.threadsafety, 1) 46 | self.assertEqual(teradata.paramstyle, "qmark") 47 | 48 | def testMissingAppName(self): 49 | with self.assertRaises(teradata.InterfaceError) as cm: 50 | teradata.UdaExec(configFiles=[], configureLogging=False) 51 | self.assertEqual(cm.exception.code, teradata.CONFIG_ERROR) 52 | 53 | def testConfig(self): 54 | udaExec = self.udaExec 55 | self.assertEqual(udaExec.config['appName'], u'PyTdUnitTestsの') 56 | self.assertEqual(udaExec.config['version'], '1.00.00.01') 57 | self.assertEqual(udaExec.config['key1'], 'file1') 58 | self.assertEqual(udaExec.config['key2'], 'file2') 59 | self.assertEqual(udaExec.config['key3'], 'file2') 60 | self.assertEqual(udaExec.config['key5'], 'file1') 61 | 62 | def testConfigEscapeCharacter(self): 63 | udaExec = self.udaExec 64 | self.assertEqual(udaExec.config['escapeTest'], 'this$isatest') 65 | 66 | def testEscapeCharacterInDataSource(self): 67 | section = self.udaExec.config.section("ESCAPE_TEST") 68 | self.assertEqual(section['password'], 'pa$$word') 69 | self.assertEqual(section['escapeTest2'], 'this$isatest') 70 | 71 | def testConnectUsingBadDSN(self): 72 | with self.assertRaises(teradata.InterfaceError) as cm: 73 | self.udaExec.connect("UNKNOWN") 74 | self.assertEqual(cm.exception.code, teradata.CONFIG_ERROR) 75 | 76 | def testRunNumber(self): 77 | # Check that runNumber is incremented by 1. 78 | udaExec = teradata.UdaExec( 79 | configFiles=configFiles, configureLogging=False) 80 | self.assertEqual(int(udaExec.runNumber.split( 81 | "-")[1]), int(self.udaExec.runNumber.split("-")[1]) + 1) 82 | 83 | def testResumeFromCheckPoint(self): 84 | checkpoint = "testResumeFromCheckPoint" 85 | self.udaExec.checkpoint(checkpoint) 86 | udaExec = teradata.UdaExec( 87 | configFiles=configFiles, configureLogging=False) 88 | self.assertEqual(udaExec.resumeFromCheckpoint, checkpoint) 89 | with udaExec.connect("ODBC") as session: 90 | self.assertIsNone(session.execute( 91 | "SELECT 1").fetchone(), 92 | "Query was executed but should have been skipped.") 93 | udaExec.checkpoint("notTheExpectedCheckpoint") 94 | self.assertIsNone(session.execute( 95 | "SELECT 1").fetchone(), 96 | "Query was executed but should have been skipped.") 97 | udaExec.checkpoint(checkpoint) 98 | self.assertEqual(session.execute("SELECT 1").fetchone()[0], 1) 99 | # Clear the checkpoint. 
100 | self.udaExec.checkpoint() 101 | udaExec = teradata.UdaExec( 102 | configFiles=configFiles, configureLogging=False) 103 | self.assertIsNone(udaExec.resumeFromCheckpoint) 104 | udaExec.setResumeCheckpoint(checkpoint) 105 | self.assertEqual(udaExec.resumeFromCheckpoint, checkpoint) 106 | 107 | def testVariableResolutionEscapeCharacter(self): 108 | with self.udaExec.connect("ODBC") as session: 109 | self.assertEqual( 110 | session.execute( 111 | "SELECT '$${ThisShouldBeTreatedAsALiteral}'").fetchone()[ 112 | 0], "${ThisShouldBeTreatedAsALiteral}") 113 | self.assertEqual( 114 | session.execute( 115 | "SELECT '$$ThisShouldBeTreatedAsALiteral'").fetchone()[ 116 | 0], "$ThisShouldBeTreatedAsALiteral") 117 | 118 | 119 | if __name__ == '__main__': 120 | formatter = logging.Formatter( 121 | "%(asctime)s - %(name)s - %(levelname)s - %(message)s") 122 | sh = logging.StreamHandler(sys.stdout) 123 | sh.setFormatter(formatter) 124 | root = logging.getLogger() 125 | root.setLevel(logging.INFO) 126 | root.addHandler(sh) 127 | unittest.main() 128 | -------------------------------------------------------------------------------- /test/test_udaexec_execute.py: -------------------------------------------------------------------------------- 1 | # The MIT License (MIT) 2 | # 3 | # Copyright (c) 2015 by Teradata 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights 8 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | # copies of the Software, and to permit persons to whom the Software is 10 | # furnished to do so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in 13 | # all copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 
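testResumeFromCheckPoint above captures the checkpoint semantics: naming a checkpoint records progress, and a later UdaExec instance built from the same configuration skips every execute until that checkpoint is reached again. A condensed sketch of the flow; the checkpoint name is arbitrary and the configFiles list matches the test setup:

    import os
    import teradata

    configFiles = [os.path.join(os.path.dirname(__file__), 'udaexec.ini')]

    udaExec = teradata.UdaExec(configFiles=configFiles, configureLogging=False)
    udaExec.checkpoint("loadComplete")       # record progress, then exit

    # Next run: statements are skipped until the recorded checkpoint is hit.
    udaExec = teradata.UdaExec(configFiles=configFiles, configureLogging=False)
    with udaExec.connect("ODBC") as session:
        session.execute("SELECT 1")          # skipped while resuming
        udaExec.checkpoint("loadComplete")   # resume point reached
        session.execute("SELECT 1")          # executed normally
    udaExec.checkpoint()                     # no name clears the checkpoint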
22 | import unittest 23 | import sys 24 | import logging 25 | import os 26 | import decimal 27 | import teradata 28 | import threading 29 | import random 30 | import time 31 | from teradata import util 32 | 33 | logger = logging.getLogger(__name__) 34 | 35 | 36 | class UdaExecExecuteTest (): 37 | 38 | """Test UdaExec execute methods on a named data source""" 39 | 40 | @classmethod 41 | def setUpClass(cls): 42 | cls.username = cls.password = util.setupTestUser(udaExec, cls.dsn) 43 | cls.failure = False 44 | 45 | def testCursorBasics(self): 46 | with udaExec.connect(self.dsn, username=self.username, 47 | password=self.password) as conn: 48 | self.assertIsNotNone(conn) 49 | with conn.cursor() as cursor: 50 | count = 0 51 | for row in cursor.execute("SELECT * FROM DBC.DBCInfo"): 52 | self.assertEqual(len(row), 2) 53 | self.assertIsNotNone(row[0]) 54 | self.assertIsNotNone(row['InfoKey']) 55 | self.assertIsNotNone(row['infokey']) 56 | self.assertIsNotNone(row.InfoKey) 57 | self.assertIsNotNone(row.infokey) 58 | self.assertIsNotNone(row[1]) 59 | self.assertIsNotNone(row['InfoData']) 60 | self.assertIsNotNone(row['infodata']) 61 | self.assertIsNotNone(row.infodata) 62 | self.assertIsNotNone(row.InfoData) 63 | 64 | row[0] = "test1" 65 | self.assertEqual(row[0], "test1") 66 | self.assertEqual(row['InfoKey'], "test1") 67 | self.assertEqual(row.infokey, "test1") 68 | 69 | row['infokey'] = "test2" 70 | self.assertEqual(row[0], "test2") 71 | self.assertEqual(row['InfoKey'], "test2") 72 | self.assertEqual(row.infokey, "test2") 73 | 74 | row.infokey = "test3" 75 | self.assertEqual(row[0], "test3") 76 | self.assertEqual(row['InfoKey'], "test3") 77 | self.assertEqual(row.InfoKey, "test3") 78 | count += 1 79 | 80 | self.assertEqual(cursor.description[0][0], "InfoKey") 81 | self.assertEqual(cursor.description[0][1], teradata.STRING) 82 | self.assertEqual(cursor.description[1][0], "InfoData") 83 | self.assertEqual(cursor.description[1][1], teradata.STRING) 84 | self.assertEqual(count, 3) 85 | 86 | def testDefaultDatabase(self): 87 | with udaExec.connect(self.dsn, username=self.username, 88 | password=self.password, database="DBC") as conn: 89 | self.assertIsNotNone(conn) 90 | conn.execute("SELECT * FROM DBCInfo") 91 | 92 | def testQueryBands(self): 93 | with udaExec.connect(self.dsn, username=self.username, 94 | password=self.username, 95 | queryBands={"queryBand1": "1", 96 | "queryBand2": "2"}) as conn: 97 | cursor = conn.cursor() 98 | queryBands = cursor.execute("Select GetQueryBand()").fetchone()[0] 99 | self.assertIn("ApplicationName=PyTdUnitTests", queryBands) 100 | self.assertIn("Version=1.00.00.01", queryBands) 101 | self.assertIn("queryBand1=1", queryBands) 102 | self.assertIn("queryBand2=2", queryBands) 103 | 104 | def testRollbackCommitTeraMode(self): 105 | with udaExec.connect(self.dsn, username=self.username, 106 | password=self.password, autoCommit=False, 107 | transactionMode='TERA') as conn: 108 | self.assertIsNotNone(conn) 109 | cursor = conn.cursor() 110 | 111 | cursor.execute("CREATE TABLE testRollbackCommitTeraMode (x INT)") 112 | conn.commit() 113 | 114 | cursor.execute("INSERT INTO testRollbackCommitTeraMode VALUES (1)") 115 | 116 | row = cursor.execute( 117 | "SELECT COUNT(*) FROM testRollbackCommitTeraMode").fetchone() 118 | self.assertEqual(row[0], 1) 119 | 120 | conn.rollback() 121 | 122 | row = cursor.execute( 123 | "SELECT COUNT(*) FROM testRollbackCommitTeraMode").fetchone() 124 | self.assertEqual(row[0], 0) 125 | 126 | def testRollbackCommitAnsiMode(self): 127 | with 
udaExec.connect(self.dsn, username=self.username, 128 | password=self.password, autoCommit="false", 129 | transactionMode='ANSI') as conn: 130 | self.assertIsNotNone(conn) 131 | cursor = conn.cursor() 132 | 133 | cursor.execute("CREATE TABLE testRollbackCommitAnsiMode (x INT)") 134 | conn.commit() 135 | 136 | cursor.execute("INSERT INTO testRollbackCommitAnsiMode VALUES (1)") 137 | 138 | row = cursor.execute( 139 | "SELECT COUNT(*) FROM testRollbackCommitAnsiMode").fetchone() 140 | self.assertEqual(row[0], 1) 141 | 142 | conn.rollback() 143 | 144 | row = cursor.execute( 145 | "SELECT COUNT(*) FROM testRollbackCommitAnsiMode").fetchone() 146 | self.assertEqual(row[0], 0) 147 | 148 | def testSqlScriptExecution(self): 149 | with udaExec.connect(self.dsn, username=self.username, 150 | password=self.password) as conn: 151 | self.assertIsNotNone(conn) 152 | scriptFile = os.path.join( 153 | os.path.dirname(__file__), "testScript.sql") 154 | udaExec.config['sampleTable'] = 'sample1' 155 | conn.execute(file=scriptFile) 156 | rows = conn.execute("SELECT * FROM ${sampleTable}").fetchall() 157 | self.assertEqual(len(rows), 1) 158 | self.assertEqual(rows[0].a, 23) 159 | self.assertEqual( 160 | rows[0].b, "This ----- is a test;Making sure semi-colons\nin " 161 | "statements work.$") 162 | self.assertEqual(rows[0].e, decimal.Decimal("1.23456")) 163 | self.assertEqual(rows[0].f, decimal.Decimal(789)) 164 | 165 | def testSqlScriptExecutionDelimiter(self): 166 | with udaExec.connect(self.dsn, username=self.username, 167 | password=self.password) as conn: 168 | self.assertIsNotNone(conn) 169 | scriptFile = os.path.join( 170 | os.path.dirname(__file__), "testScript2.sql") 171 | udaExec.config['sampleTable'] = 'sample2' 172 | conn.execute(file=scriptFile, delimiter="|") 173 | rows = conn.execute("SELECT * FROM ${sampleTable}").fetchall() 174 | self.assertEqual(len(rows), 1) 175 | self.assertEqual(rows[0].a, 23) 176 | self.assertEqual( 177 | rows[0].b, 178 | 'This is a test|Making sure pipes in statements work.') 179 | self.assertEqual(rows[0].e, decimal.Decimal("1.23456")) 180 | self.assertEqual(rows[0].f, decimal.Decimal(789)) 181 | 182 | def testBteqScriptExecution(self): 183 | with udaExec.connect(self.dsn, username=self.username, 184 | password=self.password) as conn: 185 | self.assertIsNotNone(conn) 186 | cursor = conn.cursor() 187 | scriptFile = os.path.join( 188 | os.path.dirname(__file__), "testBteqScript.sql") 189 | conn.execute(file=scriptFile, fileType="bteq") 190 | rows = cursor.execute( 191 | "SELECT * FROM {}.Sou_EMP_Tab".format( 192 | self.username)).fetchall() 193 | self.assertEqual(len(rows), 2) 194 | self.assertEqual(rows[0].EMP_ID, 1) 195 | self.assertEqual(rows[0].EMP_Name.strip(), 'bala') 196 | self.assertEqual(rows[1].EMP_ID, 2) 197 | self.assertEqual(rows[1].EMP_Name.strip(), 'nawab') 198 | 199 | def testExecuteManyFetchMany(self): 200 | with udaExec.connect(self.dsn, username=self.username, 201 | password=self.password) as conn: 202 | self.assertIsNotNone(conn) 203 | cursor = conn.cursor() 204 | 205 | rowCount = 10000 206 | cursor.execute("""CREATE TABLE testExecuteManyFetchMany ( 207 | id INT, name VARCHAR(128), dob TIMESTAMP)""") 208 | cursor.executemany( 209 | "INSERT INTO testExecuteManyFetchMany \ 210 | VALUES (?, ?, CURRENT_TIMESTAMP)", 211 | [(x, "{ \\[]" + str(x) + "\"}") for x in range(0, rowCount)], 212 | batch=True, logParamFrequency=1000) 213 | 214 | row = cursor.execute( 215 | "SELECT COUNT(*) FROM testExecuteManyFetchMany").fetchone() 216 | self.assertEqual(row[0], 
rowCount) 217 | 218 | setCount = 10 219 | cursor.execute("".join( 220 | ["SELECT * FROM testExecuteManyFetchMany WHERE id = %s; " 221 | % x for x in range(0, setCount)])) 222 | for i in range(0, setCount): 223 | if i != 0: 224 | self.assertTrue(cursor.nextset()) 225 | row = cursor.fetchone() 226 | self.assertEqual(row.id, i) 227 | self.assertEqual(row.name, "{ \\[]" + str(i) + "\"}") 228 | self.assertIsNone(cursor.nextset()) 229 | 230 | setCount = 10 231 | cursor.execute("".join( 232 | ["SELECT * FROM testExecuteManyFetchMany WHERE id = %s; " 233 | % x for x in range(0, setCount)])) 234 | for i in range(0, setCount): 235 | if i != 0: 236 | self.assertTrue(cursor.nextset()) 237 | row = cursor.fetchall() 238 | self.assertEqual(row[0].id, i) 239 | self.assertEqual(row[0].name, "{ \\[]" + str(i) + "\"}") 240 | self.assertIsNone(cursor.nextset()) 241 | 242 | fetchCount = 500 243 | cursor.execute("SELECT * FROM testExecuteManyFetchMany") 244 | for i in range(0, rowCount // fetchCount): 245 | rows = cursor.fetchmany(fetchCount) 246 | self.assertEqual(len(rows), fetchCount) 247 | rows = cursor.fetchmany(fetchCount) 248 | self.assertEqual(len(rows), 0) 249 | self.assertIsNone(cursor.fetchone()) 250 | 251 | def testVolatileTable(self): 252 | with udaExec.connect(self.dsn, username=self.username, 253 | password=self.password) as conn: 254 | self.assertIsNotNone(conn) 255 | cursor = conn.cursor() 256 | 257 | rowCount = 1000 258 | cursor.execute( 259 | "CREATE VOLATILE TABLE testVolatileTable, NO FALLBACK ," 260 | "NO BEFORE JOURNAL,NO AFTER JOURNAL, NO LOG, CHECKSUM = " 261 | "DEFAULT (id INT, name VARCHAR(128), dob TIMESTAMP) " 262 | "ON COMMIT PRESERVE ROWS;") 263 | cursor.executemany( 264 | "INSERT INTO testVolatileTable VALUES (?, ?, " 265 | "CURRENT_TIMESTAMP)", 266 | [(x, "{ \\[]" + str(x) + "\"}") for x in range(0, rowCount)], 267 | batch=True) 268 | 269 | row = cursor.execute( 270 | "SELECT COUNT(*) FROM testVolatileTable").fetchone() 271 | self.assertEqual(row[0], rowCount) 272 | 273 | setCount = 10 274 | cursor.execute("".join( 275 | ["SELECT * FROM testVolatileTable WHERE id = %s; " % x 276 | for x in range(0, setCount)])) 277 | for i in range(0, setCount): 278 | if i != 0: 279 | self.assertTrue(cursor.nextset()) 280 | row = cursor.fetchone() 281 | self.assertEqual(row.id, i) 282 | self.assertEqual(row.name, "{ \\[]" + str(i) + "\"}") 283 | self.assertIsNone(cursor.nextset()) 284 | 285 | fetchCount = 500 286 | cursor.execute("SELECT * FROM testVolatileTable") 287 | for i in range(0, rowCount // fetchCount): 288 | rows = cursor.fetchmany(fetchCount) 289 | self.assertEqual(len(rows), fetchCount) 290 | rows = cursor.fetchmany(fetchCount) 291 | self.assertEqual(len(rows), 0) 292 | self.assertIsNone(cursor.fetchone()) 293 | 294 | def testProcedureInOutParamNull(self): 295 | if self.dsn == "ODBC": 296 | with udaExec.connect("ODBC", username=self.username, 297 | password=self.password) as conn: 298 | self.assertIsNotNone(conn) 299 | for r in conn.execute( 300 | """REPLACE PROCEDURE testProcedure1 301 | (IN p1 INTEGER, INOUT p2 INTEGER, 302 | INOUT p3 VARCHAR(200), INOUT p4 FLOAT, 303 | INOUT p5 VARBYTE(128)) 304 | BEGIN 305 | IF p2 IS NULL THEN 306 | SET p2 = p1; 307 | END IF; 308 | IF p3 IS NULL THEN 309 | SET p3 = 'PASS'; 310 | END IF; 311 | IF p4 IS NULL THEN 312 | SET p4 = p1; 313 | END IF; 314 | IF p5 IS NULL THEN 315 | SET p5 = 'AABBCCDDEEFF'XBV; 316 | END IF; 317 | END;"""): 318 | logger.info(r) 319 | with udaExec.connect(self.dsn, username=self.username, 320 | password=self.password) as 
conn: 321 | for i in range(0, 10): 322 | result = conn.callproc( 323 | "testProcedure1", 324 | (i, teradata.InOutParam(None, "p2", 325 | dataType='INTEGER'), 326 | teradata.InOutParam(None, "p3", size=200), 327 | teradata.InOutParam(None, "p4"), 328 | teradata.InOutParam(None, "p5"))) 329 | self.assertEqual(result["p2"], i) 330 | self.assertEqual(result["p3"], "PASS") 331 | self.assertEqual(result["p4"], i) 332 | 333 | def testProcedure(self): 334 | # REST-307 - Unable to create Stored Procedure using REST, always use 335 | # ODBC. 336 | with udaExec.connect("ODBC", username=self.username, 337 | password=self.password) as conn: 338 | self.assertIsNotNone(conn) 339 | for r in conn.execute( 340 | """REPLACE PROCEDURE testProcedure1 341 | (IN p1 INTEGER, OUT p2 INTEGER) 342 | BEGIN 343 | SET p2 = p1; 344 | END;"""): 345 | logger.info(r) 346 | for r in conn.execute( 347 | """REPLACE PROCEDURE testProcedure2 (INOUT p2 INTEGER) 348 | BEGIN 349 | SET p2 = p2 * p2; 350 | END;"""): 351 | logger.info(r) 352 | with udaExec.connect(self.dsn, username=self.username, 353 | password=self.password) as conn: 354 | for i in range(0, 10): 355 | result = conn.callproc( 356 | "testProcedure1", 357 | (i, teradata.OutParam("p2", dataType="INTEGER"))) 358 | self.assertEqual(result["p2"], i) 359 | # Does not work with REST due to REST-308 360 | if self.dsn == "ODBC": 361 | for i in range(0, 10): 362 | result = conn.callproc( 363 | "testProcedure2", 364 | (teradata.InOutParam(i, "p1", dataType="INTEGER"), )) 365 | self.assertEqual(result["p1"], i * i) 366 | 367 | def testProcedureWithLargeLobInput(self): 368 | # REST-307 - Unable to create Stored Procedure using REST, always use 369 | # ODBC. 370 | with udaExec.connect("ODBC", username=self.username, 371 | password=self.password) as conn: 372 | self.assertIsNotNone(conn) 373 | scriptFile = os.path.join( 374 | os.path.dirname(__file__), "testClobSp.sql") 375 | conn.execute(file=scriptFile, delimiter=";;") 376 | 377 | SQLText = "CDR_2011-07-25_090000.000000.txt\n" 378 | SQLText = SQLText * 5000 379 | print("LENGTH OF SQLTest: {}".format(len(SQLText))) 380 | 381 | conn.callproc('GCFR_BB_ExecutionLog_Set', 382 | ('TestProc', 127, 12, 96, 2, 2, 'MyText', 383 | 'Test.py', 0, 0, SQLText)) 384 | 385 | count = 0 386 | for row in conn.execute("SELECT * FROM GCFR_Execution_Log"): 387 | self.assertEqual(row.Sql_Text, SQLText) 388 | count = count + 1 389 | self.assertEqual(count, 1) 390 | 391 | def testProcedureWithBinaryAndFloatParameters(self): 392 | if self.dsn == "ODBC": 393 | with udaExec.connect(self.dsn, username=self.username, 394 | password=self.password) as conn: 395 | self.assertIsNotNone(conn) 396 | for r in conn.execute( 397 | """REPLACE PROCEDURE testProcedure1 398 | (INOUT p1 VARBYTE(128), OUT p2 VARBYTE(128), 399 | INOUT p3 FLOAT, OUT p4 FLOAT, OUT p5 TIMESTAMP) 400 | BEGIN 401 | SET p2 = p1; 402 | SET p4 = p3; 403 | SET p5 = CURRENT_TIMESTAMP; 404 | END;"""): 405 | logger.info(r) 406 | result = conn.callproc( 407 | "testProcedure1", 408 | (teradata.InOutParam(bytearray([0xFF]), "p1"), 409 | teradata.OutParam("p2"), 410 | teradata.InOutParam(float("inf"), "p3"), 411 | teradata.OutParam("p4", dataType="FLOAT"), 412 | teradata.OutParam("p5", dataType="TIMESTAMP"))) 413 | self.assertEqual(result.p1, bytearray([0xFF])) 414 | self.assertEqual(result.p2, result.p1) 415 | self.assertEqual(result.p3, float('inf')) 416 | self.assertEqual(result.p4, result.p3) 417 | 418 | def testProcedureWithResultSet(self): 419 | if self.dsn == "ODBC": 420 | with 
udaExec.connect(self.dsn, username=self.username, 421 | password=self.password) as conn: 422 | self.assertIsNotNone(conn) 423 | for r in conn.execute( 424 | """REPLACE PROCEDURE testProcedureWithResultSet() 425 | DYNAMIC RESULT SETS 1 426 | BEGIN 427 | DECLARE QUERY1 VARCHAR(22000); 428 | DECLARE dyna_set1 CURSOR WITH RETURN TO CALLER FOR STMT1; 429 | SET QUERY1 = 'select * from dbc.dbcinfo'; 430 | PREPARE STMT1 FROM QUERY1; 431 | OPEN dyna_set1; 432 | DEALLOCATE PREPARE STMT1; 433 | END;"""): 434 | logger.info(r) 435 | with conn.cursor() as cursor: 436 | cursor.callproc("testProcedureWithResultSet", ()) 437 | self.assertEqual(len(cursor.fetchall()), 3) 438 | 439 | def testQueryTimeout(self): 440 | with self.assertRaises(teradata.DatabaseError) as cm: 441 | with udaExec.connect(self.dsn, username=self.username, 442 | password=self.password) as conn: 443 | conn.execute( 444 | "CREATE TABLE testQueryTimeout (id INT, " 445 | "name VARCHAR(128), dob TIMESTAMP)") 446 | conn.executemany( 447 | "INSERT INTO testQueryTimeout VALUES (?, ?, " 448 | "CURRENT_TIMESTAMP)", 449 | [(x, str(x)) for x in range(0, 10000)], 450 | batch=True) 451 | conn.execute( 452 | "SELECT * FROM testQueryTimeout t1, testQueryTimeout t2", 453 | queryTimeout=1) 454 | self.assertIn("timeout", cm.exception.msg) 455 | 456 | def testNewlinesInQuery(self): 457 | with udaExec.connect(self.dsn, username=self.username, 458 | password=self.password, 459 | transactionMode="ANSI") as conn: 460 | with self.assertRaises(teradata.DatabaseError) as cm: 461 | conn.execute( 462 | """--THIS SQL STATMENT HAS A SYNATAX ERROR 463 | SELECT * FROM ThereIsNoWayThisTableExists""") 464 | self.assertEqual(3807, cm.exception.code) 465 | row = conn.execute("""--THIS SQL STATMENT HAS CORRECT SYNTAX 466 | SELECT 467 | 'Line\nFeed' 468 | AS 469 | linefeed""").fetchone() 470 | self.assertEqual(row.linefeed, 'Line\nFeed') 471 | 472 | def testUnicode(self): 473 | insertCount = 1000 474 | unicodeString = u"\u4EC5\u6062\u590D\u914D\u7F6E\u3002\u73B0" 475 | "\u6709\u7684\u5386\u53F2\u76D1\u63A7\u6570\u636E\u5C06" 476 | "\u4FDD\u7559\uFF0C\u4E0D\u4F1A\u4ECE\u5907\u4EFD\u4E2D" 477 | "\u6062\u590D\u3002" 478 | with udaExec.connect(self.dsn, username=self.username, 479 | password=self.password) as conn: 480 | self.assertEqual(conn.execute( 481 | u"SELECT '{}'".format(unicodeString)).fetchone()[0], 482 | unicodeString) 483 | conn.execute( 484 | "CREATE TABLE testUnicode (id INT, name VARCHAR(10000) " 485 | "CHARACTER SET UNICODE)") 486 | conn.executemany("INSERT INTO testUnicode VALUES (?, ?)", [ 487 | (x, unicodeString) 488 | for x in range(0, insertCount)], 489 | batch=True) 490 | conn.executemany("INSERT INTO testUnicode VALUES (?, ?)", [ 491 | (x + insertCount, unicodeString * 100) 492 | for x in range(0, 10)], 493 | batch=False) 494 | count = 0 495 | for row in conn.execute("SELECT * FROM testUnicode"): 496 | if row.id >= insertCount: 497 | self.assertEqual(row.name, unicodeString * 100) 498 | else: 499 | self.assertEqual(row.name, unicodeString) 500 | count += 1 501 | self.assertEqual(count, insertCount + 10) 502 | 503 | def testExecuteWhileIterating(self): 504 | insertCount = 100 505 | with udaExec.connect(self.dsn, username=self.username, 506 | password=self.password) as conn: 507 | conn.execute( 508 | "CREATE TABLE testExecuteWhileIterating (id INT, " 509 | "name VARCHAR(128))") 510 | conn.executemany( 511 | "INSERT INTO testExecuteWhileIterating VALUES (?, ?)", 512 | [(x, str(x)) for x in range(0, insertCount)], batch=True) 513 | count = 0 514 | 
self.assertEqual( 515 | conn.execute( 516 | "SELECT COUNT(*) FROM testExecuteWhileIterating" 517 | ).fetchone()[0], insertCount) 518 | for row in conn.cursor().execute( 519 | "SELECT * FROM testExecuteWhileIterating"): 520 | conn.execute( 521 | "DELETE FROM testExecuteWhileIterating WHERE id = ?", 522 | (row.id, )) 523 | count += 1 524 | self.assertEqual(count, insertCount) 525 | self.assertEqual(conn.execute( 526 | "SELECT COUNT(*) FROM testExecuteWhileIterating" 527 | ).fetchone()[0], 0) 528 | 529 | def testUdaExecMultipleThreads(self): 530 | threadCount = 5 531 | threads = [] 532 | for i in range(0, threadCount): 533 | t = threading.Thread( 534 | target=connectAndExecuteSelect, args=(self, i)) 535 | t.daemon = True 536 | threads.append(t) 537 | t.start() 538 | for t in threads: 539 | t.join() 540 | if self.failure: 541 | raise self.failure 542 | 543 | def testConnectionMultipleThreads(self): 544 | threadCount = 5 545 | threads = [] 546 | with udaExec.connect(self.dsn, username=self.username, 547 | password=self.password) as conn: 548 | for i in range(0, threadCount): 549 | t = threading.Thread( 550 | target=cursorAndExecuteSelect, args=(self, conn, i)) 551 | t.daemon = True 552 | threads.append(t) 553 | t.start() 554 | for t in threads: 555 | t.join() 556 | if self.failure: 557 | raise self.failure 558 | 559 | def testAutoGeneratedKeys(self): 560 | # Auto-generated keys are not supported by REST. 561 | if self.dsn == "ODBC": 562 | rowCount = 1 563 | with udaExec.connect(self.dsn, username=self.username, 564 | password=self.password, 565 | ReturnGeneratedKeys="C") as conn: 566 | conn.execute( 567 | "CREATE TABLE testAutoGeneratedKeys (id INTEGER " 568 | "GENERATED BY DEFAULT AS IDENTITY, name VARCHAR(128))") 569 | count = 0 570 | for row in conn.executemany( 571 | "INSERT INTO testAutoGeneratedKeys VALUES (NULL, ?)", 572 | [(str(x), ) for x in range(0, rowCount)]): 573 | count += 1 574 | print(row) 575 | self.assertEqual(row[0], count) 576 | # Potential ODBC bug is preventing this test case from 577 | # passing, e-mail sent to ODBC support team. 
578 | # self.assertEqual(count, rowCount) 579 | 580 | def testEmptyResultSet(self): 581 | with udaExec.connect(self.dsn, username=self.username, 582 | password=self.password) as conn: 583 | conn.execute( 584 | "CREATE TABLE testEmptyResultSet (id INTEGER, " 585 | "name VARCHAR(128))") 586 | count = 0 587 | with conn.cursor() as cursor: 588 | for row in cursor.execute("SELECT * FROM testEmptyResultSet"): 589 | count += 1 590 | print(row) 591 | self.assertEqual(count, 0) 592 | 593 | def testFetchArraySize1000(self): 594 | rows = 5000 595 | randomset = [] 596 | for j in range(rows): 597 | rowset = [j, ] 598 | rowset.append(int(random.random() * 100000)) 599 | rowset.append(int(random.random() * 100000)) 600 | rowset.append(int(random.random() * 100000)) 601 | rowset.append(str(random.random() * 100000)) 602 | rowset.append(str(random.random() * 100000)) 603 | rowset.append(str(random.random() * 100000)) 604 | randomset.append(rowset) 605 | 606 | createtablestatement = """ 607 | CREATE MULTISET TABLE testFetchArraySize1000 608 | ( 609 | id INTEGER, 610 | randint1 INTEGER, 611 | randint2 INTEGER, 612 | randint3 INTEGER, 613 | randchar1 VARCHAR(20), 614 | randchar2 VARCHAR(20), 615 | randchar3 VARCHAR(20) 616 | ) 617 | NO PRIMARY INDEX; 618 | """ 619 | with udaExec.connect(self.dsn, username=self.username, 620 | password=self.password) as session: 621 | cursor = session.execute(createtablestatement) 622 | cursor.arraysize = 1000 623 | index = 0 624 | while index < 20: 625 | session.executemany("""INSERT INTO testFetchArraySize1000 626 | VALUES (?, ?, ?, ?, ?, ?, ?)""", 627 | randomset, 628 | batch=True) 629 | for y in randomset: 630 | y[0] = y[0] + len(randomset) 631 | index += 1 632 | fetchRows(self, 100, randomset, session) 633 | fetchRows(self, 1000, randomset, session) 634 | fetchRows(self, 10000, randomset, session) 635 | fetchRows(self, 100000, randomset, session) 636 | 637 | def testDollarSignInPassword(self): 638 | with udaExec.connect(self.dsn) as session: 639 | session.execute("DROP USER testDollarSignInPassword", 640 | ignoreErrors=[3802]) 641 | util.setupTestUser(udaExec, self.dsn, user='testDollarSignInPassword', 642 | passwd='pa$$$$word') 643 | with udaExec.connect(self.dsn, username='testDollarSignInPassword', 644 | password='pa$$$$word') as session: 645 | session.execute("SELECT * FROM DBC.DBCINFO") 646 | 647 | def testOperationsOnClosedCursor(self): 648 | if self.dsn == "ODBC": 649 | with udaExec.connect(self.dsn) as session: 650 | cursor = session.cursor() 651 | cursor.close() 652 | error = None 653 | try: 654 | cursor.execute("SELECT * FROM DBC.DBCINFO") 655 | except teradata.InterfaceError as e: 656 | error = e 657 | self.assertIsNotNone(error) 658 | 659 | def testIgnoreError(self): 660 | with udaExec.connect(self.dsn) as session: 661 | cursor = session.execute("DROP DATABASE ThisDatabaseDoesNotExist", 662 | ignoreErrors=(3802,)) 663 | self.assertIsNotNone(cursor.error) 664 | 665 | def testMultipleResultSets(self): 666 | with udaExec.connect(self.dsn) as session: 667 | cursor = session.execute("""SELECT 'string' as \"string\"; 668 | SELECT 1 as \"integer\"""") 669 | self.assertEqual(cursor.description[0][0], 'string') 670 | self.assertTrue(cursor.nextset()) 671 | self.assertEqual(cursor.description[0][0], 'integer') 672 | 673 | 674 | def fetchRows(test, count, randomset, session): 675 | result = session.execute( 676 | """select * from testFetchArraySize1000 WHERE id < %s 677 | ORDER BY id""" % count) 678 | t0 = time.time() 679 | rowIndex = 0 680 | for r in result: 681 | 
colIndex = 0 682 | for col in r: 683 | if colIndex != 0: 684 | test.assertEqual( 685 | col, randomset[rowIndex % len(randomset)][colIndex]) 686 | colIndex += 1 687 | rowIndex += 1 688 | print("fetch over sample %s records: %s seconds " % 689 | (count, time.time() - t0)) 690 | 691 | 692 | def connectAndExecuteSelect(testCase, threadId): 693 | try: 694 | with udaExec.connect(testCase.dsn, username=testCase.username, 695 | password=testCase.password) as session: 696 | for row in session.execute("SELECT * FROM DBC.DBCInfo"): 697 | logger.info(str(threadId) + ": " + str(row)) 698 | except Exception as e: 699 | testCase.failure = e 700 | 701 | 702 | def cursorAndExecuteSelect(testCase, session, threadId): 703 | try: 704 | with session.cursor() as cursor: 705 | for row in cursor.execute("SELECT * FROM DBC.DBCInfo"): 706 | logger.info(str(threadId) + ": " + str(row)) 707 | except Exception as e: 708 | testCase.failure = e 709 | 710 | 711 | # The unit tests in the UdaExecExecuteTest are execute once for each named 712 | # data source below. 713 | util.createTestCasePerDSN( 714 | UdaExecExecuteTest, unittest.TestCase, ("HTTP", "HTTPS", "ODBC")) 715 | 716 | if __name__ == '__main__': 717 | formatter = logging.Formatter( 718 | "%(asctime)s - %(name)s - %(levelname)s - %(message)s") 719 | sh = logging.StreamHandler(sys.stdout) 720 | sh.setFormatter(formatter) 721 | sh.setLevel(logging.INFO) 722 | root = logging.getLogger() 723 | root.setLevel(logging.INFO) 724 | root.addHandler(sh) 725 | 726 | configFiles = [os.path.join(os.path.dirname(__file__), 'udaexec.ini')] 727 | udaExec = teradata.UdaExec(configFiles=configFiles, configureLogging=False) 728 | udaExec.checkpoint() 729 | 730 | 731 | def runTest(testName): 732 | suite = unittest.TestSuite() 733 | suite.addTest(UdaExecExecuteTest_ODBC(testName)) # @UndefinedVariable # noqa 734 | suite.addTest(UdaExecExecuteTest_HTTP(testName)) # @UndefinedVariable # noqa 735 | unittest.TextTestRunner().run(suite) 736 | 737 | if __name__ == '__main__': 738 | # runTest('testMultipleResultSets') 739 | unittest.main() 740 | -------------------------------------------------------------------------------- /test/testlargeview.sql: -------------------------------------------------------------------------------- 1 | REPLACE VIEW LARGE_TEST_VIEW AS SELECT 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 00100' 2 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 00200' 3 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 00300' 4 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 00400' 5 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 00500' 6 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 00600' 7 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 00700' 8 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 00800' 9 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 00900' 10 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 01000' 11 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 01100' 12 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 01200' 13 | || 
'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 01300' 14 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 01400' 15 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 01500' 16 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 01600' 17 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 01700' 18 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 01800' 19 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 01900' 20 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 02000' 21 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 02100' 22 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 02200' 23 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 02300' 24 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 02400' 25 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 02500' 26 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 02600' 27 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 02700' 28 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 02800' 29 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 02900' 30 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 03000' 31 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 03100' 32 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 03200' 33 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 03300' 34 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 03400' 35 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 03500' 36 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 03600' 37 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 03700' 38 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 03800' 39 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 03900' 40 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 04000' 41 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 04100' 42 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 04200' 43 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 04300' 44 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 04400' 45 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 04500' 46 | || 
'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 04600' 47 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 04700' 48 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 04800' 49 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 04900' 50 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 05000' 51 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 05100' 52 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 05200' 53 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 05300' 54 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 05400' 55 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 05500' 56 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 05600' 57 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 05700' 58 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 05800' 59 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 05900' 60 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 06000' 61 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 06100' 62 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 06200' 63 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 06300' 64 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 06400' 65 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 06500' 66 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 06600' 67 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 06700' 68 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 06800' 69 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 06900' 70 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 07000' 71 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 07100' 72 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 07200' 73 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 07300' 74 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 07400' 75 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 07500' 76 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 07600' 77 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 07700' 78 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 07800' 79 | || 
'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 07900' 80 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 08000' 81 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 08100' 82 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 08200' 83 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 08300' 84 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 08400' 85 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 08500' 86 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 08600' 87 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 08700' 88 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 08800' 89 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 08900' 90 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 09000' 91 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 09100' 92 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 09200' 93 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 09300' 94 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 09400' 95 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 09500' 96 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 09600' 97 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 09700' 98 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 09800' 99 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 09900' 100 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 10000' 101 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 10100' 102 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 10200' 103 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 10300' 104 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 10400' 105 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 10500' 106 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 10600' 107 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 10700' 108 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 10800' 109 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 10900' 110 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 11000' 111 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 11100' 112 | || 
'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 11200' 113 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 11300' 114 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 11400' 115 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 11500' 116 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 11600' 117 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 11700' 118 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 11800' 119 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 11900' 120 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 12000' 121 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 12100' 122 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 12200' 123 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 12300' 124 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 12400' 125 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 12500' 126 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 12600' 127 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 12700' 128 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 12800' 129 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 12900' 130 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 13000' 131 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 13100' 132 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 13200' 133 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 13300' 134 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 13400' 135 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 13500' 136 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 13600' 137 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 13700' 138 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 13800' 139 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 13900' 140 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 14000' 141 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 14100' 142 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 14200' 143 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 14300' 144 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 14400' 145 | || 
'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 14500' 146 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 14600' 147 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 14700' 148 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 14800' 149 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 14900' 150 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 15000' 151 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 15100' 152 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 15200' 153 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 15300' 154 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 15400' 155 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 15500' 156 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 15600' 157 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 15700' 158 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 15800' 159 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 15900' 160 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 16000' 161 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 16100' 162 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 16200' 163 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 16300' 164 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 16400' 165 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 16500' 166 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 16600' 167 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 16700' 168 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 16800' 169 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 16900' 170 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 17000' 171 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 17100' 172 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 17200' 173 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 17300' 174 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 17400' 175 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 17500' 176 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 17600' 177 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 17700' 178 | || 
'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 17800' 179 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 17900' 180 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 18000' 181 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 18100' 182 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 18200' 183 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 18300' 184 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 18400' 185 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 18500' 186 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 18600' 187 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 18700' 188 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 18800' 189 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 18900' 190 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 19000' 191 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 19100' 192 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 19200' 193 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 19300' 194 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 19400' 195 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 19500' 196 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 19600' 197 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 19700' 198 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 19800' 199 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 19900' 200 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 20000' 201 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 20100' 202 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 20200' 203 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 20300' 204 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 20400' 205 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 20500' 206 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 20600' 207 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 20700' 208 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 20800' 209 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 20900' 210 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 21000' 211 | || 
'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 21100' 212 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 21200' 213 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 21300' 214 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 21400' 215 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 21500' 216 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 21600' 217 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 21700' 218 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 21800' 219 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 21900' 220 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 22000' 221 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 22100' 222 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 22200' 223 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 22300' 224 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 22400' 225 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 22500' 226 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 22600' 227 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 22700' 228 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 22800' 229 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 22900' 230 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 23000' 231 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 23100' 232 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 23200' 233 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 23300' 234 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 23400' 235 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 23500' 236 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 23600' 237 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 23700' 238 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 23800' 239 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 23900' 240 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 24000' 241 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 24100' 242 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 24200' 243 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 24300' 244 | || 
'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 24400' 245 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 24500' 246 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 24600' 247 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 24700' 248 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 24800' 249 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 24900' 250 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 25000' 251 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 25100' 252 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 25200' 253 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 25300' 254 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 25400' 255 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 25500' 256 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 25600' 257 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 25700' 258 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 25800' 259 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 25900' 260 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 26000' 261 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 26100' 262 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 26200' 263 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 26300' 264 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 26400' 265 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 26500' 266 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 26600' 267 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 26700' 268 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 26800' 269 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 26900' 270 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 27000' 271 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 27100' 272 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 27200' 273 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 27300' 274 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 27400' 275 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 27500' 276 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 27600' 277 | || 
'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 27700' 278 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 27800' 279 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 27900' 280 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 28000' 281 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 28100' 282 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 28200' 283 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 28300' 284 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 28400' 285 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 28500' 286 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 28600' 287 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 28700' 288 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 28800' 289 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 28900' 290 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 29000' 291 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 29100' 292 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 29200' 293 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 29300' 294 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 29400' 295 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 29500' 296 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 29600' 297 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 29700' 298 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 29800' 299 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 29900' 300 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 30000' 301 | || 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX' AS total_chars_31000 302 | 303 | -------------------------------------------------------------------------------- /test/udaexec.ini: -------------------------------------------------------------------------------- 1 | [CONFIG] 2 | appName=PyTdUnitTestsの 3 | version=1.00.00.01 4 | dsn=TEST3 5 | key1=file1 6 | key2=file1 7 | escapeTest=this$$isatest 8 | httpsPort=1443 9 | dbcInfo=DBC.DBCInfo 10 | port=${httpsPort} 11 | testSystem=sdt00250 12 | 13 | [DEFAULT] 14 | system=${testSystem} 15 | host=sdlc4157.labs.teradata.com 16 | username=dbc 17 | password=dbc 18 | charset=UTF8 19 | 20 | [HTTP] 21 | method=rest 22 | 23 | [HTTPS] 24 | method=rest 25 | protocol=https 26 | verifyCerts=False 27 | 28 | [ODBC] 29 | method=odbc 30 | system=${testSystem}.labs.teradata.com 31 | 32 | 33 | [ESCAPE_TEST] 34 | method=odbc 35 | system=${testSystem}.labs.teradata.com 36 | password=pa$$$$word 37 | 
escapeTest2=${escapeTest} 38 | -------------------------------------------------------------------------------- /test/udaexec2.ini: -------------------------------------------------------------------------------- 1 | [CONFIG] 2 | key2=file2 3 | key3=file2 4 | key4=${key1} 5 | key5=${key4} 6 | --------------------------------------------------------------------------------
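Usage sketch (not part of the repository): the two ini files above exercise the UdaExec configuration engine, including ${...} interpolation (port=${httpsPort}), dollar-sign escaping ($$ resolving to a single $), and key overrides across multiple config files (key2 is defined in both udaexec.ini and udaexec2.ini). The minimal Python sketch below shows how such files might be consumed. It assumes the documented UdaExec constructor parameters systemConfigFile and appConfigFile, that the app-level file takes precedence over the system-level one, and that connect() accepts the name of a config section (here "ODBC") as an external data source name; the appName "ConfigDemo" is hypothetical. Treat this as illustrative rather than a verified part of the test suite.

    import teradata

    # Load the test configuration files; the app-level file is assumed to
    # override the system-level one, so key2 should resolve to "file2",
    # while ${...} references such as port=${httpsPort} are expanded on load.
    udaExec = teradata.UdaExec(
        appName="ConfigDemo",                  # hypothetical name for this sketch
        version="1.0",
        systemConfigFile="test/udaexec.ini",   # assumed parameter names, per the
        appConfigFile="test/udaexec2.ini",     # module's documented config options
        logConsole=False)

    # "ODBC" names a section in test/udaexec.ini; its method/system values,
    # plus the [DEFAULT] username/password/charset fallbacks, drive the
    # connection. DBC.DBCInfo is the view referenced by dbcInfo in [CONFIG].
    with udaExec.connect("ODBC") as session:
        for row in session.execute("SELECT InfoKey, InfoData FROM DBC.DBCInfo"):
            print(row)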