├── .gitignore
├── README.md
├── bin
├── .dnsserver.py.swp
├── checkconfig.py
├── dnsserver.py
├── dnsserver.pyc
├── ippool.py
├── ippool.pyc
├── logger.py
├── logger.pyc
└── sdns.py
├── conf
├── .a.yaml.swp
├── a.yaml
├── ns.yaml
├── sdns.pid
├── sdns.yaml
└── soa.yaml
├── data
└── ip.csv
├── lib
└── yaml
│ ├── __init__.py
│ ├── __init__.pyc
│ ├── composer.py
│ ├── composer.pyc
│ ├── constructor.py
│ ├── constructor.pyc
│ ├── cyaml.py
│ ├── cyaml.pyc
│ ├── dumper.py
│ ├── dumper.pyc
│ ├── emitter.py
│ ├── emitter.pyc
│ ├── error.py
│ ├── error.pyc
│ ├── events.py
│ ├── events.pyc
│ ├── loader.py
│ ├── loader.pyc
│ ├── nodes.py
│ ├── nodes.pyc
│ ├── parser.py
│ ├── parser.pyc
│ ├── reader.py
│ ├── reader.pyc
│ ├── representer.py
│ ├── representer.pyc
│ ├── resolver.py
│ ├── resolver.pyc
│ ├── scanner.py
│ ├── scanner.pyc
│ ├── serializer.py
│ ├── serializer.pyc
│ ├── tokens.py
│ └── tokens.pyc
├── pkg
├── Twisted-12.2.0.tar.bz2
├── virtualenv-1.9.1.tar.gz
└── zope.interface-4.0.1.tar.gz
└── script
├── dns_in_twisted.py
├── install_smartdns.sh
├── run_dns.sh
└── twistd
/.gitignore:
--------------------------------------------------------------------------------
1 | log/*
2 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | smartdns
2 | ========
3 | ## Use Cases
4 | ##### smartdns is a DNS server written in Python on top of the Twisted framework. It can return different answers to the same DNS query according to configuration: it takes the source IP of the request, or the client IP for requests carrying the edns-client-subnet option, looks up that IP's attributes (country, province, city, ISP, etc.) in a local static IP database, and then returns the answer chosen by the scheduling configuration.
5 | ##### Typical use cases for smartdns:
6 | 1. Multi-datacenter traffic scheduling, e.g. sending China Telecom traffic to the Telecom datacenter and China Unicom traffic to the Unicom datacenter;
7 | 2. Access steering, directing each user to the node closest to them or with the best link quality.
8 |
9 | ##### A simple example: a site test.test.com is deployed in both a Telecom and a Unicom datacenter, with IP 1.1.1.1 in the Telecom datacenter and 2.2.2.2 in the Unicom one. smartdns can answer 1.1.1.1 when the source IP is a Telecom address and 2.2.2.2 when it is a Unicom address, thus scheduling traffic between the two carriers' datacenters.
10 |
11 | ## Features
12 | Supports A, SOA, and NS record queries, as well as DNS forwarding.
13 |
14 | ## Performance
15 | On a 2.4 GHz virtual-machine CPU smartdns handles about 1,000 QPS, dropping to roughly 800 QPS with debug logging enabled. A cluster of 3-5 DNS servers is sufficient for most sites.
16 |
17 | We are currently implementing and small-scale testing a Go rewrite of smartdns that exceeds 30,000 QPS; it will be open-sourced once it has proven stable. Stay tuned :)
18 |
19 | ## How It Works
20 |
21 | smartdns handles a DNS request as follows:
22 |
23 | 
24 |
25 | The two most important pieces of smartdns are the initialization of the IPPool class and the FindIP method that class uses to resolve queries; both are detailed below. Other aspects, such as subclassing Twisted's DNS classes to override request handling and patching Twisted to parse and answer edns requests, can be learned from the code. For edns background, see: DNS support edns-client-subnet
26 |
27 | #### IPPool initialization
28 |
29 | 
30 |
31 | The format of ip.csv:
32 | ``200000001,200000010,中国,陕西,西安,电信``
33 |
34 | The fields are ``range start IP, range end IP, country, province, city, ISP``.
35 |
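As a sanity check, the sample row above can be matched against a client address once the dotted quad is converted to the same integer form; the `ip2long` below mirrors the helper of the same name in bin/ippool.py (the client IP is made up so that it falls inside the sample range):

```python
def ip2long(ip):
    """Dotted-quad string -> the integer form stored in ip.csv."""
    a, b, c, d = (int(x) for x in ip.split('.'))
    return (a << 24) | (b << 16) | (c << 8) | d

# the sample row: range start, range end, country, province, city, ISP
row = "200000001,200000010,中国,陕西,西安,电信".split(',')
start, end = int(row[0]), int(row[1])

client = ip2long("11.235.194.5")   # == 200000005
print(start <= client <= end)      # -> True
```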
36 | a.yaml configuration format:
37 | test.test.com:
38 |     ttl: 3600
39 |     default: 5.5.5.5 2.2.2.2
40 |     中国,广东,,联通: 1.1.1.1 3.3.3.1
41 |     中国,广东,,电信: 1.1.1.2 3.3.3.2
42 |
43 | The location key in the configuration consists of four fields, each carrying a different weight:
44 | - country: 8
45 | - province: 4
46 | - city: 2
47 | - ISP: 1
48 |
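A rule's total weight is the sum of the weights of its non-empty fields, which is exactly how IPPool.LoadRecord scores it, so a more specific rule always outranks a broader one:

```python
# country / province / city / ISP contribute 8 / 4 / 2 / 1 when non-empty
WEIGHTS = [8, 4, 2, 1]

def key_weight(key):
    """Total weight of a location key such as '中国,广东,,联通'."""
    fields = key.split(',')
    return sum(w for field, w in zip(fields, WEIGHTS) if field != '')

print(key_weight('中国,广东,,联通'))  # 8 + 4 + 1 -> 13
print(key_weight('中国,,,'))          # country-only rule -> 8
```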
49 | During initialization a dict named iphash is built, with the structure shown below:
50 |
51 | 
52 |
53 | The keys of iphash are the start IPs of the ip.csv records. Each value is a list of length 7: element 0 repeats the start IP; elements 1 through 5 hold the range's end IP, country, province, city, and ISP; element 6 is a dict whose keys are the domains configured in a.yaml and whose values are lists of length 2, where iphash[range start][6][domain][0] is the best answer for that domain within the IP range and iphash[range start][6][domain][1] is that answer's total weight (currently informational only).
54 |
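Concretely, for the sample ip.csv row and the a.yaml above, the best match for a (中国, 陕西, 西安, 电信) range is the country-wide rule ``中国,,,`` with weight 8, so one entry looks roughly like this (values illustrative):

```python
iphash = {
    200000001: [        # keyed by the range's start IP
        200000001,      # [0] range start
        200000010,      # [1] range end
        '中国',          # [2] country
        '陕西',          # [3] province
        '西安',          # [4] city
        '电信',          # [5] ISP
        {               # [6] best precomputed answer per domain
            'test.test.com': ['1.1.1.9 3.3.3.9', 8],  # [answer, total weight]
        },
    ],
}
print(iphash[200000001][6]['test.test.com'])  # ['1.1.1.9 3.3.3.9', 8]
```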
55 | The crux of building iphash is computing the best answer for iphash[range start][6][domain]. The straightforward approach is to scan every scheduling rule of each domain and pick the matching rule with the highest total weight; building the whole iphash that way costs O(xyz), where x is the number of ip.csv records, y the number of domains, and z the number of rules per domain. To speed up startup, each domain's rules are first compiled into a tree simulated with nested dicts, so finding the best answer no longer requires scanning every rule: at most 15 probes suffice, i.e. O(15xy) overall. See the LoadRecord and JoinIP methods of IPPool for the implementation.
56 |
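A condensed sketch of that probe, using a two-rule subset of the sample a.yaml (the real tree is built by LoadRecord and walked by JoinIP): at each of the four levels both the exact field value and the wildcard '' are tried, which is what bounds the search at the 15 probes mentioned above.

```python
# nested dicts: country -> province -> city -> ISP -> (answer, weight)
tree = {
    '中国': {
        '广东': {'': {'联通': ('1.1.1.1 3.3.3.1', 13)}},
        '':     {'': {'':    ('1.1.1.9 3.3.3.9', 8)}},
    },
}

def best(tree, country, province, city, isp):
    """Highest-weight rule matching the location; (None, -1) if none."""
    top = (None, -1)
    for l1 in {x for x in ('', country) if x in tree}:
        for l2 in {x for x in ('', province) if x in tree[l1]}:
            for l3 in {x for x in ('', city) if x in tree[l1][l2]}:
                for l4 in {x for x in ('', isp) if x in tree[l1][l2][l3]}:
                    cand = tree[l1][l2][l3][l4]
                    if cand[1] > top[1]:
                        top = cand
    return top

print(best(tree, '中国', '广东', '', '联通'))  # ('1.1.1.1 3.3.3.1', 13)
print(best(tree, '中国', '湖南', '', '电信'))  # ('1.1.1.9 3.3.3.9', 8)
```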
57 | With iphash in place, handling a request only requires locating the IP range that contains the source IP, taking that range's start IP, and fetching the precomputed best answer from iphash; the fetch itself is O(1). The flow:
58 |
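Locating the range is a binary search over the sorted list of range start IPs, as in IPPool.SearchLocation; the ranges below are made up for illustration:

```python
import bisect

iplist = [200000001, 200000011, 200000021]      # sorted range start IPs
ipend = {200000001: 200000010,                  # start -> end, as kept in iphash
         200000011: 200000020,
         200000021: 200000030}

def find_range(ipnum):
    """Return the start IP of the range containing ipnum, else None."""
    i = iplist[bisect.bisect_right(iplist, ipnum) - 1]
    # None means no range covers the IP; the server then serves the
    # domain's 'default' answer
    return i if i <= ipnum <= ipend[i] else None

print(find_range(200000015))  # 200000011
print(find_range(200000099))  # None (outside every range)
```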
59 | 
60 |
61 | ## Code
62 |
63 | github: https://github.com/xiaomi-sa/smartdns
64 |
65 | ## Installation
66 |
67 | Dependencies:
68 |
69 | python 2.6 or 2.7
70 | Twisted 12.2.0
71 | zope.interface 4.0.1
72 |
73 | Install:
74 |
75 | git clone smartdns to a local path, cd into the script directory, and run install_smartdns.sh to install smartdns locally. The Python environment and all dependencies are managed with virtualenv, so the system environment is left untouched.
76 |
77 | Start:
78 |
79 | cd into smartdns's bin directory and run sh run_dns.sh to start smartdns.
80 |
81 | ## Testing
82 |
83 | Local test: dig test.test.com @127.0.0.1
84 |
85 | Alternatively, add the smartdns instance to the test domain's NS records and test through normal resolution.
86 |
87 | ## Support
88 |
89 | mail: fangshaosen@xiaomi.com
90 |
91 | github: jerryfang8
92 |
93 | For EDNS background, see: DNS support edns-client-subnet
94 |
--------------------------------------------------------------------------------
/bin/.dnsserver.py.swp:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/bin/.dnsserver.py.swp
--------------------------------------------------------------------------------
/bin/checkconfig.py:
--------------------------------------------------------------------------------
1 | # -*- coding: utf-8 -*-
2 |
3 | import sys, os
4 | import traceback
5 | from os.path import isfile
6 | sys.path.append('../lib')
7 | import yaml
8 |
9 | def loadconfig(path):
10 | if not isfile(path):
11 | print "[FATAL] can't find config file %s !" % path
12 | exit(1)
13 | f = open(path, 'r')
14 | x = yaml.load(f)
15 | f.close()
16 | return x
17 |
18 | def checkDnsMainConfig(c):
19 | if c['dnsforward_port'] not in range(1, 65536):
20 | print "[FATAL] dnsforward_port out of range."
21 | return False
22 | for ip in c['dnsforward_ip']:
23 | boollist = map(lambda x: -1 ipend:
77 | print "[ERROR] ip starts(%s) bigger than ends(%s)" %(ipstart, ipend)
78 | return False
79 | if 0 == ipstart:
80 | print "[ERROR] ip starts with 0"
81 | return False
82 | if ipstart in iphash:
83 | print "[ERROR] duplicate range start: start(%s),end(%s)" % (ipstart, ipend)
84 | return False
85 | iplist.append(ipstart)
86 | iphash[ipstart] = ipend
87 | iplist.sort()
88 | i = 0
89 | length = len(iplist) - 1
90 | while i < length:
91 | if iphash[iplist[i]] >= iplist[i + 1]:
92 | print "[ERROR] overlapping ip ranges: start(%s),end(%s) and start(%s),end(%s)" % \
93 | ( iplist[i], iphash[iplist[i]], iplist[i+1], iphash[iplist[i+1]])
94 | return False
95 | i += 1
96 | print "[INFO] 'IP.CSV' configuration check succeeded"
97 | return True
98 |
99 | def checkconfig():
100 | #main config
101 | try:
102 | conf = loadconfig('../conf/sdns.yaml')
103 | if not checkDnsMainConfig(conf):
104 | return False
105 |
106 | #dns record config file
107 | Amapping = loadconfig(conf['AFILE'])
108 | NSmapping = loadconfig(conf['NSFILE'])
109 | SOAmapping = loadconfig(conf['SOAFILE'])
110 | if not checkAmapConfig(Amapping) or not checkNSmapConfig(NSmapping) or not checkSOAmapConfig(SOAmapping):
111 | return False
112 | if not checkIPList("../data/ip.csv"):
113 | return False
114 | return True
115 | except:
116 | traceback.print_exc()
117 | return False
118 |
119 | if __name__ == '__main__':
120 | print "[INFO] start check configuration"
121 | if not checkconfig():
122 | print "\x1b[1;31m[FATAL] check configuration failed.\x1b[0m"
123 | sys.exit(1)
124 | print "[INFO] check configuration done, all ok."
125 |
--------------------------------------------------------------------------------
/bin/dnsserver.py:
--------------------------------------------------------------------------------
1 | # -*- coding: utf-8 -*-
2 | import sys, os
3 | import random
4 | import re
5 | import time
6 | import inspect
7 | from logger import logger
8 | sys.path.append('../lib')
9 | from twisted.names import dns, server, client, cache, common, resolve
10 | from twisted.python import failure
11 | from twisted.internet import defer
12 |
13 | typeToMethod = {
14 | dns.A: 'lookupAddress',
15 | dns.AAAA: 'lookupIPV6Address',
16 | dns.A6: 'lookupAddress6',
17 | dns.NS: 'lookupNameservers',
18 | dns.CNAME: 'lookupCanonicalName',
19 | dns.SOA: 'lookupAuthority',
20 | dns.MB: 'lookupMailBox',
21 | dns.MG: 'lookupMailGroup',
22 | dns.MR: 'lookupMailRename',
23 | dns.NULL: 'lookupNull',
24 | dns.WKS: 'lookupWellKnownServices',
25 | dns.PTR: 'lookupPointer',
26 | dns.HINFO: 'lookupHostInfo',
27 | dns.MINFO: 'lookupMailboxInfo',
28 | dns.MX: 'lookupMailExchange',
29 | dns.TXT: 'lookupText',
30 | dns.SPF: 'lookupSenderPolicy',
31 |
32 | dns.RP: 'lookupResponsibility',
33 | dns.AFSDB: 'lookupAFSDatabase',
34 | dns.SRV: 'lookupService',
35 | dns.NAPTR: 'lookupNamingAuthorityPointer',
36 | dns.AXFR: 'lookupZone',
37 | dns.ALL_RECORDS: 'lookupAllRecords',
38 | }
39 |
40 | smartType = ('lookupAddress', 'lookupAuthority')
41 |
42 | class FailureHandler:
43 | def __init__(self, resolver, query, timeout, addr = None, edns = None):
44 | self.resolver = resolver
45 | self.query = query
46 | self.timeout = timeout
47 | self.addr = addr
48 | self.edns = edns
49 |
50 | def __call__(self, failure):
51 | # AuthoritativeDomainErrors should halt resolution attempts
52 | failure.trap(dns.DomainError, defer.TimeoutError, NotImplementedError)
53 | return self.resolver(self.query, self.timeout, self.addr, self.edns)
54 |
55 |
56 | class MapResolver(client.Resolver):
57 | def __init__(self, Finder, Amapping, NSmapping, SOAmapping, servers):
58 | self.Finder = Finder
59 | self.Amapping = Amapping
60 | self.NSmapping = NSmapping
61 | self.SOAmapping = SOAmapping
62 | client.Resolver.__init__(self, servers=servers)
63 |
64 | def query(self, query, timeout = None, addr = None, edns = None):
65 | try:
66 | if typeToMethod[query.type] in smartType:
67 | return self.typeToMethod[query.type](str(query.name), timeout, addr, edns)
68 | else:
69 | return self.typeToMethod[query.type](str(query.name), timeout)
70 | except KeyError, e:
71 | return defer.fail(failure.Failure(NotImplementedError(str(self.__class__) + " " + str(query.type))))
72 |
73 | def lookupAddress(self, name, timeout = None, addr = None, edns = None):
74 | if name in self.Amapping:
75 | ttl = self.Amapping[name]['ttl']
76 | def packResult( value ):
77 | ret = []
78 | add = []
79 | for x in value:
80 | ret.append(dns.RRHeader(name, dns.A, dns.IN, ttl, dns.Record_A(x, ttl), True))
81 |
82 | if edns is not None:
83 | if edns.rdlength > 8:
84 | add.append(dns.RRHeader('', dns.EDNS, 4096, edns.ttl, edns.payload, True))
85 | else:
86 | add.append(dns.RRHeader('', dns.EDNS, 4096, 0, dns.Record_EDNS(None, 0), True))
87 | return [ret, (), add]
88 |
89 | result = self.Finder.FindIP(str(addr[0]), name)
90 | # shuffle the returned IP list
91 | random.shuffle(result)
92 | return packResult(result)
93 | else:
94 | return self._lookup(name, dns.IN, dns.A, timeout)
95 |
96 | def lookupNameservers(self, name, timeout=None):
97 | if name in self.NSmapping:
98 | result = self.NSmapping[name]
99 | ttl = result['ttl']
100 | record = re.split(ur',|\s+', result['record'])
101 | def packResultNS(value):
102 | ret = []
103 | for x in value:
104 | ret.append(dns.RRHeader(name, dns.NS, dns.IN, ttl, dns.Record_NS(x, ttl), True))
105 | return [ret, (), ()]
106 | return packResultNS(record)
107 | else:
108 | return self._lookup(name, dns.IN, dns.NS, timeout)
109 |
110 | def lookupAuthority(self, name, timeout=None, addr = None, edns = None):
111 | if name in self.SOAmapping:
112 | result = self.SOAmapping[name]
113 | add = []
114 | def packResultSOA(value):
115 | if edns is not None:
116 | if edns.rdlength > 8:
117 | add.append(dns.RRHeader('', dns.EDNS, 4096, edns.ttl, edns.payload, True))
118 | else:
119 | add.append(dns.RRHeader('', dns.EDNS, 4096, 0, dns.Record_EDNS(None, 0), True))
120 |
121 | return [(dns.RRHeader(name, dns.SOA, dns.IN, value['ttl'], dns.Record_SOA(value['record'], value['email'], value['serial'], value['refresh'], value['retry'], value['expire'], value['ttl']), True),),
122 | (),
123 | add
124 | ]
125 | ret = packResultSOA(result)
126 | logger.info("SOA\t[domain: %s]\t[return: %s]\t[additional: %s]" % \
127 | (name, result, add))
128 | return ret
129 | else:
130 | return self._lookup(name, dns.IN, dns.SOA, timeout)
131 |
132 | def lookupIPV6Address(self, name, timeout = None, addr = None):
133 | return [(),(),()]
134 |
135 | class SmartResolverChain(resolve.ResolverChain):
136 |
137 | def __init__(self, resolvers):
138 | #resolve.ResolverChain.__init__(self, resolvers)
139 | common.ResolverBase.__init__(self)
140 | self.resolvers = resolvers
141 |
142 | def _lookup(self, name, cls, type, timeout, addr = None, edns = None):
143 | q = dns.Query(name, type, cls)
144 | #d = self.resolvers[0].query(q, timeout)
145 | d = defer.fail(failure.Failure(dns.DomainError(name)))
146 | for r in self.resolvers[1:]:
147 | d = d.addErrback(
148 | FailureHandler(r.query, q, timeout, addr, edns)
149 | )
150 | return d
151 |
152 | def query(self, query, timeout = None, addr = None, edns = None):
153 | try:
154 | if typeToMethod[query.type] in smartType:
155 | return self.typeToMethod[query.type](str(query.name), timeout, addr, edns)
156 | else:
157 | return self.typeToMethod[query.type](str(query.name), timeout)
158 | except KeyError, e:
159 | return defer.fail(failure.Failure(NotImplementedError(str(self.__class__) + " " + str(query.type))))
160 |
161 | def lookupAddress(self, name, timeout = None, addr = None, edns = None):
162 | return self._lookup(name, dns.IN, dns.A, timeout, addr, edns)
163 |
164 | def lookupAuthority(self, name, timeout=None, addr = None, edns = None):
165 | return self._lookup(name, dns.IN, dns.SOA, timeout, addr, edns)
166 |
167 | def lookupIPV6Address(self, name, timeout = None, addr = None, edns = None):
168 | return self._lookup(name, dns.IN, dns.AAAA, timeout, addr, edns)
169 |
170 | def lookupNameservers(self, name, timeout = None, addr = None, edns = None):
171 | return self._lookup(name, dns.IN, dns.NS, timeout, addr, edns)
172 |
173 | class SmartDNSFactory(server.DNSServerFactory):
174 | def handleQuery(self, message, protocol, address):
175 | #if len(message.additional) > 0:
176 | # print inspect.getmembers(message.additional[0]
177 | # only the first query is handled; could be extended to multiple queries
178 | query = message.queries[0]
179 | edns = None
180 | cliAddr = address
181 | if query.type == 43 or typeToMethod[query.type] == 'lookupAllRecords':
182 | return [(),(),()]
183 | if typeToMethod[query.type] in smartType and \
184 | len(message.additional) != 0 and \
185 | message.additional[0].type == 41 \
186 | and message.additional[0].rdlength > 8:
187 | cliAddr = (message.additional[0].payload.dottedQuad(), 0)
188 | edns = message.additional[0]
189 | logger.info("[type: %s]\t[protocol: %s]\t[query: %s]\t[address: %s]\t[dns_server_addr: %s]\t[additional: %s]" % \
190 | (typeToMethod[query.type], type(protocol), query, cliAddr[0], address[0], edns))
191 | return self.resolver.query(query, addr = cliAddr, edns = edns).addCallback(
192 | self.gotResolverResponse, protocol, message, address
193 | ).addErrback(
194 | self.gotResolverError, protocol, message, address
195 | )
196 |
197 | def __init__(self, authorities = None, caches = None, clients = None, verbose = 0):
198 | resolvers = []
199 | if authorities is not None:
200 | resolvers.extend(authorities)
201 | if caches is not None:
202 | resolvers.extend(caches)
203 | if clients is not None:
204 | resolvers.extend(clients)
205 |
206 | self.canRecurse = not not clients
207 | self.resolver = SmartResolverChain(resolvers)
208 | self.verbose = verbose
209 | if caches:
210 | self.cache = caches[-1]
211 | self.connections = []
212 |
213 |
--------------------------------------------------------------------------------
/bin/dnsserver.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/bin/dnsserver.pyc
--------------------------------------------------------------------------------
/bin/ippool.py:
--------------------------------------------------------------------------------
1 | # -*- coding: utf-8 -*-
2 | import sys, os
3 | from logger import logger
4 | import time
5 | import bisect
6 | from os.path import isfile
7 | import socket
8 | import struct
9 | sys.path.append('../lib/')
10 | import yaml, re
11 | reload(sys)
12 | sys.setdefaultencoding('utf8')
13 |
14 |
15 | def ip2long(ip):
16 | "convert decimal dotted quad string to long integer"
17 | hexn = ''.join(["%02X" % long(i) for i in ip.split('.')])
18 | return long(hexn, 16)
19 |
20 | def long2ip(n):
21 | "convert long int to dotted quad string"
22 | d = 256 * 256 * 256
23 | q = []
24 | while d > 0:
25 | m,n = divmod(n,d)
26 | q.append(str(m))
27 | d = d/256
28 | return '.'.join(q)
29 |
30 |
31 | class IPPool:
32 | def __init__(self, ipfile, recordfile):
33 | if not isfile(ipfile):
34 | logger.warning("can't find ip data file: %s" % ipfile)
35 | # deliberately return a value to abort startup (non-None from __init__ raises TypeError)
36 | return 1
37 | self.ipfile = ipfile
38 |
39 | if not isfile(recordfile):
40 | logger.warning("can't find A record file: %s" % recordfile)
41 | return 2
42 | self.recordfile = recordfile
43 |
44 | # iplist: sorted range start IPs, used for binary search
45 | self.iplist = []
46 | # iphash: detailed info keyed by range start IP
47 | self.iphash = {}
48 |
49 | # parsed a.yaml configuration
50 | self.record = {}
51 | # per-domain mapping from location to IPs
52 | self.locmapip = {}
52 | self.locmapip = {}
53 |
54 | #load record data
55 | self.LoadRecord()
56 |
57 | #load ip data
58 | self.LoadIP()
59 |
60 | print 'Init IP pool finished !'
61 |
62 | def LoadIP(self):
63 | f = open(self.ipfile, 'r')
64 | logger.warning("before load: %s" % ( time.time() ) )
65 | for eachline in f:
66 | ipstart, ipend, country, province, city, sp = eachline.strip().split(',')
67 | ipstart = long(ipstart)
68 | ipend = long(ipend)
69 |
70 | # skip records whose start IP is 0
71 | if 0 == ipstart:
72 | continue
73 | self.iplist.append(ipstart)
74 | if ipstart in self.iphash:
75 | #print "maybe has same ipstart"
76 | pass
77 | else:
78 | #ipstart, ipend, country, province, city, sp, domain ip hash
79 | self.iphash[ipstart] = [ipstart, ipend, country, province, city, sp, {}]
80 | # ideally adjacent ranges would be merged before computing
81 | self.JoinIP(ipstart)
82 | f.close()
83 | logger.warning("after load: %s" % ( time.time() ) )
84 | self.iplist.sort()
85 | logger.warning("after sort: %s" % ( time.time() ) )
86 |
87 | # LoadRecord and JoinIP are rewritten here to speed up startup
88 | def LoadRecord(self):
89 | Add = [8, 4, 2, 1]
90 | f = open(self.recordfile, 'r')
91 | self.record = yaml.load(f)
92 | for fqdn in self.record:
93 | self.locmapip[fqdn] = {}
94 | if fqdn.endswith("_template"):
95 | continue
96 |
97 | for router in self.record[fqdn]:
98 | if router == 'default' or router == 'ttl':
99 | continue
100 | p = None
101 | #p = re.match(ur"(.*),(.*),(.*),(.*)", router)
102 | p = str(router.encode('utf-8')).strip().split(',')
103 | if p is None:
104 | logger.warning("maybe record file format error: %s" % self.recordfile)
105 | sys.exit(1)
106 | match = [None, None, None, None]
107 | weight = 0
108 | for num in range(0, 4):
109 | match[num] = p[num]
110 | if p[num] != "":
111 | weight += Add[num]
112 |
113 | if match[0] not in self.locmapip[fqdn]:
114 | self.locmapip[fqdn][match[0]] = {}
115 | self.locmapip[fqdn][match[0]][match[1]] = {}
116 | self.locmapip[fqdn][match[0]][match[1]][match[2]] = {}
117 | self.locmapip[fqdn][match[0]][match[1]][match[2]][match[3]] = \
118 | [ self.record[fqdn][router], weight]
119 | elif match[1] not in self.locmapip[fqdn][match[0]]:
120 | self.locmapip[fqdn][match[0]][match[1]] = {}
121 | self.locmapip[fqdn][match[0]][match[1]][match[2]] = {}
122 | self.locmapip[fqdn][match[0]][match[1]][match[2]][match[3]] = \
123 | [ self.record[fqdn][router], weight]
124 | elif match[2] not in self.locmapip[fqdn][match[0]][match[1]]:
125 | self.locmapip[fqdn][match[0]][match[1]][match[2]] = {}
126 | self.locmapip[fqdn][match[0]][match[1]][match[2]][match[3]] = \
127 | [ self.record[fqdn][router], weight]
128 | elif match[3] not in self.locmapip[fqdn][match[0]][match[1]][match[2]]:
129 | self.locmapip[fqdn][match[0]][match[1]][match[2]][match[3]] = \
130 | [ self.record[fqdn][router], weight]
131 | f.close()
132 | #logger.warning(self.locmapip)
133 |
134 | def JoinIP(self, ip):
135 | for fqdnk, fqdnv in self.locmapip.items():
136 | l1 = []
137 | l2 = []
138 | l3 = []
139 | weight = 0
140 | #logger.warning("l1 : %s, %s" %(self.iphash[ip][2], fqdnv.keys()))
141 | if "" in fqdnv and "" != self.iphash[ip][2]:
142 | l1.append(fqdnv[""])
143 | if self.iphash[ip][2] in fqdnv:
144 | l1.append(fqdnv[self.iphash[ip][2]])
145 | for k in l1:
146 | #logger.warning("l2 : %s, %s" %(self.iphash[ip][3], k.keys()))
147 | if "" in k and "" != self.iphash[ip][3]:
148 | l2.append(k[""])
149 | if self.iphash[ip][3] in k:
150 | l2.append(k[self.iphash[ip][3]])
151 | for k in l2:
152 | #logger.warning("l3 : %s, %s" %(self.iphash[ip][4], k.keys()))
153 | if "" in k and "" != self.iphash[ip][4]:
154 | l3.append(k[""])
155 | if self.iphash[ip][4] in k:
156 | l3.append(k[self.iphash[ip][4]])
157 | for k in l3:
158 | #logger.warning("l4 : %s, %s" %(self.iphash[ip][5], k.keys()))
159 | if "" in k and k[""][1] > weight:
160 | self.iphash[ip][6][fqdnk] = k[""]
161 | weight = k[""][1]
162 | if self.iphash[ip][5] in k and k[self.iphash[ip][5]][1] > weight:
163 | self.iphash[ip][6][fqdnk] = k[self.iphash[ip][5]]
164 | weight = k[self.iphash[ip][5]][1]
165 | if fqdnk not in self.iphash[ip][6]:
166 | self.iphash[ip][6][fqdnk] = [self.record[fqdnk]['default'], 0]
167 |
168 | def ListIP(self):
169 | for key in self.iphash:
170 | print "ipstart: %s ipend: %s country: %s province: %s city: %s sp: %s" % (key, self.iphash[key][1], self.iphash[key][2], self.iphash[key][3], self.iphash[key][4], self.iphash[key][5])
171 | for i in self.iphash[key][6]:
172 | print "[domain:%s ip: %s]" % (i, self.iphash[key][6][i][0])
173 |
174 | def SearchLocation(self, ip):
175 | ipnum = ip2long(ip)
176 | ip_point = bisect.bisect_right(self.iplist, ipnum)
177 | i = self.iplist[ip_point - 1]
178 | if ip_point == len(self.iplist):
179 | j = self.iplist[ip_point - 1]
180 | else:
181 | j = self.iplist[ip_point]
182 |
183 | return i, j, ipnum
184 |
185 | def FindIP(self, ip, name):
186 | i, j, ipnum = self.SearchLocation(ip)
187 |
188 | if i in self.iphash:
189 | ipstart = self.iphash[i][0]
190 | ipend = self.iphash[i][1]
191 | country = self.iphash[i][2]
192 | province = self.iphash[i][3]
193 | city = self.iphash[i][4]
194 | sp = self.iphash[i][5]
195 | if ipstart <= ipnum <= ipend:
196 | ip_list = [ tmp_ip for tmp_ip in re.split(ur',|\s+', self.iphash[i][6][name][0]) if not re.search(ur'[^0-9.]', tmp_ip) ]
197 | logger.info("userip:[%s] domain:[%s] section:[%s-%s] location:[%s,%s,%s,%s] ip_list:%s" % (ip, name, long2ip(ipstart), long2ip(ipend), country, province, city, sp, ip_list ) )
198 | return ip_list
199 | else:
200 | # ip not covered by any range; fall back to the default answer
201 | ip_list = [ tmp_ip for tmp_ip in re.split(ur',|\s+',self.record[name]['default']) if not re.search(ur'[^0-9.]', tmp_ip) ]
202 | logger.warning("userip:[%s] domain:[%s] ip-section:[%s-%s] range:[(%d-%d)-%d-%d] ip_list:%s" % (ip, name, long2ip(ipstart),long2ip(ipend), ipstart, ipend, ipnum, j, ip_list))
203 | return ip_list
204 | else:
205 | #maybe something wrong
206 | ip_list = [ tmp_ip for tmp_ip in re.split(ur',|\s+',self.record[name]['default']) if not re.search(ur'[^0-9.]', tmp_ip) ]
207 | logger.warning("can't find ip in iphash, ip:[%s] domain:[%s] ip_list:%s" % (ip, name, ip_list))
208 | return ip_list
209 |
210 | if __name__ == '__main__':
211 | ipcheck = IPPool('../data/ip.csv', '../conf/a.yaml')
212 |
213 |
--------------------------------------------------------------------------------
/bin/ippool.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/bin/ippool.pyc
--------------------------------------------------------------------------------
/bin/logger.py:
--------------------------------------------------------------------------------
1 | # -*- coding: utf-8 -*-
2 | import logging
3 | import logging.handlers
4 |
5 | def create_logger(filename):
6 | handle = logging.handlers.TimedRotatingFileHandler( filename = filename,
7 | when='h',
8 | interval=1,
9 | backupCount=72 )
10 | #handle.setLevel(logging.INFO)
11 | fmt = logging.Formatter(fmt='%(asctime)s %(filename)s [line:%(lineno)d] [%(levelname)s] %(message)s',
12 | datefmt='%Y-%m-%d %H:%M:%S' )
13 | handle.setFormatter(fmt)
14 | logger = logging.getLogger()
15 | logger.addHandler(handle)
16 | logger.setLevel(logging.DEBUG)
17 | #logger.setLevel(logging.INFO)
18 | #logger.propagate = False
19 | return logger
20 |
21 | logger = None
22 |
23 | class SLogger():
24 | def __init__(self):
25 | pass
26 |
27 | @classmethod
28 | def init_logger(cls, filename):
29 | global logger
30 | if not logger:
31 | logger = create_logger(filename)
32 | return logger
33 |
34 | if __name__ == "__main__":
35 | logger = SLogger.init_logger("../log/test_log.log")
36 | logger.info('start to init IP pool ......')
37 | #test mem leak
38 | while True:
39 | logger.info('xxxxxxxxxxxxxxxxx')
40 |
--------------------------------------------------------------------------------
/bin/logger.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/bin/logger.pyc
--------------------------------------------------------------------------------
/bin/sdns.py:
--------------------------------------------------------------------------------
1 | # -*- coding: utf-8 -*-
2 |
3 | # version
4 | __version__='1.0.1.1'
5 |
6 | import sys, os
7 | import signal
8 | import time
9 | from os.path import isfile
10 | sys.path.append('../lib')
11 | import yaml
12 |
13 | from logger import SLogger
14 | logger = SLogger.init_logger("../log/access_dns.log")
15 |
16 | # setup epoll reactor @PZ
17 | from twisted.internet import epollreactor
18 | epollreactor.install()
19 | from twisted.internet import reactor
20 |
21 | from twisted.names import dns, server, client, cache, common, resolve
22 | from twisted.application import service, internet
23 | from twisted.internet.protocol import DatagramProtocol
24 | from twisted.python import failure
25 | from twisted.internet import defer, interfaces
26 | from zope.interface import implements
27 |
28 | # ippool.py and dnsserver.py live in the same directory as this script
29 |
30 | import ippool, dnsserver
31 |
32 | def loadconfig(path):
33 | if not isfile(path):
34 | print "[FATAL] can't find config file %s !" % path
35 | exit(1)
36 | f = open(path, 'r')
37 | x = yaml.load(f)
38 | f.close()
39 | return x
40 |
41 | def get_local_ip():
42 | import sys,socket,fcntl,array,struct
43 | is_64bits = sys.maxsize > 2**32
44 | struct_size = 40 if is_64bits else 32
45 | s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
46 | max_possible = 8 # initial value
47 | while True:
48 | bytes = max_possible * struct_size
49 | names = array.array('B', '\0' * bytes)
50 | outbytes = struct.unpack('iL', fcntl.ioctl(
51 | s.fileno(),
52 | 0x8912, # SIOCGIFCONF
53 | struct.pack('iL', bytes, names.buffer_info()[0])
54 | ))[0]
55 | if outbytes == bytes:
56 | max_possible *= 2
57 | else:
58 | break
59 | namestr = names.tostring()
60 | return [(namestr[i:i+16].split('\0', 1)[0],
61 | socket.inet_ntoa(namestr[i+20:i+24]))
62 | for i in range(0, outbytes, struct_size)]
63 |
64 | def prepare_run(run_env):
65 | #load main config
66 | logger.info('start to load conf/sdns.yaml ......')
67 | conf = loadconfig('../conf/sdns.yaml')
68 |
69 | #load dns record config file
70 | logger.info('start to init IP pool ......')
71 | Finder = ippool.IPPool(conf['IPDATA'], conf['AFILE'])
72 | run_env['finder'] = Finder
73 |
74 | logger.info('start to load A,SOA,NS record ......')
75 | Amapping = loadconfig(conf['AFILE'])
76 | NSmapping = loadconfig(conf['NSFILE'])
77 | SOAmapping = loadconfig(conf['SOAFILE'])
78 |
79 | # set up a resolver that uses the mapping or a secondary nameserver
80 | dnsforward = []
81 | for i in conf['dnsforward_ip']:
82 | dnsforward.append((i, conf['dnsforward_port']))
83 |
84 | for ifc,ip in get_local_ip():
85 | # create the protocols
86 | SmartResolver = dnsserver.MapResolver(Finder, Amapping, NSmapping, SOAmapping, servers=dnsforward)
87 | f = dnsserver.SmartDNSFactory(caches=[cache.CacheResolver()], clients=[SmartResolver])
88 | p = dns.DNSDatagramProtocol(f)
89 | f.noisy = p.noisy = False
90 | run_env['tcp'].append([f,ip])
91 | run_env['udp'].append([p,ip])
92 |
93 | def run_server(run_env):
94 | for e in run_env['tcp']:
95 | run_env['readers'].append(reactor.listenTCP(53, e[0], interface=e[1]))
96 | for e in run_env['udp']:
97 | run_env['readers'].append(reactor.listenUDP(53, e[0], interface=e[1]))
98 |
99 | # set up the application
100 | application = service.Application('sdns', 0, 0)
101 |
102 | env = {'udp':[], 'tcp':[], 'readers':[], 'closed':0, 'updated': False, 'finder':None}
103 | prepare_run(env)
104 | run_server(env)
105 |
106 | # run it through twistd!
107 | if __name__ == '__main__':
108 | print "Usage: twistd -y %s" % sys.argv[0]
109 |
--------------------------------------------------------------------------------
/conf/.a.yaml.swp:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/conf/.a.yaml.swp
--------------------------------------------------------------------------------
/conf/a.yaml:
--------------------------------------------------------------------------------
1 | ---
2 | test.test.com:
3 | ttl: 3600
4 | default: 5.5.5.5 2.2.2.2
5 | 中国,广东,,联通: 1.1.1.1 3.3.3.1
6 | 中国,广东,,电信: 1.1.1.2 3.3.3.2
7 | 中国,浙江,,联通: 1.1.1.3 3.3.3.3
8 | 中国,湖南,,联通: 1.1.1.4 3.3.3.4
9 | 中国,湖南,,电信: 1.1.1.5 3.3.3.5
10 | 中国,上海,,: 1.1.1.6 3.3.3.6
11 | 中国,北京,,移动: 1.1.1.7 3.3.3.7
12 | 中国,内蒙古,,: 1.1.1.8 3.3.3.8
13 | 中国,,,: 1.1.1.9 3.3.3.9
14 | 美国,,,: 1.1.1.10 3.3.3.10
15 | 马来西亚,,,: 1.1.1.11 3.3.3.11
16 | 土耳其,,,: 1.1.1.12 3.3.3.12
17 |
--------------------------------------------------------------------------------
/conf/ns.yaml:
--------------------------------------------------------------------------------
1 | #Smart DNS NS record config file
2 |
3 | #NS record
4 | test.test.com:
5 | record: ns1.sdns.xiaomidns.com ns2.sdns.xiaomidns.com ns7.sdns.xiaomidns.com ns11.sdns.xiaomidns.com
6 | ttl: 600
7 |
--------------------------------------------------------------------------------
/conf/sdns.pid:
--------------------------------------------------------------------------------
1 | 23930
--------------------------------------------------------------------------------
/conf/sdns.yaml:
--------------------------------------------------------------------------------
1 | #Smart DNS main config file
2 |
3 | #dns forward
4 | # multiple forwarders are supported
5 | dnsforward_ip:
6 | - 8.8.8.8
7 | dnsforward_port: 53
8 |
9 | #ip data path
10 | IPDATA: ../data/ip.csv
11 |
12 | #NS record filed
13 | NSFILE: ../conf/ns.yaml
14 |
15 | #SOA record file
16 | SOAFILE: ../conf/soa.yaml
17 |
18 | #A record file
19 | AFILE: ../conf/a.yaml
20 |
--------------------------------------------------------------------------------
/conf/soa.yaml:
--------------------------------------------------------------------------------
1 | #Smart DNS soa config file
2 |
3 | #SOA record
4 | test.test.com:
5 | record: ns1.test.test.com
6 | email: sa.test.com
7 | serial: 20120627
8 | refresh: 300
9 | retry: 300
10 | expire: 9527
11 | ttl: 3600
12 |
--------------------------------------------------------------------------------
/lib/yaml/__init__.py:
--------------------------------------------------------------------------------
1 |
2 | from error import *
3 |
4 | from tokens import *
5 | from events import *
6 | from nodes import *
7 |
8 | from loader import *
9 | from dumper import *
10 |
11 | __version__ = '3.10'
12 |
13 | try:
14 | from cyaml import *
15 | __with_libyaml__ = True
16 | except ImportError:
17 | __with_libyaml__ = False
18 |
19 | def scan(stream, Loader=Loader):
20 | """
21 | Scan a YAML stream and produce scanning tokens.
22 | """
23 | loader = Loader(stream)
24 | try:
25 | while loader.check_token():
26 | yield loader.get_token()
27 | finally:
28 | loader.dispose()
29 |
30 | def parse(stream, Loader=Loader):
31 | """
32 | Parse a YAML stream and produce parsing events.
33 | """
34 | loader = Loader(stream)
35 | try:
36 | while loader.check_event():
37 | yield loader.get_event()
38 | finally:
39 | loader.dispose()
40 |
41 | def compose(stream, Loader=Loader):
42 | """
43 | Parse the first YAML document in a stream
44 | and produce the corresponding representation tree.
45 | """
46 | loader = Loader(stream)
47 | try:
48 | return loader.get_single_node()
49 | finally:
50 | loader.dispose()
51 |
52 | def compose_all(stream, Loader=Loader):
53 | """
54 | Parse all YAML documents in a stream
55 | and produce corresponding representation trees.
56 | """
57 | loader = Loader(stream)
58 | try:
59 | while loader.check_node():
60 | yield loader.get_node()
61 | finally:
62 | loader.dispose()
63 |
64 | def load(stream, Loader=Loader):
65 | """
66 | Parse the first YAML document in a stream
67 | and produce the corresponding Python object.
68 | """
69 | loader = Loader(stream)
70 | try:
71 | return loader.get_single_data()
72 | finally:
73 | loader.dispose()
74 |
75 | def load_all(stream, Loader=Loader):
76 | """
77 | Parse all YAML documents in a stream
78 | and produce corresponding Python objects.
79 | """
80 | loader = Loader(stream)
81 | try:
82 | while loader.check_data():
83 | yield loader.get_data()
84 | finally:
85 | loader.dispose()
86 |
87 | def safe_load(stream):
88 | """
89 | Parse the first YAML document in a stream
90 | and produce the corresponding Python object.
91 | Resolve only basic YAML tags.
92 | """
93 | return load(stream, SafeLoader)
94 |
95 | def safe_load_all(stream):
96 | """
97 | Parse all YAML documents in a stream
98 | and produce corresponding Python objects.
99 | Resolve only basic YAML tags.
100 | """
101 | return load_all(stream, SafeLoader)
102 |
103 | def emit(events, stream=None, Dumper=Dumper,
104 | canonical=None, indent=None, width=None,
105 | allow_unicode=None, line_break=None):
106 | """
107 | Emit YAML parsing events into a stream.
108 | If stream is None, return the produced string instead.
109 | """
110 | getvalue = None
111 | if stream is None:
112 | from StringIO import StringIO
113 | stream = StringIO()
114 | getvalue = stream.getvalue
115 | dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
116 | allow_unicode=allow_unicode, line_break=line_break)
117 | try:
118 | for event in events:
119 | dumper.emit(event)
120 | finally:
121 | dumper.dispose()
122 | if getvalue:
123 | return getvalue()
124 |
125 | def serialize_all(nodes, stream=None, Dumper=Dumper,
126 | canonical=None, indent=None, width=None,
127 | allow_unicode=None, line_break=None,
128 | encoding='utf-8', explicit_start=None, explicit_end=None,
129 | version=None, tags=None):
130 | """
131 | Serialize a sequence of representation trees into a YAML stream.
132 | If stream is None, return the produced string instead.
133 | """
134 | getvalue = None
135 | if stream is None:
136 | if encoding is None:
137 | from StringIO import StringIO
138 | else:
139 | from cStringIO import StringIO
140 | stream = StringIO()
141 | getvalue = stream.getvalue
142 | dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
143 | allow_unicode=allow_unicode, line_break=line_break,
144 | encoding=encoding, version=version, tags=tags,
145 | explicit_start=explicit_start, explicit_end=explicit_end)
146 | try:
147 | dumper.open()
148 | for node in nodes:
149 | dumper.serialize(node)
150 | dumper.close()
151 | finally:
152 | dumper.dispose()
153 | if getvalue:
154 | return getvalue()
155 |
156 | def serialize(node, stream=None, Dumper=Dumper, **kwds):
157 | """
158 | Serialize a representation tree into a YAML stream.
159 | If stream is None, return the produced string instead.
160 | """
161 | return serialize_all([node], stream, Dumper=Dumper, **kwds)
162 |
163 | def dump_all(documents, stream=None, Dumper=Dumper,
164 | default_style=None, default_flow_style=None,
165 | canonical=None, indent=None, width=None,
166 | allow_unicode=None, line_break=None,
167 | encoding='utf-8', explicit_start=None, explicit_end=None,
168 | version=None, tags=None):
169 | """
170 | Serialize a sequence of Python objects into a YAML stream.
171 | If stream is None, return the produced string instead.
172 | """
173 | getvalue = None
174 | if stream is None:
175 | if encoding is None:
176 | from StringIO import StringIO
177 | else:
178 | from cStringIO import StringIO
179 | stream = StringIO()
180 | getvalue = stream.getvalue
181 | dumper = Dumper(stream, default_style=default_style,
182 | default_flow_style=default_flow_style,
183 | canonical=canonical, indent=indent, width=width,
184 | allow_unicode=allow_unicode, line_break=line_break,
185 | encoding=encoding, version=version, tags=tags,
186 | explicit_start=explicit_start, explicit_end=explicit_end)
187 | try:
188 | dumper.open()
189 | for data in documents:
190 | dumper.represent(data)
191 | dumper.close()
192 | finally:
193 | dumper.dispose()
194 | if getvalue:
195 | return getvalue()
196 |
197 | def dump(data, stream=None, Dumper=Dumper, **kwds):
198 | """
199 | Serialize a Python object into a YAML stream.
200 | If stream is None, return the produced string instead.
201 | """
202 | return dump_all([data], stream, Dumper=Dumper, **kwds)
203 |
204 | def safe_dump_all(documents, stream=None, **kwds):
205 | """
206 | Serialize a sequence of Python objects into a YAML stream.
207 | Produce only basic YAML tags.
208 | If stream is None, return the produced string instead.
209 | """
210 | return dump_all(documents, stream, Dumper=SafeDumper, **kwds)
211 |
212 | def safe_dump(data, stream=None, **kwds):
213 | """
214 | Serialize a Python object into a YAML stream.
215 | Produce only basic YAML tags.
216 | If stream is None, return the produced string instead.
217 | """
218 | return dump_all([data], stream, Dumper=SafeDumper, **kwds)
219 |
220 | def add_implicit_resolver(tag, regexp, first=None,
221 | Loader=Loader, Dumper=Dumper):
222 | """
223 | Add an implicit scalar detector.
224 | If an implicit scalar value matches the given regexp,
225 | the corresponding tag is assigned to the scalar.
226 | first is a sequence of possible initial characters or None.
227 | """
228 | Loader.add_implicit_resolver(tag, regexp, first)
229 | Dumper.add_implicit_resolver(tag, regexp, first)
230 |
231 | def add_path_resolver(tag, path, kind=None, Loader=Loader, Dumper=Dumper):
232 | """
233 | Add a path based resolver for the given tag.
234 | A path is a list of keys that forms a path
235 | to a node in the representation tree.
236 | Keys can be string values, integers, or None.
237 | """
238 | Loader.add_path_resolver(tag, path, kind)
239 | Dumper.add_path_resolver(tag, path, kind)
240 |
241 | def add_constructor(tag, constructor, Loader=Loader):
242 | """
243 | Add a constructor for the given tag.
244 | Constructor is a function that accepts a Loader instance
245 | and a node object and produces the corresponding Python object.
246 | """
247 | Loader.add_constructor(tag, constructor)
248 |
249 | def add_multi_constructor(tag_prefix, multi_constructor, Loader=Loader):
250 | """
251 | Add a multi-constructor for the given tag prefix.
252 | Multi-constructor is called for a node if its tag starts with tag_prefix.
253 | Multi-constructor accepts a Loader instance, a tag suffix,
254 | and a node object and produces the corresponding Python object.
255 | """
256 | Loader.add_multi_constructor(tag_prefix, multi_constructor)
257 |
258 | def add_representer(data_type, representer, Dumper=Dumper):
259 | """
260 | Add a representer for the given type.
261 | Representer is a function accepting a Dumper instance
262 | and an instance of the given data type
263 | and producing the corresponding representation node.
264 | """
265 | Dumper.add_representer(data_type, representer)
266 |
267 | def add_multi_representer(data_type, multi_representer, Dumper=Dumper):
268 | """
269 | Add a representer for the given type.
270 | Multi-representer is a function accepting a Dumper instance
271 | and an instance of the given data type or subtype
272 | and producing the corresponding representation node.
273 | """
274 | Dumper.add_multi_representer(data_type, multi_representer)
275 |
276 | class YAMLObjectMetaclass(type):
277 | """
278 | The metaclass for YAMLObject.
279 | """
280 | def __init__(cls, name, bases, kwds):
281 | super(YAMLObjectMetaclass, cls).__init__(name, bases, kwds)
282 | if 'yaml_tag' in kwds and kwds['yaml_tag'] is not None:
283 | cls.yaml_loader.add_constructor(cls.yaml_tag, cls.from_yaml)
284 | cls.yaml_dumper.add_representer(cls, cls.to_yaml)
285 |
286 | class YAMLObject(object):
287 | """
288 | An object that can dump itself to a YAML stream
289 | and load itself from a YAML stream.
290 | """
291 |
292 | __metaclass__ = YAMLObjectMetaclass
293 | __slots__ = () # no direct instantiation, so allow immutable subclasses
294 |
295 | yaml_loader = Loader
296 | yaml_dumper = Dumper
297 |
298 | yaml_tag = None
299 | yaml_flow_style = None
300 |
301 | def from_yaml(cls, loader, node):
302 | """
303 | Convert a representation node to a Python object.
304 | """
305 | return loader.construct_yaml_object(node, cls)
306 | from_yaml = classmethod(from_yaml)
307 |
308 | def to_yaml(cls, dumper, data):
309 | """
310 | Convert a Python object to a representation node.
311 | """
312 | return dumper.represent_yaml_object(cls.yaml_tag, data, cls,
313 | flow_style=cls.yaml_flow_style)
314 | to_yaml = classmethod(to_yaml)
315 |
316 |
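`YAMLObjectMetaclass` above registers every tagged subclass with its loader and dumper at class-creation time. A minimal Python 3 rendering of the same pattern (the `REGISTRY`, `AutoRegister`, and `tag` names are illustrative, not from PyYAML):

```python
# Subclasses that declare their own non-None tag register themselves
# the moment the class statement executes, mirroring the 'yaml_tag'
# check in YAMLObjectMetaclass above.
REGISTRY = {}

class AutoRegister(type):
    def __init__(cls, name, bases, namespace):
        super().__init__(name, bases, namespace)
        if namespace.get("tag") is not None:
            REGISTRY[namespace["tag"]] = cls

class Base(metaclass=AutoRegister):
    tag = None  # the base class itself is not registered

class Point(Base):
    tag = "!point"

print(REGISTRY)  # {'!point': <class '__main__.Point'>}
```

Note the `namespace.get("tag")` check looks only at the class's own body, so subclasses that merely inherit a tag are not re-registered.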
--------------------------------------------------------------------------------
/lib/yaml/__init__.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/__init__.pyc
--------------------------------------------------------------------------------
/lib/yaml/composer.py:
--------------------------------------------------------------------------------
1 |
2 | __all__ = ['Composer', 'ComposerError']
3 |
4 | from error import MarkedYAMLError
5 | from events import *
6 | from nodes import *
7 |
8 | class ComposerError(MarkedYAMLError):
9 | pass
10 |
11 | class Composer(object):
12 |
13 | def __init__(self):
14 | self.anchors = {}
15 |
16 | def check_node(self):
17 | # Drop the STREAM-START event.
18 | if self.check_event(StreamStartEvent):
19 | self.get_event()
20 |
21 |         # Are there more documents available?
22 | return not self.check_event(StreamEndEvent)
23 |
24 | def get_node(self):
25 | # Get the root node of the next document.
26 | if not self.check_event(StreamEndEvent):
27 | return self.compose_document()
28 |
29 | def get_single_node(self):
30 | # Drop the STREAM-START event.
31 | self.get_event()
32 |
33 | # Compose a document if the stream is not empty.
34 | document = None
35 | if not self.check_event(StreamEndEvent):
36 | document = self.compose_document()
37 |
38 | # Ensure that the stream contains no more documents.
39 | if not self.check_event(StreamEndEvent):
40 | event = self.get_event()
41 | raise ComposerError("expected a single document in the stream",
42 | document.start_mark, "but found another document",
43 | event.start_mark)
44 |
45 | # Drop the STREAM-END event.
46 | self.get_event()
47 |
48 | return document
49 |
50 | def compose_document(self):
51 | # Drop the DOCUMENT-START event.
52 | self.get_event()
53 |
54 | # Compose the root node.
55 | node = self.compose_node(None, None)
56 |
57 | # Drop the DOCUMENT-END event.
58 | self.get_event()
59 |
60 | self.anchors = {}
61 | return node
62 |
63 | def compose_node(self, parent, index):
64 | if self.check_event(AliasEvent):
65 | event = self.get_event()
66 | anchor = event.anchor
67 | if anchor not in self.anchors:
68 | raise ComposerError(None, None, "found undefined alias %r"
69 | % anchor.encode('utf-8'), event.start_mark)
70 | return self.anchors[anchor]
71 | event = self.peek_event()
72 | anchor = event.anchor
73 | if anchor is not None:
74 | if anchor in self.anchors:
75 | raise ComposerError("found duplicate anchor %r; first occurrence"
76 | % anchor.encode('utf-8'), self.anchors[anchor].start_mark,
77 | "second occurrence", event.start_mark)
78 | self.descend_resolver(parent, index)
79 | if self.check_event(ScalarEvent):
80 | node = self.compose_scalar_node(anchor)
81 | elif self.check_event(SequenceStartEvent):
82 | node = self.compose_sequence_node(anchor)
83 | elif self.check_event(MappingStartEvent):
84 | node = self.compose_mapping_node(anchor)
85 | self.ascend_resolver()
86 | return node
87 |
88 | def compose_scalar_node(self, anchor):
89 | event = self.get_event()
90 | tag = event.tag
91 | if tag is None or tag == u'!':
92 | tag = self.resolve(ScalarNode, event.value, event.implicit)
93 | node = ScalarNode(tag, event.value,
94 | event.start_mark, event.end_mark, style=event.style)
95 | if anchor is not None:
96 | self.anchors[anchor] = node
97 | return node
98 |
99 | def compose_sequence_node(self, anchor):
100 | start_event = self.get_event()
101 | tag = start_event.tag
102 | if tag is None or tag == u'!':
103 | tag = self.resolve(SequenceNode, None, start_event.implicit)
104 | node = SequenceNode(tag, [],
105 | start_event.start_mark, None,
106 | flow_style=start_event.flow_style)
107 | if anchor is not None:
108 | self.anchors[anchor] = node
109 | index = 0
110 | while not self.check_event(SequenceEndEvent):
111 | node.value.append(self.compose_node(node, index))
112 | index += 1
113 | end_event = self.get_event()
114 | node.end_mark = end_event.end_mark
115 | return node
116 |
117 | def compose_mapping_node(self, anchor):
118 | start_event = self.get_event()
119 | tag = start_event.tag
120 | if tag is None or tag == u'!':
121 | tag = self.resolve(MappingNode, None, start_event.implicit)
122 | node = MappingNode(tag, [],
123 | start_event.start_mark, None,
124 | flow_style=start_event.flow_style)
125 | if anchor is not None:
126 | self.anchors[anchor] = node
127 | while not self.check_event(MappingEndEvent):
128 | #key_event = self.peek_event()
129 | item_key = self.compose_node(node, None)
130 | #if item_key in node.value:
131 | # raise ComposerError("while composing a mapping", start_event.start_mark,
132 | # "found duplicate key", key_event.start_mark)
133 | item_value = self.compose_node(node, item_key)
134 | #node.value[item_key] = item_value
135 | node.value.append((item_key, item_value))
136 | end_event = self.get_event()
137 | node.end_mark = end_event.end_mark
138 | return node
139 |
140 |
--------------------------------------------------------------------------------
/lib/yaml/composer.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/composer.pyc
--------------------------------------------------------------------------------
/lib/yaml/constructor.py:
--------------------------------------------------------------------------------
1 |
2 | __all__ = ['BaseConstructor', 'SafeConstructor', 'Constructor',
3 | 'ConstructorError']
4 |
5 | from error import *
6 | from nodes import *
7 |
8 | import datetime
9 |
10 | import binascii, re, sys, types
11 |
12 | class ConstructorError(MarkedYAMLError):
13 | pass
14 |
15 | class BaseConstructor(object):
16 |
17 | yaml_constructors = {}
18 | yaml_multi_constructors = {}
19 |
20 | def __init__(self):
21 | self.constructed_objects = {}
22 | self.recursive_objects = {}
23 | self.state_generators = []
24 | self.deep_construct = False
25 |
26 | def check_data(self):
27 |         # Are there more documents available?
28 | return self.check_node()
29 |
30 | def get_data(self):
31 | # Construct and return the next document.
32 | if self.check_node():
33 | return self.construct_document(self.get_node())
34 |
35 | def get_single_data(self):
36 | # Ensure that the stream contains a single document and construct it.
37 | node = self.get_single_node()
38 | if node is not None:
39 | return self.construct_document(node)
40 | return None
41 |
42 | def construct_document(self, node):
43 | data = self.construct_object(node)
44 | while self.state_generators:
45 | state_generators = self.state_generators
46 | self.state_generators = []
47 | for generator in state_generators:
48 | for dummy in generator:
49 | pass
50 | self.constructed_objects = {}
51 | self.recursive_objects = {}
52 | self.deep_construct = False
53 | return data
54 |
55 | def construct_object(self, node, deep=False):
56 | if node in self.constructed_objects:
57 | return self.constructed_objects[node]
58 | if deep:
59 | old_deep = self.deep_construct
60 | self.deep_construct = True
61 | if node in self.recursive_objects:
62 | raise ConstructorError(None, None,
63 | "found unconstructable recursive node", node.start_mark)
64 | self.recursive_objects[node] = None
65 | constructor = None
66 | tag_suffix = None
67 | if node.tag in self.yaml_constructors:
68 | constructor = self.yaml_constructors[node.tag]
69 | else:
70 | for tag_prefix in self.yaml_multi_constructors:
71 | if node.tag.startswith(tag_prefix):
72 | tag_suffix = node.tag[len(tag_prefix):]
73 | constructor = self.yaml_multi_constructors[tag_prefix]
74 | break
75 | else:
76 | if None in self.yaml_multi_constructors:
77 | tag_suffix = node.tag
78 | constructor = self.yaml_multi_constructors[None]
79 | elif None in self.yaml_constructors:
80 | constructor = self.yaml_constructors[None]
81 | elif isinstance(node, ScalarNode):
82 | constructor = self.__class__.construct_scalar
83 | elif isinstance(node, SequenceNode):
84 | constructor = self.__class__.construct_sequence
85 | elif isinstance(node, MappingNode):
86 | constructor = self.__class__.construct_mapping
87 | if tag_suffix is None:
88 | data = constructor(self, node)
89 | else:
90 | data = constructor(self, tag_suffix, node)
91 | if isinstance(data, types.GeneratorType):
92 | generator = data
93 | data = generator.next()
94 | if self.deep_construct:
95 | for dummy in generator:
96 | pass
97 | else:
98 | self.state_generators.append(generator)
99 | self.constructed_objects[node] = data
100 | del self.recursive_objects[node]
101 | if deep:
102 | self.deep_construct = old_deep
103 | return data
104 |
105 | def construct_scalar(self, node):
106 | if not isinstance(node, ScalarNode):
107 | raise ConstructorError(None, None,
108 | "expected a scalar node, but found %s" % node.id,
109 | node.start_mark)
110 | return node.value
111 |
112 | def construct_sequence(self, node, deep=False):
113 | if not isinstance(node, SequenceNode):
114 | raise ConstructorError(None, None,
115 | "expected a sequence node, but found %s" % node.id,
116 | node.start_mark)
117 | return [self.construct_object(child, deep=deep)
118 | for child in node.value]
119 |
120 | def construct_mapping(self, node, deep=False):
121 | if not isinstance(node, MappingNode):
122 | raise ConstructorError(None, None,
123 | "expected a mapping node, but found %s" % node.id,
124 | node.start_mark)
125 | mapping = {}
126 | for key_node, value_node in node.value:
127 | key = self.construct_object(key_node, deep=deep)
128 | try:
129 | hash(key)
130 | except TypeError, exc:
131 | raise ConstructorError("while constructing a mapping", node.start_mark,
132 | "found unacceptable key (%s)" % exc, key_node.start_mark)
133 | value = self.construct_object(value_node, deep=deep)
134 | mapping[key] = value
135 | return mapping
136 |
137 | def construct_pairs(self, node, deep=False):
138 | if not isinstance(node, MappingNode):
139 | raise ConstructorError(None, None,
140 | "expected a mapping node, but found %s" % node.id,
141 | node.start_mark)
142 | pairs = []
143 | for key_node, value_node in node.value:
144 | key = self.construct_object(key_node, deep=deep)
145 | value = self.construct_object(value_node, deep=deep)
146 | pairs.append((key, value))
147 | return pairs
148 |
149 | def add_constructor(cls, tag, constructor):
150 | if not 'yaml_constructors' in cls.__dict__:
151 | cls.yaml_constructors = cls.yaml_constructors.copy()
152 | cls.yaml_constructors[tag] = constructor
153 | add_constructor = classmethod(add_constructor)
154 |
155 | def add_multi_constructor(cls, tag_prefix, multi_constructor):
156 | if not 'yaml_multi_constructors' in cls.__dict__:
157 | cls.yaml_multi_constructors = cls.yaml_multi_constructors.copy()
158 | cls.yaml_multi_constructors[tag_prefix] = multi_constructor
159 | add_multi_constructor = classmethod(add_multi_constructor)
160 |
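`add_constructor` above uses a copy-on-write trick: a subclass copies the inherited class-level dict before its first mutation, so registrations on a subclass never leak into the parent. A standalone sketch of that idiom (names here are illustrative):

```python
# Copy-on-write class registry, as in BaseConstructor.add_constructor:
# the dict is shared until a subclass first writes to it.
class Base:
    registry = {}

    @classmethod
    def register(cls, tag, fn):
        if 'registry' not in cls.__dict__:      # first write on this class?
            cls.registry = cls.registry.copy()  # detach from the parent
        cls.registry[tag] = fn

class Child(Base):
    pass

Base.register('a', len)
Child.register('b', max)   # copies Base.registry, then adds 'b'
Base.register('c', min)    # added after the copy: Child does not see it

print(sorted(Base.registry))   # ['a', 'c']
print(sorted(Child.registry))  # ['a', 'b']
```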
161 | class SafeConstructor(BaseConstructor):
162 |
163 | def construct_scalar(self, node):
164 | if isinstance(node, MappingNode):
165 | for key_node, value_node in node.value:
166 | if key_node.tag == u'tag:yaml.org,2002:value':
167 | return self.construct_scalar(value_node)
168 | return BaseConstructor.construct_scalar(self, node)
169 |
170 | def flatten_mapping(self, node):
171 | merge = []
172 | index = 0
173 | while index < len(node.value):
174 | key_node, value_node = node.value[index]
175 | if key_node.tag == u'tag:yaml.org,2002:merge':
176 | del node.value[index]
177 | if isinstance(value_node, MappingNode):
178 | self.flatten_mapping(value_node)
179 | merge.extend(value_node.value)
180 | elif isinstance(value_node, SequenceNode):
181 | submerge = []
182 | for subnode in value_node.value:
183 | if not isinstance(subnode, MappingNode):
184 | raise ConstructorError("while constructing a mapping",
185 | node.start_mark,
186 | "expected a mapping for merging, but found %s"
187 | % subnode.id, subnode.start_mark)
188 | self.flatten_mapping(subnode)
189 | submerge.append(subnode.value)
190 | submerge.reverse()
191 | for value in submerge:
192 | merge.extend(value)
193 | else:
194 | raise ConstructorError("while constructing a mapping", node.start_mark,
195 | "expected a mapping or list of mappings for merging, but found %s"
196 | % value_node.id, value_node.start_mark)
197 | elif key_node.tag == u'tag:yaml.org,2002:value':
198 | key_node.tag = u'tag:yaml.org,2002:str'
199 | index += 1
200 | else:
201 | index += 1
202 | if merge:
203 | node.value = merge + node.value
204 |
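The effect of `flatten_mapping` can be shown with plain dicts: the YAML merge key (`<<`) *prepends* the merged entries, so the node's own keys are constructed later by `construct_mapping` and win on conflict. A small illustration (the field names are borrowed from soa.yaml above, purely as sample data):

```python
# Later entries override earlier ones in construct_mapping, and
# flatten_mapping puts merged entries first -- so own keys win.
base = {'retry': 300, 'ttl': 3600}   # mapping pulled in via '<<'
own = {'ttl': 60, 'expire': 9527}    # the node's own entries

merged = {}
for key, value in list(base.items()) + list(own.items()):
    merged[key] = value              # later assignment overrides

print(merged)  # {'retry': 300, 'ttl': 60, 'expire': 9527}
```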
205 | def construct_mapping(self, node, deep=False):
206 | if isinstance(node, MappingNode):
207 | self.flatten_mapping(node)
208 | return BaseConstructor.construct_mapping(self, node, deep=deep)
209 |
210 | def construct_yaml_null(self, node):
211 | self.construct_scalar(node)
212 | return None
213 |
214 | bool_values = {
215 | u'yes': True,
216 | u'no': False,
217 | u'true': True,
218 | u'false': False,
219 | u'on': True,
220 | u'off': False,
221 | }
222 |
223 | def construct_yaml_bool(self, node):
224 | value = self.construct_scalar(node)
225 | return self.bool_values[value.lower()]
226 |
227 | def construct_yaml_int(self, node):
228 | value = str(self.construct_scalar(node))
229 | value = value.replace('_', '')
230 | sign = +1
231 | if value[0] == '-':
232 | sign = -1
233 | if value[0] in '+-':
234 | value = value[1:]
235 | if value == '0':
236 | return 0
237 | elif value.startswith('0b'):
238 | return sign*int(value[2:], 2)
239 | elif value.startswith('0x'):
240 | return sign*int(value[2:], 16)
241 | elif value[0] == '0':
242 | return sign*int(value, 8)
243 | elif ':' in value:
244 | digits = [int(part) for part in value.split(':')]
245 | digits.reverse()
246 | base = 1
247 | value = 0
248 | for digit in digits:
249 | value += digit*base
250 | base *= 60
251 | return sign*value
252 | else:
253 | return sign*int(value)
254 |
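The `':' in value` branch of `construct_yaml_int` above handles YAML 1.1 sexagesimal integers, e.g. `1:30` meaning 1×60 + 30. The same base-60 accumulation, extracted as a standalone sketch:

```python
# Base-60 ("sexagesimal") integer parsing, as in construct_yaml_int:
# components are summed least-significant first, multiplying the base
# by 60 at each step.
def sexagesimal_int(value):
    digits = [int(part) for part in value.split(':')]
    digits.reverse()                 # least-significant component first
    base, total = 1, 0
    for digit in digits:
        total += digit * base
        base *= 60
    return total

print(sexagesimal_int('1:30'))       # 90
print(sexagesimal_int('1:0:0'))      # 3600
```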
255 | inf_value = 1e300
256 | while inf_value != inf_value*inf_value:
257 | inf_value *= inf_value
258 | nan_value = -inf_value/inf_value # Trying to make a quiet NaN (like C99).
259 |
260 | def construct_yaml_float(self, node):
261 | value = str(self.construct_scalar(node))
262 | value = value.replace('_', '').lower()
263 | sign = +1
264 | if value[0] == '-':
265 | sign = -1
266 | if value[0] in '+-':
267 | value = value[1:]
268 | if value == '.inf':
269 | return sign*self.inf_value
270 | elif value == '.nan':
271 | return self.nan_value
272 | elif ':' in value:
273 | digits = [float(part) for part in value.split(':')]
274 | digits.reverse()
275 | base = 1
276 | value = 0.0
277 | for digit in digits:
278 | value += digit*base
279 | base *= 60
280 | return sign*value
281 | else:
282 | return sign*float(value)
283 |
284 | def construct_yaml_binary(self, node):
285 | value = self.construct_scalar(node)
286 | try:
287 | return str(value).decode('base64')
288 | except (binascii.Error, UnicodeEncodeError), exc:
289 | raise ConstructorError(None, None,
290 | "failed to decode base64 data: %s" % exc, node.start_mark)
291 |
292 | timestamp_regexp = re.compile(
293 | ur'''^(?P<year>[0-9][0-9][0-9][0-9])
294 | -(?P<month>[0-9][0-9]?)
295 | -(?P<day>[0-9][0-9]?)
296 | (?:(?:[Tt]|[ \t]+)
297 | (?P<hour>[0-9][0-9]?)
298 | :(?P<minute>[0-9][0-9])
299 | :(?P<second>[0-9][0-9])
300 | (?:\.(?P<fraction>[0-9]*))?
301 | (?:[ \t]*(?P<tz>Z|(?P<tz_sign>[-+])(?P<tz_hour>[0-9][0-9]?)
302 | (?::(?P<tz_minute>[0-9][0-9]))?))?)?$''', re.X)
303 |
304 | def construct_yaml_timestamp(self, node):
305 | value = self.construct_scalar(node)
306 | match = self.timestamp_regexp.match(node.value)
307 | values = match.groupdict()
308 | year = int(values['year'])
309 | month = int(values['month'])
310 | day = int(values['day'])
311 | if not values['hour']:
312 | return datetime.date(year, month, day)
313 | hour = int(values['hour'])
314 | minute = int(values['minute'])
315 | second = int(values['second'])
316 | fraction = 0
317 | if values['fraction']:
318 | fraction = values['fraction'][:6]
319 | while len(fraction) < 6:
320 | fraction += '0'
321 | fraction = int(fraction)
322 | delta = None
323 | if values['tz_sign']:
324 | tz_hour = int(values['tz_hour'])
325 | tz_minute = int(values['tz_minute'] or 0)
326 | delta = datetime.timedelta(hours=tz_hour, minutes=tz_minute)
327 | if values['tz_sign'] == '-':
328 | delta = -delta
329 | data = datetime.datetime(year, month, day, hour, minute, second, fraction)
330 | if delta:
331 | data -= delta
332 | return data
333 |
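The timestamp logic above can be exercised standalone. A sketch using the same regular expression (with its named groups) and the same date/datetime split and timezone-offset handling; `parse_timestamp` is a hypothetical wrapper, not part of PyYAML:

```python
import datetime
import re

# Same pattern as timestamp_regexp above; re.X ignores the layout
# whitespace but keeps the [ \t] character classes intact.
TIMESTAMP = re.compile(
    r'''^(?P<year>[0-9][0-9][0-9][0-9])
        -(?P<month>[0-9][0-9]?)
        -(?P<day>[0-9][0-9]?)
        (?:(?:[Tt]|[ \t]+)
        (?P<hour>[0-9][0-9]?)
        :(?P<minute>[0-9][0-9])
        :(?P<second>[0-9][0-9])
        (?:\.(?P<fraction>[0-9]*))?
        (?:[ \t]*(?P<tz>Z|(?P<tz_sign>[-+])(?P<tz_hour>[0-9][0-9]?)
        (?::(?P<tz_minute>[0-9][0-9]))?))?)?$''', re.X)

def parse_timestamp(value):
    values = TIMESTAMP.match(value).groupdict()
    if not values['hour']:                       # date-only form
        return datetime.date(int(values['year']), int(values['month']),
                             int(values['day']))
    dt = datetime.datetime(int(values['year']), int(values['month']),
                           int(values['day']), int(values['hour']),
                           int(values['minute']), int(values['second']))
    if values['tz_sign']:
        delta = datetime.timedelta(hours=int(values['tz_hour']),
                                   minutes=int(values['tz_minute'] or 0))
        # a '+' offset means local time is ahead of UTC, so subtract it
        dt = dt - delta if values['tz_sign'] == '+' else dt + delta
    return dt

print(parse_timestamp('2012-06-27'))            # 2012-06-27
print(parse_timestamp('2012-06-27 10:30:00'))   # 2012-06-27 10:30:00
```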
334 | def construct_yaml_omap(self, node):
335 | # Note: we do not check for duplicate keys, because it's too
336 | # CPU-expensive.
337 | omap = []
338 | yield omap
339 | if not isinstance(node, SequenceNode):
340 | raise ConstructorError("while constructing an ordered map", node.start_mark,
341 | "expected a sequence, but found %s" % node.id, node.start_mark)
342 | for subnode in node.value:
343 | if not isinstance(subnode, MappingNode):
344 | raise ConstructorError("while constructing an ordered map", node.start_mark,
345 | "expected a mapping of length 1, but found %s" % subnode.id,
346 | subnode.start_mark)
347 | if len(subnode.value) != 1:
348 | raise ConstructorError("while constructing an ordered map", node.start_mark,
349 | "expected a single mapping item, but found %d items" % len(subnode.value),
350 | subnode.start_mark)
351 | key_node, value_node = subnode.value[0]
352 | key = self.construct_object(key_node)
353 | value = self.construct_object(value_node)
354 | omap.append((key, value))
355 |
356 | def construct_yaml_pairs(self, node):
357 | # Note: the same code as `construct_yaml_omap`.
358 | pairs = []
359 | yield pairs
360 | if not isinstance(node, SequenceNode):
361 | raise ConstructorError("while constructing pairs", node.start_mark,
362 | "expected a sequence, but found %s" % node.id, node.start_mark)
363 | for subnode in node.value:
364 | if not isinstance(subnode, MappingNode):
365 | raise ConstructorError("while constructing pairs", node.start_mark,
366 | "expected a mapping of length 1, but found %s" % subnode.id,
367 | subnode.start_mark)
368 | if len(subnode.value) != 1:
369 | raise ConstructorError("while constructing pairs", node.start_mark,
370 | "expected a single mapping item, but found %d items" % len(subnode.value),
371 | subnode.start_mark)
372 | key_node, value_node = subnode.value[0]
373 | key = self.construct_object(key_node)
374 | value = self.construct_object(value_node)
375 | pairs.append((key, value))
376 |
377 | def construct_yaml_set(self, node):
378 | data = set()
379 | yield data
380 | value = self.construct_mapping(node)
381 | data.update(value)
382 |
383 | def construct_yaml_str(self, node):
384 | value = self.construct_scalar(node)
385 | try:
386 | return value.encode('ascii')
387 | except UnicodeEncodeError:
388 | return value
389 |
390 | def construct_yaml_seq(self, node):
391 | data = []
392 | yield data
393 | data.extend(self.construct_sequence(node))
394 |
395 | def construct_yaml_map(self, node):
396 | data = {}
397 | yield data
398 | value = self.construct_mapping(node)
399 | data.update(value)
400 |
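The `yield data` at the top of `construct_yaml_seq`, `construct_yaml_map`, and friends is a two-phase construction pattern: the empty container is handed out first so recursive references can point at it, and the generator is drained later (by `construct_object`) to fill it in. A minimal standalone sketch of the same mechanism:

```python
# "Yield first, fill later": the caller receives the (still empty)
# object before the generator populates it, which is what lets PyYAML
# construct self-referential sequences and mappings.
def build_list(items):
    data = []
    yield data            # phase 1: hand out the empty container
    data.extend(items)    # phase 2: fill it in

gen = build_list([1, 2, 3])
result = next(gen)        # the caller gets the empty list first...
for _ in gen:             # ...then drains the generator to populate it
    pass
print(result)             # [1, 2, 3]
```

This mirrors `construct_object`, which takes the generator's first value as the node's data and either drains the generator immediately (deep construction) or defers it via `state_generators`.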
401 | def construct_yaml_object(self, node, cls):
402 | data = cls.__new__(cls)
403 | yield data
404 | if hasattr(data, '__setstate__'):
405 | state = self.construct_mapping(node, deep=True)
406 | data.__setstate__(state)
407 | else:
408 | state = self.construct_mapping(node)
409 | data.__dict__.update(state)
410 |
411 | def construct_undefined(self, node):
412 | raise ConstructorError(None, None,
413 | "could not determine a constructor for the tag %r" % node.tag.encode('utf-8'),
414 | node.start_mark)
415 |
416 | SafeConstructor.add_constructor(
417 | u'tag:yaml.org,2002:null',
418 | SafeConstructor.construct_yaml_null)
419 |
420 | SafeConstructor.add_constructor(
421 | u'tag:yaml.org,2002:bool',
422 | SafeConstructor.construct_yaml_bool)
423 |
424 | SafeConstructor.add_constructor(
425 | u'tag:yaml.org,2002:int',
426 | SafeConstructor.construct_yaml_int)
427 |
428 | SafeConstructor.add_constructor(
429 | u'tag:yaml.org,2002:float',
430 | SafeConstructor.construct_yaml_float)
431 |
432 | SafeConstructor.add_constructor(
433 | u'tag:yaml.org,2002:binary',
434 | SafeConstructor.construct_yaml_binary)
435 |
436 | SafeConstructor.add_constructor(
437 | u'tag:yaml.org,2002:timestamp',
438 | SafeConstructor.construct_yaml_timestamp)
439 |
440 | SafeConstructor.add_constructor(
441 | u'tag:yaml.org,2002:omap',
442 | SafeConstructor.construct_yaml_omap)
443 |
444 | SafeConstructor.add_constructor(
445 | u'tag:yaml.org,2002:pairs',
446 | SafeConstructor.construct_yaml_pairs)
447 |
448 | SafeConstructor.add_constructor(
449 | u'tag:yaml.org,2002:set',
450 | SafeConstructor.construct_yaml_set)
451 |
452 | SafeConstructor.add_constructor(
453 | u'tag:yaml.org,2002:str',
454 | SafeConstructor.construct_yaml_str)
455 |
456 | SafeConstructor.add_constructor(
457 | u'tag:yaml.org,2002:seq',
458 | SafeConstructor.construct_yaml_seq)
459 |
460 | SafeConstructor.add_constructor(
461 | u'tag:yaml.org,2002:map',
462 | SafeConstructor.construct_yaml_map)
463 |
464 | SafeConstructor.add_constructor(None,
465 | SafeConstructor.construct_undefined)
466 |
467 | class Constructor(SafeConstructor):
468 |
469 | def construct_python_str(self, node):
470 | return self.construct_scalar(node).encode('utf-8')
471 |
472 | def construct_python_unicode(self, node):
473 | return self.construct_scalar(node)
474 |
475 | def construct_python_long(self, node):
476 | return long(self.construct_yaml_int(node))
477 |
478 | def construct_python_complex(self, node):
479 | return complex(self.construct_scalar(node))
480 |
481 | def construct_python_tuple(self, node):
482 | return tuple(self.construct_sequence(node))
483 |
484 | def find_python_module(self, name, mark):
485 | if not name:
486 | raise ConstructorError("while constructing a Python module", mark,
487 | "expected non-empty name appended to the tag", mark)
488 | try:
489 | __import__(name)
490 | except ImportError, exc:
491 | raise ConstructorError("while constructing a Python module", mark,
492 | "cannot find module %r (%s)" % (name.encode('utf-8'), exc), mark)
493 | return sys.modules[name]
494 |
495 | def find_python_name(self, name, mark):
496 | if not name:
497 | raise ConstructorError("while constructing a Python object", mark,
498 | "expected non-empty name appended to the tag", mark)
499 | if u'.' in name:
500 | module_name, object_name = name.rsplit('.', 1)
501 | else:
502 | module_name = '__builtin__'
503 | object_name = name
504 | try:
505 | __import__(module_name)
506 | except ImportError, exc:
507 | raise ConstructorError("while constructing a Python object", mark,
508 | "cannot find module %r (%s)" % (module_name.encode('utf-8'), exc), mark)
509 | module = sys.modules[module_name]
510 | if not hasattr(module, object_name):
511 | raise ConstructorError("while constructing a Python object", mark,
512 | "cannot find %r in the module %r" % (object_name.encode('utf-8'),
513 | module.__name__), mark)
514 | return getattr(module, object_name)
515 |
516 | def construct_python_name(self, suffix, node):
517 | value = self.construct_scalar(node)
518 | if value:
519 | raise ConstructorError("while constructing a Python name", node.start_mark,
520 | "expected the empty value, but found %r" % value.encode('utf-8'),
521 | node.start_mark)
522 | return self.find_python_name(suffix, node.start_mark)
523 |
524 | def construct_python_module(self, suffix, node):
525 | value = self.construct_scalar(node)
526 | if value:
527 | raise ConstructorError("while constructing a Python module", node.start_mark,
528 | "expected the empty value, but found %r" % value.encode('utf-8'),
529 | node.start_mark)
530 | return self.find_python_module(suffix, node.start_mark)
531 |
532 | class classobj: pass
533 |
534 | def make_python_instance(self, suffix, node,
535 | args=None, kwds=None, newobj=False):
536 | if not args:
537 | args = []
538 | if not kwds:
539 | kwds = {}
540 | cls = self.find_python_name(suffix, node.start_mark)
541 | if newobj and isinstance(cls, type(self.classobj)) \
542 | and not args and not kwds:
543 | instance = self.classobj()
544 | instance.__class__ = cls
545 | return instance
546 | elif newobj and isinstance(cls, type):
547 | return cls.__new__(cls, *args, **kwds)
548 | else:
549 | return cls(*args, **kwds)
550 |
551 | def set_python_instance_state(self, instance, state):
552 | if hasattr(instance, '__setstate__'):
553 | instance.__setstate__(state)
554 | else:
555 | slotstate = {}
556 | if isinstance(state, tuple) and len(state) == 2:
557 | state, slotstate = state
558 | if hasattr(instance, '__dict__'):
559 | instance.__dict__.update(state)
560 | elif state:
561 | slotstate.update(state)
562 | for key, value in slotstate.items():
563 |                     setattr(instance, key, value)
564 |
565 | def construct_python_object(self, suffix, node):
566 | # Format:
567 | # !!python/object:module.name { ... state ... }
568 | instance = self.make_python_instance(suffix, node, newobj=True)
569 | yield instance
570 | deep = hasattr(instance, '__setstate__')
571 | state = self.construct_mapping(node, deep=deep)
572 | self.set_python_instance_state(instance, state)
573 |
574 | def construct_python_object_apply(self, suffix, node, newobj=False):
575 | # Format:
576 | # !!python/object/apply # (or !!python/object/new)
577 | # args: [ ... arguments ... ]
578 | # kwds: { ... keywords ... }
579 | # state: ... state ...
580 | # listitems: [ ... listitems ... ]
581 | # dictitems: { ... dictitems ... }
582 | # or short format:
583 | # !!python/object/apply [ ... arguments ... ]
584 | # The difference between !!python/object/apply and !!python/object/new
585 | # is how an object is created, check make_python_instance for details.
586 | if isinstance(node, SequenceNode):
587 | args = self.construct_sequence(node, deep=True)
588 | kwds = {}
589 | state = {}
590 | listitems = []
591 | dictitems = {}
592 | else:
593 | value = self.construct_mapping(node, deep=True)
594 | args = value.get('args', [])
595 | kwds = value.get('kwds', {})
596 | state = value.get('state', {})
597 | listitems = value.get('listitems', [])
598 | dictitems = value.get('dictitems', {})
599 | instance = self.make_python_instance(suffix, node, args, kwds, newobj)
600 | if state:
601 | self.set_python_instance_state(instance, state)
602 | if listitems:
603 | instance.extend(listitems)
604 | if dictitems:
605 | for key in dictitems:
606 | instance[key] = dictitems[key]
607 | return instance
608 |
609 | def construct_python_object_new(self, suffix, node):
610 | return self.construct_python_object_apply(suffix, node, newobj=True)
611 |
612 | Constructor.add_constructor(
613 | u'tag:yaml.org,2002:python/none',
614 | Constructor.construct_yaml_null)
615 |
616 | Constructor.add_constructor(
617 | u'tag:yaml.org,2002:python/bool',
618 | Constructor.construct_yaml_bool)
619 |
620 | Constructor.add_constructor(
621 | u'tag:yaml.org,2002:python/str',
622 | Constructor.construct_python_str)
623 |
624 | Constructor.add_constructor(
625 | u'tag:yaml.org,2002:python/unicode',
626 | Constructor.construct_python_unicode)
627 |
628 | Constructor.add_constructor(
629 | u'tag:yaml.org,2002:python/int',
630 | Constructor.construct_yaml_int)
631 |
632 | Constructor.add_constructor(
633 | u'tag:yaml.org,2002:python/long',
634 | Constructor.construct_python_long)
635 |
636 | Constructor.add_constructor(
637 | u'tag:yaml.org,2002:python/float',
638 | Constructor.construct_yaml_float)
639 |
640 | Constructor.add_constructor(
641 | u'tag:yaml.org,2002:python/complex',
642 | Constructor.construct_python_complex)
643 |
644 | Constructor.add_constructor(
645 | u'tag:yaml.org,2002:python/list',
646 | Constructor.construct_yaml_seq)
647 |
648 | Constructor.add_constructor(
649 | u'tag:yaml.org,2002:python/tuple',
650 | Constructor.construct_python_tuple)
651 |
652 | Constructor.add_constructor(
653 | u'tag:yaml.org,2002:python/dict',
654 | Constructor.construct_yaml_map)
655 |
656 | Constructor.add_multi_constructor(
657 | u'tag:yaml.org,2002:python/name:',
658 | Constructor.construct_python_name)
659 |
660 | Constructor.add_multi_constructor(
661 | u'tag:yaml.org,2002:python/module:',
662 | Constructor.construct_python_module)
663 |
664 | Constructor.add_multi_constructor(
665 | u'tag:yaml.org,2002:python/object:',
666 | Constructor.construct_python_object)
667 |
668 | Constructor.add_multi_constructor(
669 | u'tag:yaml.org,2002:python/object/apply:',
670 | Constructor.construct_python_object_apply)
671 |
672 | Constructor.add_multi_constructor(
673 | u'tag:yaml.org,2002:python/object/new:',
674 | Constructor.construct_python_object_new)
675 |
676 |
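The `find_python_name` helper above resolves a dotted tag suffix (e.g. from `!!python/name:os.path.join`) by splitting off the last component as the attribute name and importing the rest as a module, falling back to the builtins module for bare names. A minimal Python 3 sketch of that same resolution logic, without the YAML error marks (the function name mirrors the method above; this is an illustration, not part of the file):

```python
import sys

def find_python_name(name):
    """Resolve a dotted name like 'os.path.join' to the object it names,
    mirroring Constructor.find_python_name (Python 3 version)."""
    if "." in name:
        module_name, object_name = name.rsplit(".", 1)
    else:
        # Python 2 PyYAML uses '__builtin__' here; 'builtins' in Python 3.
        module_name, object_name = "builtins", name
    __import__(module_name)          # import (or reuse) the module
    module = sys.modules[module_name]
    if not hasattr(module, object_name):
        raise AttributeError("cannot find %r in module %r"
                             % (object_name, module_name))
    return getattr(module, object_name)
```

Note that this resolution is exactly why the full `Constructor` is unsafe on untrusted input: a document tagged `!!python/object/apply:` can name any importable callable.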
--------------------------------------------------------------------------------
/lib/yaml/constructor.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/constructor.pyc
--------------------------------------------------------------------------------
/lib/yaml/cyaml.py:
--------------------------------------------------------------------------------
1 |
2 | __all__ = ['CBaseLoader', 'CSafeLoader', 'CLoader',
3 | 'CBaseDumper', 'CSafeDumper', 'CDumper']
4 |
5 | from _yaml import CParser, CEmitter
6 |
7 | from constructor import *
8 |
9 | from serializer import *
10 | from representer import *
11 |
12 | from resolver import *
13 |
14 | class CBaseLoader(CParser, BaseConstructor, BaseResolver):
15 |
16 | def __init__(self, stream):
17 | CParser.__init__(self, stream)
18 | BaseConstructor.__init__(self)
19 | BaseResolver.__init__(self)
20 |
21 | class CSafeLoader(CParser, SafeConstructor, Resolver):
22 |
23 | def __init__(self, stream):
24 | CParser.__init__(self, stream)
25 | SafeConstructor.__init__(self)
26 | Resolver.__init__(self)
27 |
28 | class CLoader(CParser, Constructor, Resolver):
29 |
30 | def __init__(self, stream):
31 | CParser.__init__(self, stream)
32 | Constructor.__init__(self)
33 | Resolver.__init__(self)
34 |
35 | class CBaseDumper(CEmitter, BaseRepresenter, BaseResolver):
36 |
37 | def __init__(self, stream,
38 | default_style=None, default_flow_style=None,
39 | canonical=None, indent=None, width=None,
40 | allow_unicode=None, line_break=None,
41 | encoding=None, explicit_start=None, explicit_end=None,
42 | version=None, tags=None):
43 | CEmitter.__init__(self, stream, canonical=canonical,
44 | indent=indent, width=width, encoding=encoding,
45 | allow_unicode=allow_unicode, line_break=line_break,
46 | explicit_start=explicit_start, explicit_end=explicit_end,
47 | version=version, tags=tags)
48 | Representer.__init__(self, default_style=default_style,
49 | default_flow_style=default_flow_style)
50 | Resolver.__init__(self)
51 |
52 | class CSafeDumper(CEmitter, SafeRepresenter, Resolver):
53 |
54 | def __init__(self, stream,
55 | default_style=None, default_flow_style=None,
56 | canonical=None, indent=None, width=None,
57 | allow_unicode=None, line_break=None,
58 | encoding=None, explicit_start=None, explicit_end=None,
59 | version=None, tags=None):
60 | CEmitter.__init__(self, stream, canonical=canonical,
61 | indent=indent, width=width, encoding=encoding,
62 | allow_unicode=allow_unicode, line_break=line_break,
63 | explicit_start=explicit_start, explicit_end=explicit_end,
64 | version=version, tags=tags)
65 | SafeRepresenter.__init__(self, default_style=default_style,
66 | default_flow_style=default_flow_style)
67 | Resolver.__init__(self)
68 |
69 | class CDumper(CEmitter, Serializer, Representer, Resolver):
70 |
71 | def __init__(self, stream,
72 | default_style=None, default_flow_style=None,
73 | canonical=None, indent=None, width=None,
74 | allow_unicode=None, line_break=None,
75 | encoding=None, explicit_start=None, explicit_end=None,
76 | version=None, tags=None):
77 | CEmitter.__init__(self, stream, canonical=canonical,
78 | indent=indent, width=width, encoding=encoding,
79 | allow_unicode=allow_unicode, line_break=line_break,
80 | explicit_start=explicit_start, explicit_end=explicit_end,
81 | version=version, tags=tags)
82 | Representer.__init__(self, default_style=default_style,
83 | default_flow_style=default_flow_style)
84 | Resolver.__init__(self)
85 |
86 |
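The C-accelerated loaders and dumpers above are assembled purely by multiple inheritance: parser, constructor, and resolver are independent classes, and each concrete loader's `__init__` calls every base initializer explicitly (rather than via `super()`, since the bases take different arguments). A toy sketch of the same composition pattern (class names here are illustrative stand-ins, not the real PyYAML classes):

```python
class Parser:
    def __init__(self, stream):
        self.stream = stream

class Constructor:
    def __init__(self):
        self.constructed = []

class Resolver:
    def __init__(self):
        self.tags = {}

class Loader(Parser, Constructor, Resolver):
    # Mirror CLoader.__init__: initialize each base explicitly,
    # since they take different constructor arguments.
    def __init__(self, stream):
        Parser.__init__(self, stream)
        Constructor.__init__(self)
        Resolver.__init__(self)

loader = Loader("doc: 1")
```

Swapping one base (e.g. `SafeConstructor` for `Constructor`) yields a differently-behaving loader with no other code changes, which is how `CBaseLoader`, `CSafeLoader`, and `CLoader` differ above.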
--------------------------------------------------------------------------------
/lib/yaml/cyaml.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/cyaml.pyc
--------------------------------------------------------------------------------
/lib/yaml/dumper.py:
--------------------------------------------------------------------------------
1 |
2 | __all__ = ['BaseDumper', 'SafeDumper', 'Dumper']
3 |
4 | from emitter import *
5 | from serializer import *
6 | from representer import *
7 | from resolver import *
8 |
9 | class BaseDumper(Emitter, Serializer, BaseRepresenter, BaseResolver):
10 |
11 | def __init__(self, stream,
12 | default_style=None, default_flow_style=None,
13 | canonical=None, indent=None, width=None,
14 | allow_unicode=None, line_break=None,
15 | encoding=None, explicit_start=None, explicit_end=None,
16 | version=None, tags=None):
17 | Emitter.__init__(self, stream, canonical=canonical,
18 | indent=indent, width=width,
19 | allow_unicode=allow_unicode, line_break=line_break)
20 | Serializer.__init__(self, encoding=encoding,
21 | explicit_start=explicit_start, explicit_end=explicit_end,
22 | version=version, tags=tags)
23 | Representer.__init__(self, default_style=default_style,
24 | default_flow_style=default_flow_style)
25 | Resolver.__init__(self)
26 |
27 | class SafeDumper(Emitter, Serializer, SafeRepresenter, Resolver):
28 |
29 | def __init__(self, stream,
30 | default_style=None, default_flow_style=None,
31 | canonical=None, indent=None, width=None,
32 | allow_unicode=None, line_break=None,
33 | encoding=None, explicit_start=None, explicit_end=None,
34 | version=None, tags=None):
35 | Emitter.__init__(self, stream, canonical=canonical,
36 | indent=indent, width=width,
37 | allow_unicode=allow_unicode, line_break=line_break)
38 | Serializer.__init__(self, encoding=encoding,
39 | explicit_start=explicit_start, explicit_end=explicit_end,
40 | version=version, tags=tags)
41 | SafeRepresenter.__init__(self, default_style=default_style,
42 | default_flow_style=default_flow_style)
43 | Resolver.__init__(self)
44 |
45 | class Dumper(Emitter, Serializer, Representer, Resolver):
46 |
47 | def __init__(self, stream,
48 | default_style=None, default_flow_style=None,
49 | canonical=None, indent=None, width=None,
50 | allow_unicode=None, line_break=None,
51 | encoding=None, explicit_start=None, explicit_end=None,
52 | version=None, tags=None):
53 | Emitter.__init__(self, stream, canonical=canonical,
54 | indent=indent, width=width,
55 | allow_unicode=allow_unicode, line_break=line_break)
56 | Serializer.__init__(self, encoding=encoding,
57 | explicit_start=explicit_start, explicit_end=explicit_end,
58 | version=version, tags=tags)
59 | Representer.__init__(self, default_style=default_style,
60 | default_flow_style=default_flow_style)
61 | Resolver.__init__(self)
62 |
63 |
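Each dumper above chains three stages: the representer turns native objects into a node tree, the serializer walks that tree into a flat event stream, and the emitter renders events as text. A greatly simplified stdlib-only sketch of that staging (real PyYAML nodes, events, and output are far richer; this only shows the data flow):

```python
def represent(obj):
    # Representer stage: native object -> (tag, value) "node"
    if isinstance(obj, dict):
        return ("map", [(represent(k), represent(v)) for k, v in obj.items()])
    return ("scalar", str(obj))

def serialize(node):
    # Serializer stage: node tree -> flat event list
    tag, value = node
    if tag == "map":
        events = ["MAPPING-START"]
        for key, val in value:
            events += serialize(key) + serialize(val)
        return events + ["MAPPING-END"]
    return ["SCALAR(%s)" % value]

def emit(events):
    # Emitter stage: events -> text (trivial rendering for the sketch)
    return " ".join(events)

text = emit(serialize(represent({"port": 53})))
```

This is why every `Dumper.__init__` above initializes `Emitter`, `Serializer`, and a `Representer` separately: each stage keeps its own state and options.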
--------------------------------------------------------------------------------
/lib/yaml/dumper.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/dumper.pyc
--------------------------------------------------------------------------------
/lib/yaml/emitter.py:
--------------------------------------------------------------------------------
1 |
2 | # Emitter expects events obeying the following grammar:
3 | # stream ::= STREAM-START document* STREAM-END
4 | # document ::= DOCUMENT-START node DOCUMENT-END
5 | # node ::= SCALAR | sequence | mapping
6 | # sequence ::= SEQUENCE-START node* SEQUENCE-END
7 | # mapping ::= MAPPING-START (node node)* MAPPING-END
8 |
9 | __all__ = ['Emitter', 'EmitterError']
10 |
11 | from error import YAMLError
12 | from events import *
13 |
14 | class EmitterError(YAMLError):
15 | pass
16 |
17 | class ScalarAnalysis(object):
18 | def __init__(self, scalar, empty, multiline,
19 | allow_flow_plain, allow_block_plain,
20 | allow_single_quoted, allow_double_quoted,
21 | allow_block):
22 | self.scalar = scalar
23 | self.empty = empty
24 | self.multiline = multiline
25 | self.allow_flow_plain = allow_flow_plain
26 | self.allow_block_plain = allow_block_plain
27 | self.allow_single_quoted = allow_single_quoted
28 | self.allow_double_quoted = allow_double_quoted
29 | self.allow_block = allow_block
30 |
31 | class Emitter(object):
32 |
33 | DEFAULT_TAG_PREFIXES = {
34 | u'!' : u'!',
35 | u'tag:yaml.org,2002:' : u'!!',
36 | }
37 |
38 | def __init__(self, stream, canonical=None, indent=None, width=None,
39 | allow_unicode=None, line_break=None):
40 |
41 | # The stream should have the methods `write` and possibly `flush`.
42 | self.stream = stream
43 |
44 |         # Encoding can be overridden by STREAM-START.
45 | self.encoding = None
46 |
47 | # Emitter is a state machine with a stack of states to handle nested
48 | # structures.
49 | self.states = []
50 | self.state = self.expect_stream_start
51 |
52 | # Current event and the event queue.
53 | self.events = []
54 | self.event = None
55 |
56 | # The current indentation level and the stack of previous indents.
57 | self.indents = []
58 | self.indent = None
59 |
60 | # Flow level.
61 | self.flow_level = 0
62 |
63 | # Contexts.
64 | self.root_context = False
65 | self.sequence_context = False
66 | self.mapping_context = False
67 | self.simple_key_context = False
68 |
69 | # Characteristics of the last emitted character:
70 | # - current position.
71 | # - is it a whitespace?
72 | # - is it an indention character
73 | # (indentation space, '-', '?', or ':')?
74 | self.line = 0
75 | self.column = 0
76 | self.whitespace = True
77 | self.indention = True
78 |
79 | # Whether the document requires an explicit document indicator
80 | self.open_ended = False
81 |
82 | # Formatting details.
83 | self.canonical = canonical
84 | self.allow_unicode = allow_unicode
85 | self.best_indent = 2
86 | if indent and 1 < indent < 10:
87 | self.best_indent = indent
88 | self.best_width = 80
89 | if width and width > self.best_indent*2:
90 | self.best_width = width
91 | self.best_line_break = u'\n'
92 | if line_break in [u'\r', u'\n', u'\r\n']:
93 | self.best_line_break = line_break
94 |
95 | # Tag prefixes.
96 | self.tag_prefixes = None
97 |
98 | # Prepared anchor and tag.
99 | self.prepared_anchor = None
100 | self.prepared_tag = None
101 |
102 | # Scalar analysis and style.
103 | self.analysis = None
104 | self.style = None
105 |
106 | def dispose(self):
107 | # Reset the state attributes (to clear self-references)
108 | self.states = []
109 | self.state = None
110 |
111 | def emit(self, event):
112 | self.events.append(event)
113 | while not self.need_more_events():
114 | self.event = self.events.pop(0)
115 | self.state()
116 | self.event = None
117 |
118 | # In some cases, we wait for a few next events before emitting.
119 |
120 | def need_more_events(self):
121 | if not self.events:
122 | return True
123 | event = self.events[0]
124 | if isinstance(event, DocumentStartEvent):
125 | return self.need_events(1)
126 | elif isinstance(event, SequenceStartEvent):
127 | return self.need_events(2)
128 | elif isinstance(event, MappingStartEvent):
129 | return self.need_events(3)
130 | else:
131 | return False
132 |
133 | def need_events(self, count):
134 | level = 0
135 | for event in self.events[1:]:
136 | if isinstance(event, (DocumentStartEvent, CollectionStartEvent)):
137 | level += 1
138 | elif isinstance(event, (DocumentEndEvent, CollectionEndEvent)):
139 | level -= 1
140 | elif isinstance(event, StreamEndEvent):
141 | level = -1
142 | if level < 0:
143 | return False
144 | return (len(self.events) < count+1)
145 |
146 | def increase_indent(self, flow=False, indentless=False):
147 | self.indents.append(self.indent)
148 | if self.indent is None:
149 | if flow:
150 | self.indent = self.best_indent
151 | else:
152 | self.indent = 0
153 | elif not indentless:
154 | self.indent += self.best_indent
155 |
156 | # States.
157 |
158 | # Stream handlers.
159 |
160 | def expect_stream_start(self):
161 | if isinstance(self.event, StreamStartEvent):
162 | if self.event.encoding and not getattr(self.stream, 'encoding', None):
163 | self.encoding = self.event.encoding
164 | self.write_stream_start()
165 | self.state = self.expect_first_document_start
166 | else:
167 | raise EmitterError("expected StreamStartEvent, but got %s"
168 | % self.event)
169 |
170 | def expect_nothing(self):
171 | raise EmitterError("expected nothing, but got %s" % self.event)
172 |
173 | # Document handlers.
174 |
175 | def expect_first_document_start(self):
176 | return self.expect_document_start(first=True)
177 |
178 | def expect_document_start(self, first=False):
179 | if isinstance(self.event, DocumentStartEvent):
180 | if (self.event.version or self.event.tags) and self.open_ended:
181 | self.write_indicator(u'...', True)
182 | self.write_indent()
183 | if self.event.version:
184 | version_text = self.prepare_version(self.event.version)
185 | self.write_version_directive(version_text)
186 | self.tag_prefixes = self.DEFAULT_TAG_PREFIXES.copy()
187 | if self.event.tags:
188 | handles = self.event.tags.keys()
189 | handles.sort()
190 | for handle in handles:
191 | prefix = self.event.tags[handle]
192 | self.tag_prefixes[prefix] = handle
193 | handle_text = self.prepare_tag_handle(handle)
194 | prefix_text = self.prepare_tag_prefix(prefix)
195 | self.write_tag_directive(handle_text, prefix_text)
196 | implicit = (first and not self.event.explicit and not self.canonical
197 | and not self.event.version and not self.event.tags
198 | and not self.check_empty_document())
199 | if not implicit:
200 | self.write_indent()
201 | self.write_indicator(u'---', True)
202 | if self.canonical:
203 | self.write_indent()
204 | self.state = self.expect_document_root
205 | elif isinstance(self.event, StreamEndEvent):
206 | if self.open_ended:
207 | self.write_indicator(u'...', True)
208 | self.write_indent()
209 | self.write_stream_end()
210 | self.state = self.expect_nothing
211 | else:
212 | raise EmitterError("expected DocumentStartEvent, but got %s"
213 | % self.event)
214 |
215 | def expect_document_end(self):
216 | if isinstance(self.event, DocumentEndEvent):
217 | self.write_indent()
218 | if self.event.explicit:
219 | self.write_indicator(u'...', True)
220 | self.write_indent()
221 | self.flush_stream()
222 | self.state = self.expect_document_start
223 | else:
224 | raise EmitterError("expected DocumentEndEvent, but got %s"
225 | % self.event)
226 |
227 | def expect_document_root(self):
228 | self.states.append(self.expect_document_end)
229 | self.expect_node(root=True)
230 |
231 | # Node handlers.
232 |
233 | def expect_node(self, root=False, sequence=False, mapping=False,
234 | simple_key=False):
235 | self.root_context = root
236 | self.sequence_context = sequence
237 | self.mapping_context = mapping
238 | self.simple_key_context = simple_key
239 | if isinstance(self.event, AliasEvent):
240 | self.expect_alias()
241 | elif isinstance(self.event, (ScalarEvent, CollectionStartEvent)):
242 | self.process_anchor(u'&')
243 | self.process_tag()
244 | if isinstance(self.event, ScalarEvent):
245 | self.expect_scalar()
246 | elif isinstance(self.event, SequenceStartEvent):
247 | if self.flow_level or self.canonical or self.event.flow_style \
248 | or self.check_empty_sequence():
249 | self.expect_flow_sequence()
250 | else:
251 | self.expect_block_sequence()
252 | elif isinstance(self.event, MappingStartEvent):
253 | if self.flow_level or self.canonical or self.event.flow_style \
254 | or self.check_empty_mapping():
255 | self.expect_flow_mapping()
256 | else:
257 | self.expect_block_mapping()
258 | else:
259 | raise EmitterError("expected NodeEvent, but got %s" % self.event)
260 |
261 | def expect_alias(self):
262 | if self.event.anchor is None:
263 | raise EmitterError("anchor is not specified for alias")
264 | self.process_anchor(u'*')
265 | self.state = self.states.pop()
266 |
267 | def expect_scalar(self):
268 | self.increase_indent(flow=True)
269 | self.process_scalar()
270 | self.indent = self.indents.pop()
271 | self.state = self.states.pop()
272 |
273 | # Flow sequence handlers.
274 |
275 | def expect_flow_sequence(self):
276 | self.write_indicator(u'[', True, whitespace=True)
277 | self.flow_level += 1
278 | self.increase_indent(flow=True)
279 | self.state = self.expect_first_flow_sequence_item
280 |
281 | def expect_first_flow_sequence_item(self):
282 | if isinstance(self.event, SequenceEndEvent):
283 | self.indent = self.indents.pop()
284 | self.flow_level -= 1
285 | self.write_indicator(u']', False)
286 | self.state = self.states.pop()
287 | else:
288 | if self.canonical or self.column > self.best_width:
289 | self.write_indent()
290 | self.states.append(self.expect_flow_sequence_item)
291 | self.expect_node(sequence=True)
292 |
293 | def expect_flow_sequence_item(self):
294 | if isinstance(self.event, SequenceEndEvent):
295 | self.indent = self.indents.pop()
296 | self.flow_level -= 1
297 | if self.canonical:
298 | self.write_indicator(u',', False)
299 | self.write_indent()
300 | self.write_indicator(u']', False)
301 | self.state = self.states.pop()
302 | else:
303 | self.write_indicator(u',', False)
304 | if self.canonical or self.column > self.best_width:
305 | self.write_indent()
306 | self.states.append(self.expect_flow_sequence_item)
307 | self.expect_node(sequence=True)
308 |
309 | # Flow mapping handlers.
310 |
311 | def expect_flow_mapping(self):
312 | self.write_indicator(u'{', True, whitespace=True)
313 | self.flow_level += 1
314 | self.increase_indent(flow=True)
315 | self.state = self.expect_first_flow_mapping_key
316 |
317 | def expect_first_flow_mapping_key(self):
318 | if isinstance(self.event, MappingEndEvent):
319 | self.indent = self.indents.pop()
320 | self.flow_level -= 1
321 | self.write_indicator(u'}', False)
322 | self.state = self.states.pop()
323 | else:
324 | if self.canonical or self.column > self.best_width:
325 | self.write_indent()
326 | if not self.canonical and self.check_simple_key():
327 | self.states.append(self.expect_flow_mapping_simple_value)
328 | self.expect_node(mapping=True, simple_key=True)
329 | else:
330 | self.write_indicator(u'?', True)
331 | self.states.append(self.expect_flow_mapping_value)
332 | self.expect_node(mapping=True)
333 |
334 | def expect_flow_mapping_key(self):
335 | if isinstance(self.event, MappingEndEvent):
336 | self.indent = self.indents.pop()
337 | self.flow_level -= 1
338 | if self.canonical:
339 | self.write_indicator(u',', False)
340 | self.write_indent()
341 | self.write_indicator(u'}', False)
342 | self.state = self.states.pop()
343 | else:
344 | self.write_indicator(u',', False)
345 | if self.canonical or self.column > self.best_width:
346 | self.write_indent()
347 | if not self.canonical and self.check_simple_key():
348 | self.states.append(self.expect_flow_mapping_simple_value)
349 | self.expect_node(mapping=True, simple_key=True)
350 | else:
351 | self.write_indicator(u'?', True)
352 | self.states.append(self.expect_flow_mapping_value)
353 | self.expect_node(mapping=True)
354 |
355 | def expect_flow_mapping_simple_value(self):
356 | self.write_indicator(u':', False)
357 | self.states.append(self.expect_flow_mapping_key)
358 | self.expect_node(mapping=True)
359 |
360 | def expect_flow_mapping_value(self):
361 | if self.canonical or self.column > self.best_width:
362 | self.write_indent()
363 | self.write_indicator(u':', True)
364 | self.states.append(self.expect_flow_mapping_key)
365 | self.expect_node(mapping=True)
366 |
367 | # Block sequence handlers.
368 |
369 | def expect_block_sequence(self):
370 | indentless = (self.mapping_context and not self.indention)
371 | self.increase_indent(flow=False, indentless=indentless)
372 | self.state = self.expect_first_block_sequence_item
373 |
374 | def expect_first_block_sequence_item(self):
375 | return self.expect_block_sequence_item(first=True)
376 |
377 | def expect_block_sequence_item(self, first=False):
378 | if not first and isinstance(self.event, SequenceEndEvent):
379 | self.indent = self.indents.pop()
380 | self.state = self.states.pop()
381 | else:
382 | self.write_indent()
383 | self.write_indicator(u'-', True, indention=True)
384 | self.states.append(self.expect_block_sequence_item)
385 | self.expect_node(sequence=True)
386 |
387 | # Block mapping handlers.
388 |
389 | def expect_block_mapping(self):
390 | self.increase_indent(flow=False)
391 | self.state = self.expect_first_block_mapping_key
392 |
393 | def expect_first_block_mapping_key(self):
394 | return self.expect_block_mapping_key(first=True)
395 |
396 | def expect_block_mapping_key(self, first=False):
397 | if not first and isinstance(self.event, MappingEndEvent):
398 | self.indent = self.indents.pop()
399 | self.state = self.states.pop()
400 | else:
401 | self.write_indent()
402 | if self.check_simple_key():
403 | self.states.append(self.expect_block_mapping_simple_value)
404 | self.expect_node(mapping=True, simple_key=True)
405 | else:
406 | self.write_indicator(u'?', True, indention=True)
407 | self.states.append(self.expect_block_mapping_value)
408 | self.expect_node(mapping=True)
409 |
410 | def expect_block_mapping_simple_value(self):
411 | self.write_indicator(u':', False)
412 | self.states.append(self.expect_block_mapping_key)
413 | self.expect_node(mapping=True)
414 |
415 | def expect_block_mapping_value(self):
416 | self.write_indent()
417 | self.write_indicator(u':', True, indention=True)
418 | self.states.append(self.expect_block_mapping_key)
419 | self.expect_node(mapping=True)
420 |
421 | # Checkers.
422 |
423 | def check_empty_sequence(self):
424 | return (isinstance(self.event, SequenceStartEvent) and self.events
425 | and isinstance(self.events[0], SequenceEndEvent))
426 |
427 | def check_empty_mapping(self):
428 | return (isinstance(self.event, MappingStartEvent) and self.events
429 | and isinstance(self.events[0], MappingEndEvent))
430 |
431 | def check_empty_document(self):
432 | if not isinstance(self.event, DocumentStartEvent) or not self.events:
433 | return False
434 | event = self.events[0]
435 | return (isinstance(event, ScalarEvent) and event.anchor is None
436 | and event.tag is None and event.implicit and event.value == u'')
437 |
438 | def check_simple_key(self):
439 | length = 0
440 | if isinstance(self.event, NodeEvent) and self.event.anchor is not None:
441 | if self.prepared_anchor is None:
442 | self.prepared_anchor = self.prepare_anchor(self.event.anchor)
443 | length += len(self.prepared_anchor)
444 | if isinstance(self.event, (ScalarEvent, CollectionStartEvent)) \
445 | and self.event.tag is not None:
446 | if self.prepared_tag is None:
447 | self.prepared_tag = self.prepare_tag(self.event.tag)
448 | length += len(self.prepared_tag)
449 | if isinstance(self.event, ScalarEvent):
450 | if self.analysis is None:
451 | self.analysis = self.analyze_scalar(self.event.value)
452 | length += len(self.analysis.scalar)
453 | return (length < 128 and (isinstance(self.event, AliasEvent)
454 | or (isinstance(self.event, ScalarEvent)
455 | and not self.analysis.empty and not self.analysis.multiline)
456 | or self.check_empty_sequence() or self.check_empty_mapping()))
457 |
458 | # Anchor, Tag, and Scalar processors.
459 |
460 | def process_anchor(self, indicator):
461 | if self.event.anchor is None:
462 | self.prepared_anchor = None
463 | return
464 | if self.prepared_anchor is None:
465 | self.prepared_anchor = self.prepare_anchor(self.event.anchor)
466 | if self.prepared_anchor:
467 | self.write_indicator(indicator+self.prepared_anchor, True)
468 | self.prepared_anchor = None
469 |
470 | def process_tag(self):
471 | tag = self.event.tag
472 | if isinstance(self.event, ScalarEvent):
473 | if self.style is None:
474 | self.style = self.choose_scalar_style()
475 | if ((not self.canonical or tag is None) and
476 | ((self.style == '' and self.event.implicit[0])
477 | or (self.style != '' and self.event.implicit[1]))):
478 | self.prepared_tag = None
479 | return
480 | if self.event.implicit[0] and tag is None:
481 | tag = u'!'
482 | self.prepared_tag = None
483 | else:
484 | if (not self.canonical or tag is None) and self.event.implicit:
485 | self.prepared_tag = None
486 | return
487 | if tag is None:
488 | raise EmitterError("tag is not specified")
489 | if self.prepared_tag is None:
490 | self.prepared_tag = self.prepare_tag(tag)
491 | if self.prepared_tag:
492 | self.write_indicator(self.prepared_tag, True)
493 | self.prepared_tag = None
494 |
495 | def choose_scalar_style(self):
496 | if self.analysis is None:
497 | self.analysis = self.analyze_scalar(self.event.value)
498 | if self.event.style == '"' or self.canonical:
499 | return '"'
500 | if not self.event.style and self.event.implicit[0]:
501 | if (not (self.simple_key_context and
502 | (self.analysis.empty or self.analysis.multiline))
503 | and (self.flow_level and self.analysis.allow_flow_plain
504 | or (not self.flow_level and self.analysis.allow_block_plain))):
505 | return ''
506 | if self.event.style and self.event.style in '|>':
507 | if (not self.flow_level and not self.simple_key_context
508 | and self.analysis.allow_block):
509 | return self.event.style
510 | if not self.event.style or self.event.style == '\'':
511 | if (self.analysis.allow_single_quoted and
512 | not (self.simple_key_context and self.analysis.multiline)):
513 | return '\''
514 | return '"'
515 |
516 | def process_scalar(self):
517 | if self.analysis is None:
518 | self.analysis = self.analyze_scalar(self.event.value)
519 | if self.style is None:
520 | self.style = self.choose_scalar_style()
521 | split = (not self.simple_key_context)
522 | #if self.analysis.multiline and split \
523 | # and (not self.style or self.style in '\'\"'):
524 | # self.write_indent()
525 | if self.style == '"':
526 | self.write_double_quoted(self.analysis.scalar, split)
527 | elif self.style == '\'':
528 | self.write_single_quoted(self.analysis.scalar, split)
529 | elif self.style == '>':
530 | self.write_folded(self.analysis.scalar)
531 | elif self.style == '|':
532 | self.write_literal(self.analysis.scalar)
533 | else:
534 | self.write_plain(self.analysis.scalar, split)
535 | self.analysis = None
536 | self.style = None
537 |
538 | # Analyzers.
539 |
540 | def prepare_version(self, version):
541 | major, minor = version
542 | if major != 1:
543 | raise EmitterError("unsupported YAML version: %d.%d" % (major, minor))
544 | return u'%d.%d' % (major, minor)
545 |
546 | def prepare_tag_handle(self, handle):
547 | if not handle:
548 | raise EmitterError("tag handle must not be empty")
549 | if handle[0] != u'!' or handle[-1] != u'!':
550 | raise EmitterError("tag handle must start and end with '!': %r"
551 | % (handle.encode('utf-8')))
552 | for ch in handle[1:-1]:
553 | if not (u'0' <= ch <= u'9' or u'A' <= ch <= u'Z' or u'a' <= ch <= u'z' \
554 | or ch in u'-_'):
555 | raise EmitterError("invalid character %r in the tag handle: %r"
556 | % (ch.encode('utf-8'), handle.encode('utf-8')))
557 | return handle
558 |
559 | def prepare_tag_prefix(self, prefix):
560 | if not prefix:
561 | raise EmitterError("tag prefix must not be empty")
562 | chunks = []
563 | start = end = 0
564 | if prefix[0] == u'!':
565 | end = 1
566 | while end < len(prefix):
567 | ch = prefix[end]
568 | if u'0' <= ch <= u'9' or u'A' <= ch <= u'Z' or u'a' <= ch <= u'z' \
569 | or ch in u'-;/?!:@&=+$,_.~*\'()[]':
570 | end += 1
571 | else:
572 | if start < end:
573 | chunks.append(prefix[start:end])
574 | start = end = end+1
575 | data = ch.encode('utf-8')
576 | for ch in data:
577 | chunks.append(u'%%%02X' % ord(ch))
578 | if start < end:
579 | chunks.append(prefix[start:end])
580 | return u''.join(chunks)
581 |
582 | def prepare_tag(self, tag):
583 | if not tag:
584 | raise EmitterError("tag must not be empty")
585 | if tag == u'!':
586 | return tag
587 | handle = None
588 | suffix = tag
589 | prefixes = self.tag_prefixes.keys()
590 | prefixes.sort()
591 | for prefix in prefixes:
592 | if tag.startswith(prefix) \
593 | and (prefix == u'!' or len(prefix) < len(tag)):
594 | handle = self.tag_prefixes[prefix]
595 | suffix = tag[len(prefix):]
596 | chunks = []
597 | start = end = 0
598 | while end < len(suffix):
599 | ch = suffix[end]
600 | if u'0' <= ch <= u'9' or u'A' <= ch <= u'Z' or u'a' <= ch <= u'z' \
601 | or ch in u'-;/?:@&=+$,_.~*\'()[]' \
602 | or (ch == u'!' and handle != u'!'):
603 | end += 1
604 | else:
605 | if start < end:
606 | chunks.append(suffix[start:end])
607 | start = end = end+1
608 | data = ch.encode('utf-8')
609 | for ch in data:
610 | chunks.append(u'%%%02X' % ord(ch))
611 | if start < end:
612 | chunks.append(suffix[start:end])
613 | suffix_text = u''.join(chunks)
614 | if handle:
615 | return u'%s%s' % (handle, suffix_text)
616 | else:
617 | return u'!<%s>' % suffix_text
618 |
619 | def prepare_anchor(self, anchor):
620 | if not anchor:
621 | raise EmitterError("anchor must not be empty")
622 | for ch in anchor:
623 | if not (u'0' <= ch <= u'9' or u'A' <= ch <= u'Z' or u'a' <= ch <= u'z' \
624 | or ch in u'-_'):
625 | raise EmitterError("invalid character %r in the anchor: %r"
626 | % (ch.encode('utf-8'), anchor.encode('utf-8')))
627 | return anchor
628 |
629 | def analyze_scalar(self, scalar):
630 |
631 | # Empty scalar is a special case.
632 | if not scalar:
633 | return ScalarAnalysis(scalar=scalar, empty=True, multiline=False,
634 | allow_flow_plain=False, allow_block_plain=True,
635 | allow_single_quoted=True, allow_double_quoted=True,
636 | allow_block=False)
637 |
638 | # Indicators and special characters.
639 | block_indicators = False
640 | flow_indicators = False
641 | line_breaks = False
642 | special_characters = False
643 |
644 | # Important whitespace combinations.
645 | leading_space = False
646 | leading_break = False
647 | trailing_space = False
648 | trailing_break = False
649 | break_space = False
650 | space_break = False
651 |
652 | # Check document indicators.
653 | if scalar.startswith(u'---') or scalar.startswith(u'...'):
654 | block_indicators = True
655 | flow_indicators = True
656 |
657 | # First character or preceded by a whitespace.
658 | preceeded_by_whitespace = True
659 |
660 | # Last character or followed by a whitespace.
661 | followed_by_whitespace = (len(scalar) == 1 or
662 | scalar[1] in u'\0 \t\r\n\x85\u2028\u2029')
663 |
664 | # The previous character is a space.
665 | previous_space = False
666 |
667 | # The previous character is a break.
668 | previous_break = False
669 |
670 | index = 0
671 | while index < len(scalar):
672 | ch = scalar[index]
673 |
674 | # Check for indicators.
675 | if index == 0:
676 | # Leading indicators are special characters.
677 | if ch in u'#,[]{}&*!|>\'\"%@`':
678 | flow_indicators = True
679 | block_indicators = True
680 | if ch in u'?:':
681 | flow_indicators = True
682 | if followed_by_whitespace:
683 | block_indicators = True
684 | if ch == u'-' and followed_by_whitespace:
685 | flow_indicators = True
686 | block_indicators = True
687 | else:
688 |                 # Some indicators cannot appear within a scalar either.
689 | if ch in u',?[]{}':
690 | flow_indicators = True
691 | if ch == u':':
692 | flow_indicators = True
693 | if followed_by_whitespace:
694 | block_indicators = True
695 | if ch == u'#' and preceeded_by_whitespace:
696 | flow_indicators = True
697 | block_indicators = True
698 |
699 | # Check for line breaks, special, and unicode characters.
700 | if ch in u'\n\x85\u2028\u2029':
701 | line_breaks = True
702 | if not (ch == u'\n' or u'\x20' <= ch <= u'\x7E'):
703 | if (ch == u'\x85' or u'\xA0' <= ch <= u'\uD7FF'
704 | or u'\uE000' <= ch <= u'\uFFFD') and ch != u'\uFEFF':
705 | unicode_characters = True
706 | if not self.allow_unicode:
707 | special_characters = True
708 | else:
709 | special_characters = True
710 |
711 | # Detect important whitespace combinations.
712 | if ch == u' ':
713 | if index == 0:
714 | leading_space = True
715 | if index == len(scalar)-1:
716 | trailing_space = True
717 | if previous_break:
718 | break_space = True
719 | previous_space = True
720 | previous_break = False
721 | elif ch in u'\n\x85\u2028\u2029':
722 | if index == 0:
723 | leading_break = True
724 | if index == len(scalar)-1:
725 | trailing_break = True
726 | if previous_space:
727 | space_break = True
728 | previous_space = False
729 | previous_break = True
730 | else:
731 | previous_space = False
732 | previous_break = False
733 |
734 | # Prepare for the next character.
735 | index += 1
736 | preceeded_by_whitespace = (ch in u'\0 \t\r\n\x85\u2028\u2029')
737 | followed_by_whitespace = (index+1 >= len(scalar) or
738 | scalar[index+1] in u'\0 \t\r\n\x85\u2028\u2029')
739 |
740 | # Let's decide what styles are allowed.
741 | allow_flow_plain = True
742 | allow_block_plain = True
743 | allow_single_quoted = True
744 | allow_double_quoted = True
745 | allow_block = True
746 |
747 | # Leading and trailing whitespaces are bad for plain scalars.
748 | if (leading_space or leading_break
749 | or trailing_space or trailing_break):
750 | allow_flow_plain = allow_block_plain = False
751 |
752 | # We do not permit trailing spaces for block scalars.
753 | if trailing_space:
754 | allow_block = False
755 |
756 | # Spaces at the beginning of a new line are only acceptable for block
757 | # scalars.
758 | if break_space:
759 | allow_flow_plain = allow_block_plain = allow_single_quoted = False
760 |
761 |         # Spaces followed by breaks, as well as special characters, are only
762 |         # allowed for double quoted scalars.
763 | if space_break or special_characters:
764 | allow_flow_plain = allow_block_plain = \
765 | allow_single_quoted = allow_block = False
766 |
767 | # Although the plain scalar writer supports breaks, we never emit
768 | # multiline plain scalars.
769 | if line_breaks:
770 | allow_flow_plain = allow_block_plain = False
771 |
772 | # Flow indicators are forbidden for flow plain scalars.
773 | if flow_indicators:
774 | allow_flow_plain = False
775 |
776 | # Block indicators are forbidden for block plain scalars.
777 | if block_indicators:
778 | allow_block_plain = False
779 |
780 | return ScalarAnalysis(scalar=scalar,
781 | empty=False, multiline=line_breaks,
782 | allow_flow_plain=allow_flow_plain,
783 | allow_block_plain=allow_block_plain,
784 | allow_single_quoted=allow_single_quoted,
785 | allow_double_quoted=allow_double_quoted,
786 | allow_block=allow_block)
787 |
788 | # Writers.
789 |
790 | def flush_stream(self):
791 | if hasattr(self.stream, 'flush'):
792 | self.stream.flush()
793 |
794 | def write_stream_start(self):
795 | # Write BOM if needed.
796 | if self.encoding and self.encoding.startswith('utf-16'):
797 | self.stream.write(u'\uFEFF'.encode(self.encoding))
798 |
799 | def write_stream_end(self):
800 | self.flush_stream()
801 |
802 | def write_indicator(self, indicator, need_whitespace,
803 | whitespace=False, indention=False):
804 | if self.whitespace or not need_whitespace:
805 | data = indicator
806 | else:
807 | data = u' '+indicator
808 | self.whitespace = whitespace
809 | self.indention = self.indention and indention
810 | self.column += len(data)
811 | self.open_ended = False
812 | if self.encoding:
813 | data = data.encode(self.encoding)
814 | self.stream.write(data)
815 |
816 | def write_indent(self):
817 | indent = self.indent or 0
818 | if not self.indention or self.column > indent \
819 | or (self.column == indent and not self.whitespace):
820 | self.write_line_break()
821 | if self.column < indent:
822 | self.whitespace = True
823 | data = u' '*(indent-self.column)
824 | self.column = indent
825 | if self.encoding:
826 | data = data.encode(self.encoding)
827 | self.stream.write(data)
828 |
829 | def write_line_break(self, data=None):
830 | if data is None:
831 | data = self.best_line_break
832 | self.whitespace = True
833 | self.indention = True
834 | self.line += 1
835 | self.column = 0
836 | if self.encoding:
837 | data = data.encode(self.encoding)
838 | self.stream.write(data)
839 |
840 | def write_version_directive(self, version_text):
841 | data = u'%%YAML %s' % version_text
842 | if self.encoding:
843 | data = data.encode(self.encoding)
844 | self.stream.write(data)
845 | self.write_line_break()
846 |
847 | def write_tag_directive(self, handle_text, prefix_text):
848 | data = u'%%TAG %s %s' % (handle_text, prefix_text)
849 | if self.encoding:
850 | data = data.encode(self.encoding)
851 | self.stream.write(data)
852 | self.write_line_break()
853 |
854 | # Scalar streams.
855 |
856 | def write_single_quoted(self, text, split=True):
857 | self.write_indicator(u'\'', True)
858 | spaces = False
859 | breaks = False
860 | start = end = 0
861 | while end <= len(text):
862 | ch = None
863 | if end < len(text):
864 | ch = text[end]
865 | if spaces:
866 | if ch is None or ch != u' ':
867 | if start+1 == end and self.column > self.best_width and split \
868 | and start != 0 and end != len(text):
869 | self.write_indent()
870 | else:
871 | data = text[start:end]
872 | self.column += len(data)
873 | if self.encoding:
874 | data = data.encode(self.encoding)
875 | self.stream.write(data)
876 | start = end
877 | elif breaks:
878 | if ch is None or ch not in u'\n\x85\u2028\u2029':
879 | if text[start] == u'\n':
880 | self.write_line_break()
881 | for br in text[start:end]:
882 | if br == u'\n':
883 | self.write_line_break()
884 | else:
885 | self.write_line_break(br)
886 | self.write_indent()
887 | start = end
888 | else:
889 | if ch is None or ch in u' \n\x85\u2028\u2029' or ch == u'\'':
890 | if start < end:
891 | data = text[start:end]
892 | self.column += len(data)
893 | if self.encoding:
894 | data = data.encode(self.encoding)
895 | self.stream.write(data)
896 | start = end
897 | if ch == u'\'':
898 | data = u'\'\''
899 | self.column += 2
900 | if self.encoding:
901 | data = data.encode(self.encoding)
902 | self.stream.write(data)
903 | start = end + 1
904 | if ch is not None:
905 | spaces = (ch == u' ')
906 | breaks = (ch in u'\n\x85\u2028\u2029')
907 | end += 1
908 | self.write_indicator(u'\'', False)
909 |
910 | ESCAPE_REPLACEMENTS = {
911 | u'\0': u'0',
912 | u'\x07': u'a',
913 | u'\x08': u'b',
914 | u'\x09': u't',
915 | u'\x0A': u'n',
916 | u'\x0B': u'v',
917 | u'\x0C': u'f',
918 | u'\x0D': u'r',
919 | u'\x1B': u'e',
920 | u'\"': u'\"',
921 | u'\\': u'\\',
922 | u'\x85': u'N',
923 | u'\xA0': u'_',
924 | u'\u2028': u'L',
925 | u'\u2029': u'P',
926 | }
927 |
928 | def write_double_quoted(self, text, split=True):
929 | self.write_indicator(u'"', True)
930 | start = end = 0
931 | while end <= len(text):
932 | ch = None
933 | if end < len(text):
934 | ch = text[end]
935 | if ch is None or ch in u'"\\\x85\u2028\u2029\uFEFF' \
936 | or not (u'\x20' <= ch <= u'\x7E'
937 | or (self.allow_unicode
938 | and (u'\xA0' <= ch <= u'\uD7FF'
939 | or u'\uE000' <= ch <= u'\uFFFD'))):
940 | if start < end:
941 | data = text[start:end]
942 | self.column += len(data)
943 | if self.encoding:
944 | data = data.encode(self.encoding)
945 | self.stream.write(data)
946 | start = end
947 | if ch is not None:
948 | if ch in self.ESCAPE_REPLACEMENTS:
949 | data = u'\\'+self.ESCAPE_REPLACEMENTS[ch]
950 | elif ch <= u'\xFF':
951 | data = u'\\x%02X' % ord(ch)
952 | elif ch <= u'\uFFFF':
953 | data = u'\\u%04X' % ord(ch)
954 | else:
955 | data = u'\\U%08X' % ord(ch)
956 | self.column += len(data)
957 | if self.encoding:
958 | data = data.encode(self.encoding)
959 | self.stream.write(data)
960 | start = end+1
961 | if 0 < end < len(text)-1 and (ch == u' ' or start >= end) \
962 | and self.column+(end-start) > self.best_width and split:
963 | data = text[start:end]+u'\\'
964 | if start < end:
965 | start = end
966 | self.column += len(data)
967 | if self.encoding:
968 | data = data.encode(self.encoding)
969 | self.stream.write(data)
970 | self.write_indent()
971 | self.whitespace = False
972 | self.indention = False
973 | if text[start] == u' ':
974 | data = u'\\'
975 | self.column += len(data)
976 | if self.encoding:
977 | data = data.encode(self.encoding)
978 | self.stream.write(data)
979 | end += 1
980 | self.write_indicator(u'"', False)
981 |
982 | def determine_block_hints(self, text):
983 | hints = u''
984 | if text:
985 | if text[0] in u' \n\x85\u2028\u2029':
986 | hints += unicode(self.best_indent)
987 | if text[-1] not in u'\n\x85\u2028\u2029':
988 | hints += u'-'
989 | elif len(text) == 1 or text[-2] in u'\n\x85\u2028\u2029':
990 | hints += u'+'
991 | return hints
992 |
993 | def write_folded(self, text):
994 | hints = self.determine_block_hints(text)
995 | self.write_indicator(u'>'+hints, True)
996 | if hints[-1:] == u'+':
997 | self.open_ended = True
998 | self.write_line_break()
999 | leading_space = True
1000 | spaces = False
1001 | breaks = True
1002 | start = end = 0
1003 | while end <= len(text):
1004 | ch = None
1005 | if end < len(text):
1006 | ch = text[end]
1007 | if breaks:
1008 | if ch is None or ch not in u'\n\x85\u2028\u2029':
1009 | if not leading_space and ch is not None and ch != u' ' \
1010 | and text[start] == u'\n':
1011 | self.write_line_break()
1012 | leading_space = (ch == u' ')
1013 | for br in text[start:end]:
1014 | if br == u'\n':
1015 | self.write_line_break()
1016 | else:
1017 | self.write_line_break(br)
1018 | if ch is not None:
1019 | self.write_indent()
1020 | start = end
1021 | elif spaces:
1022 | if ch != u' ':
1023 | if start+1 == end and self.column > self.best_width:
1024 | self.write_indent()
1025 | else:
1026 | data = text[start:end]
1027 | self.column += len(data)
1028 | if self.encoding:
1029 | data = data.encode(self.encoding)
1030 | self.stream.write(data)
1031 | start = end
1032 | else:
1033 | if ch is None or ch in u' \n\x85\u2028\u2029':
1034 | data = text[start:end]
1035 | self.column += len(data)
1036 | if self.encoding:
1037 | data = data.encode(self.encoding)
1038 | self.stream.write(data)
1039 | if ch is None:
1040 | self.write_line_break()
1041 | start = end
1042 | if ch is not None:
1043 | breaks = (ch in u'\n\x85\u2028\u2029')
1044 | spaces = (ch == u' ')
1045 | end += 1
1046 |
1047 | def write_literal(self, text):
1048 | hints = self.determine_block_hints(text)
1049 | self.write_indicator(u'|'+hints, True)
1050 | if hints[-1:] == u'+':
1051 | self.open_ended = True
1052 | self.write_line_break()
1053 | breaks = True
1054 | start = end = 0
1055 | while end <= len(text):
1056 | ch = None
1057 | if end < len(text):
1058 | ch = text[end]
1059 | if breaks:
1060 | if ch is None or ch not in u'\n\x85\u2028\u2029':
1061 | for br in text[start:end]:
1062 | if br == u'\n':
1063 | self.write_line_break()
1064 | else:
1065 | self.write_line_break(br)
1066 | if ch is not None:
1067 | self.write_indent()
1068 | start = end
1069 | else:
1070 | if ch is None or ch in u'\n\x85\u2028\u2029':
1071 | data = text[start:end]
1072 | if self.encoding:
1073 | data = data.encode(self.encoding)
1074 | self.stream.write(data)
1075 | if ch is None:
1076 | self.write_line_break()
1077 | start = end
1078 | if ch is not None:
1079 | breaks = (ch in u'\n\x85\u2028\u2029')
1080 | end += 1
1081 |
1082 | def write_plain(self, text, split=True):
1083 | if self.root_context:
1084 | self.open_ended = True
1085 | if not text:
1086 | return
1087 | if not self.whitespace:
1088 | data = u' '
1089 | self.column += len(data)
1090 | if self.encoding:
1091 | data = data.encode(self.encoding)
1092 | self.stream.write(data)
1093 | self.whitespace = False
1094 | self.indention = False
1095 | spaces = False
1096 | breaks = False
1097 | start = end = 0
1098 | while end <= len(text):
1099 | ch = None
1100 | if end < len(text):
1101 | ch = text[end]
1102 | if spaces:
1103 | if ch != u' ':
1104 | if start+1 == end and self.column > self.best_width and split:
1105 | self.write_indent()
1106 | self.whitespace = False
1107 | self.indention = False
1108 | else:
1109 | data = text[start:end]
1110 | self.column += len(data)
1111 | if self.encoding:
1112 | data = data.encode(self.encoding)
1113 | self.stream.write(data)
1114 | start = end
1115 | elif breaks:
1116 | if ch not in u'\n\x85\u2028\u2029':
1117 | if text[start] == u'\n':
1118 | self.write_line_break()
1119 | for br in text[start:end]:
1120 | if br == u'\n':
1121 | self.write_line_break()
1122 | else:
1123 | self.write_line_break(br)
1124 | self.write_indent()
1125 | self.whitespace = False
1126 | self.indention = False
1127 | start = end
1128 | else:
1129 | if ch is None or ch in u' \n\x85\u2028\u2029':
1130 | data = text[start:end]
1131 | self.column += len(data)
1132 | if self.encoding:
1133 | data = data.encode(self.encoding)
1134 | self.stream.write(data)
1135 | start = end
1136 | if ch is not None:
1137 | spaces = (ch == u' ')
1138 | breaks = (ch in u'\n\x85\u2028\u2029')
1139 | end += 1
1140 |
1141 |
--------------------------------------------------------------------------------
/lib/yaml/emitter.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/emitter.pyc
--------------------------------------------------------------------------------
/lib/yaml/error.py:
--------------------------------------------------------------------------------
1 |
2 | __all__ = ['Mark', 'YAMLError', 'MarkedYAMLError']
3 |
4 | class Mark(object):
5 |
6 | def __init__(self, name, index, line, column, buffer, pointer):
7 | self.name = name
8 | self.index = index
9 | self.line = line
10 | self.column = column
11 | self.buffer = buffer
12 | self.pointer = pointer
13 |
14 | def get_snippet(self, indent=4, max_length=75):
15 | if self.buffer is None:
16 | return None
17 | head = ''
18 | start = self.pointer
19 | while start > 0 and self.buffer[start-1] not in u'\0\r\n\x85\u2028\u2029':
20 | start -= 1
21 | if self.pointer-start > max_length/2-1:
22 | head = ' ... '
23 | start += 5
24 | break
25 | tail = ''
26 | end = self.pointer
27 | while end < len(self.buffer) and self.buffer[end] not in u'\0\r\n\x85\u2028\u2029':
28 | end += 1
29 | if end-self.pointer > max_length/2-1:
30 | tail = ' ... '
31 | end -= 5
32 | break
33 | snippet = self.buffer[start:end].encode('utf-8')
34 | return ' '*indent + head + snippet + tail + '\n' \
35 | + ' '*(indent+self.pointer-start+len(head)) + '^'
36 |
37 | def __str__(self):
38 | snippet = self.get_snippet()
39 | where = " in \"%s\", line %d, column %d" \
40 | % (self.name, self.line+1, self.column+1)
41 | if snippet is not None:
42 | where += ":\n"+snippet
43 | return where
44 |
45 | class YAMLError(Exception):
46 | pass
47 |
48 | class MarkedYAMLError(YAMLError):
49 |
50 | def __init__(self, context=None, context_mark=None,
51 | problem=None, problem_mark=None, note=None):
52 | self.context = context
53 | self.context_mark = context_mark
54 | self.problem = problem
55 | self.problem_mark = problem_mark
56 | self.note = note
57 |
58 | def __str__(self):
59 | lines = []
60 | if self.context is not None:
61 | lines.append(self.context)
62 | if self.context_mark is not None \
63 | and (self.problem is None or self.problem_mark is None
64 | or self.context_mark.name != self.problem_mark.name
65 | or self.context_mark.line != self.problem_mark.line
66 | or self.context_mark.column != self.problem_mark.column):
67 | lines.append(str(self.context_mark))
68 | if self.problem is not None:
69 | lines.append(self.problem)
70 | if self.problem_mark is not None:
71 | lines.append(str(self.problem_mark))
72 | if self.note is not None:
73 | lines.append(self.note)
74 | return '\n'.join(lines)
75 |
76 |
--------------------------------------------------------------------------------
/lib/yaml/error.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/error.pyc
--------------------------------------------------------------------------------
/lib/yaml/events.py:
--------------------------------------------------------------------------------
1 |
2 | # Abstract classes.
3 |
4 | class Event(object):
5 | def __init__(self, start_mark=None, end_mark=None):
6 | self.start_mark = start_mark
7 | self.end_mark = end_mark
8 | def __repr__(self):
9 | attributes = [key for key in ['anchor', 'tag', 'implicit', 'value']
10 | if hasattr(self, key)]
11 | arguments = ', '.join(['%s=%r' % (key, getattr(self, key))
12 | for key in attributes])
13 | return '%s(%s)' % (self.__class__.__name__, arguments)
14 |
15 | class NodeEvent(Event):
16 | def __init__(self, anchor, start_mark=None, end_mark=None):
17 | self.anchor = anchor
18 | self.start_mark = start_mark
19 | self.end_mark = end_mark
20 |
21 | class CollectionStartEvent(NodeEvent):
22 | def __init__(self, anchor, tag, implicit, start_mark=None, end_mark=None,
23 | flow_style=None):
24 | self.anchor = anchor
25 | self.tag = tag
26 | self.implicit = implicit
27 | self.start_mark = start_mark
28 | self.end_mark = end_mark
29 | self.flow_style = flow_style
30 |
31 | class CollectionEndEvent(Event):
32 | pass
33 |
34 | # Implementations.
35 |
36 | class StreamStartEvent(Event):
37 | def __init__(self, start_mark=None, end_mark=None, encoding=None):
38 | self.start_mark = start_mark
39 | self.end_mark = end_mark
40 | self.encoding = encoding
41 |
42 | class StreamEndEvent(Event):
43 | pass
44 |
45 | class DocumentStartEvent(Event):
46 | def __init__(self, start_mark=None, end_mark=None,
47 | explicit=None, version=None, tags=None):
48 | self.start_mark = start_mark
49 | self.end_mark = end_mark
50 | self.explicit = explicit
51 | self.version = version
52 | self.tags = tags
53 |
54 | class DocumentEndEvent(Event):
55 | def __init__(self, start_mark=None, end_mark=None,
56 | explicit=None):
57 | self.start_mark = start_mark
58 | self.end_mark = end_mark
59 | self.explicit = explicit
60 |
61 | class AliasEvent(NodeEvent):
62 | pass
63 |
64 | class ScalarEvent(NodeEvent):
65 | def __init__(self, anchor, tag, implicit, value,
66 | start_mark=None, end_mark=None, style=None):
67 | self.anchor = anchor
68 | self.tag = tag
69 | self.implicit = implicit
70 | self.value = value
71 | self.start_mark = start_mark
72 | self.end_mark = end_mark
73 | self.style = style
74 |
75 | class SequenceStartEvent(CollectionStartEvent):
76 | pass
77 |
78 | class SequenceEndEvent(CollectionEndEvent):
79 | pass
80 |
81 | class MappingStartEvent(CollectionStartEvent):
82 | pass
83 |
84 | class MappingEndEvent(CollectionEndEvent):
85 | pass
86 |
87 |
--------------------------------------------------------------------------------
/lib/yaml/events.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/events.pyc
--------------------------------------------------------------------------------
/lib/yaml/loader.py:
--------------------------------------------------------------------------------
1 |
2 | __all__ = ['BaseLoader', 'SafeLoader', 'Loader']
3 |
4 | from reader import *
5 | from scanner import *
6 | from parser import *
7 | from composer import *
8 | from constructor import *
9 | from resolver import *
10 |
11 | class BaseLoader(Reader, Scanner, Parser, Composer, BaseConstructor, BaseResolver):
12 |
13 | def __init__(self, stream):
14 | Reader.__init__(self, stream)
15 | Scanner.__init__(self)
16 | Parser.__init__(self)
17 | Composer.__init__(self)
18 | BaseConstructor.__init__(self)
19 | BaseResolver.__init__(self)
20 |
21 | class SafeLoader(Reader, Scanner, Parser, Composer, SafeConstructor, Resolver):
22 |
23 | def __init__(self, stream):
24 | Reader.__init__(self, stream)
25 | Scanner.__init__(self)
26 | Parser.__init__(self)
27 | Composer.__init__(self)
28 | SafeConstructor.__init__(self)
29 | Resolver.__init__(self)
30 |
31 | class Loader(Reader, Scanner, Parser, Composer, Constructor, Resolver):
32 |
33 | def __init__(self, stream):
34 | Reader.__init__(self, stream)
35 | Scanner.__init__(self)
36 | Parser.__init__(self)
37 | Composer.__init__(self)
38 | Constructor.__init__(self)
39 | Resolver.__init__(self)
40 |
41 |
--------------------------------------------------------------------------------
/lib/yaml/loader.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/loader.pyc
--------------------------------------------------------------------------------
/lib/yaml/nodes.py:
--------------------------------------------------------------------------------
1 |
2 | class Node(object):
3 | def __init__(self, tag, value, start_mark, end_mark):
4 | self.tag = tag
5 | self.value = value
6 | self.start_mark = start_mark
7 | self.end_mark = end_mark
8 | def __repr__(self):
9 | value = self.value
10 | #if isinstance(value, list):
11 | # if len(value) == 0:
12 | # value = ''
13 | # elif len(value) == 1:
14 | # value = '<1 item>'
15 | # else:
16 | # value = '<%d items>' % len(value)
17 | #else:
18 | # if len(value) > 75:
19 | # value = repr(value[:70]+u' ... ')
20 | # else:
21 | # value = repr(value)
22 | value = repr(value)
23 | return '%s(tag=%r, value=%s)' % (self.__class__.__name__, self.tag, value)
24 |
25 | class ScalarNode(Node):
26 | id = 'scalar'
27 | def __init__(self, tag, value,
28 | start_mark=None, end_mark=None, style=None):
29 | self.tag = tag
30 | self.value = value
31 | self.start_mark = start_mark
32 | self.end_mark = end_mark
33 | self.style = style
34 |
35 | class CollectionNode(Node):
36 | def __init__(self, tag, value,
37 | start_mark=None, end_mark=None, flow_style=None):
38 | self.tag = tag
39 | self.value = value
40 | self.start_mark = start_mark
41 | self.end_mark = end_mark
42 | self.flow_style = flow_style
43 |
44 | class SequenceNode(CollectionNode):
45 | id = 'sequence'
46 |
47 | class MappingNode(CollectionNode):
48 | id = 'mapping'
49 |
50 |
--------------------------------------------------------------------------------
/lib/yaml/nodes.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/nodes.pyc
--------------------------------------------------------------------------------
/lib/yaml/parser.py:
--------------------------------------------------------------------------------
1 |
2 | # The following YAML grammar is LL(1) and is parsed by a recursive descent
3 | # parser.
4 | #
5 | # stream ::= STREAM-START implicit_document? explicit_document* STREAM-END
6 | # implicit_document ::= block_node DOCUMENT-END*
7 | # explicit_document ::= DIRECTIVE* DOCUMENT-START block_node? DOCUMENT-END*
8 | # block_node_or_indentless_sequence ::=
9 | # ALIAS
10 | # | properties (block_content | indentless_block_sequence)?
11 | # | block_content
12 | # | indentless_block_sequence
13 | # block_node ::= ALIAS
14 | # | properties block_content?
15 | # | block_content
16 | # flow_node ::= ALIAS
17 | # | properties flow_content?
18 | # | flow_content
19 | # properties ::= TAG ANCHOR? | ANCHOR TAG?
20 | # block_content ::= block_collection | flow_collection | SCALAR
21 | # flow_content ::= flow_collection | SCALAR
22 | # block_collection ::= block_sequence | block_mapping
23 | # flow_collection ::= flow_sequence | flow_mapping
24 | # block_sequence ::= BLOCK-SEQUENCE-START (BLOCK-ENTRY block_node?)* BLOCK-END
25 | # indentless_sequence ::= (BLOCK-ENTRY block_node?)+
26 | # block_mapping ::= BLOCK-MAPPING_START
27 | # ((KEY block_node_or_indentless_sequence?)?
28 | # (VALUE block_node_or_indentless_sequence?)?)*
29 | # BLOCK-END
30 | # flow_sequence ::= FLOW-SEQUENCE-START
31 | # (flow_sequence_entry FLOW-ENTRY)*
32 | # flow_sequence_entry?
33 | # FLOW-SEQUENCE-END
34 | # flow_sequence_entry ::= flow_node | KEY flow_node? (VALUE flow_node?)?
35 | # flow_mapping ::= FLOW-MAPPING-START
36 | # (flow_mapping_entry FLOW-ENTRY)*
37 | # flow_mapping_entry?
38 | # FLOW-MAPPING-END
39 | # flow_mapping_entry ::= flow_node | KEY flow_node? (VALUE flow_node?)?
40 | #
41 | # FIRST sets:
42 | #
43 | # stream: { STREAM-START }
44 | # explicit_document: { DIRECTIVE DOCUMENT-START }
45 | # implicit_document: FIRST(block_node)
46 | # block_node: { ALIAS TAG ANCHOR SCALAR BLOCK-SEQUENCE-START BLOCK-MAPPING-START FLOW-SEQUENCE-START FLOW-MAPPING-START }
47 | # flow_node: { ALIAS ANCHOR TAG SCALAR FLOW-SEQUENCE-START FLOW-MAPPING-START }
48 | # block_content: { BLOCK-SEQUENCE-START BLOCK-MAPPING-START FLOW-SEQUENCE-START FLOW-MAPPING-START SCALAR }
49 | # flow_content: { FLOW-SEQUENCE-START FLOW-MAPPING-START SCALAR }
50 | # block_collection: { BLOCK-SEQUENCE-START BLOCK-MAPPING-START }
51 | # flow_collection: { FLOW-SEQUENCE-START FLOW-MAPPING-START }
52 | # block_sequence: { BLOCK-SEQUENCE-START }
53 | # block_mapping: { BLOCK-MAPPING-START }
54 | # block_node_or_indentless_sequence: { ALIAS ANCHOR TAG SCALAR BLOCK-SEQUENCE-START BLOCK-MAPPING-START FLOW-SEQUENCE-START FLOW-MAPPING-START BLOCK-ENTRY }
55 | # indentless_sequence: { ENTRY }
56 | # flow_collection: { FLOW-SEQUENCE-START FLOW-MAPPING-START }
57 | # flow_sequence: { FLOW-SEQUENCE-START }
58 | # flow_mapping: { FLOW-MAPPING-START }
59 | # flow_sequence_entry: { ALIAS ANCHOR TAG SCALAR FLOW-SEQUENCE-START FLOW-MAPPING-START KEY }
60 | # flow_mapping_entry: { ALIAS ANCHOR TAG SCALAR FLOW-SEQUENCE-START FLOW-MAPPING-START KEY }
61 |
62 | __all__ = ['Parser', 'ParserError']
63 |
64 | from error import MarkedYAMLError
65 | from tokens import *
66 | from events import *
67 | from scanner import *
68 |
69 | class ParserError(MarkedYAMLError):
70 | pass
71 |
72 | class Parser(object):
73 |     # Since writing a recursive descent parser is a straightforward task, we
74 | # do not give many comments here.
75 |
76 | DEFAULT_TAGS = {
77 | u'!': u'!',
78 | u'!!': u'tag:yaml.org,2002:',
79 | }
80 |
81 | def __init__(self):
82 | self.current_event = None
83 | self.yaml_version = None
84 | self.tag_handles = {}
85 | self.states = []
86 | self.marks = []
87 | self.state = self.parse_stream_start
88 |
89 | def dispose(self):
90 | # Reset the state attributes (to clear self-references)
91 | self.states = []
92 | self.state = None
93 |
94 | def check_event(self, *choices):
95 | # Check the type of the next event.
96 | if self.current_event is None:
97 | if self.state:
98 | self.current_event = self.state()
99 | if self.current_event is not None:
100 | if not choices:
101 | return True
102 | for choice in choices:
103 | if isinstance(self.current_event, choice):
104 | return True
105 | return False
106 |
107 | def peek_event(self):
108 | # Get the next event.
109 | if self.current_event is None:
110 | if self.state:
111 | self.current_event = self.state()
112 | return self.current_event
113 |
114 | def get_event(self):
115 | # Get the next event and proceed further.
116 | if self.current_event is None:
117 | if self.state:
118 | self.current_event = self.state()
119 | value = self.current_event
120 | self.current_event = None
121 | return value
122 |
123 | # stream ::= STREAM-START implicit_document? explicit_document* STREAM-END
124 | # implicit_document ::= block_node DOCUMENT-END*
125 | # explicit_document ::= DIRECTIVE* DOCUMENT-START block_node? DOCUMENT-END*
126 |
127 | def parse_stream_start(self):
128 |
129 | # Parse the stream start.
130 | token = self.get_token()
131 | event = StreamStartEvent(token.start_mark, token.end_mark,
132 | encoding=token.encoding)
133 |
134 | # Prepare the next state.
135 | self.state = self.parse_implicit_document_start
136 |
137 | return event
138 |
139 | def parse_implicit_document_start(self):
140 |
141 | # Parse an implicit document.
142 | if not self.check_token(DirectiveToken, DocumentStartToken,
143 | StreamEndToken):
144 | self.tag_handles = self.DEFAULT_TAGS
145 | token = self.peek_token()
146 | start_mark = end_mark = token.start_mark
147 | event = DocumentStartEvent(start_mark, end_mark,
148 | explicit=False)
149 |
150 | # Prepare the next state.
151 | self.states.append(self.parse_document_end)
152 | self.state = self.parse_block_node
153 |
154 | return event
155 |
156 | else:
157 | return self.parse_document_start()
158 |
159 | def parse_document_start(self):
160 |
161 | # Parse any extra document end indicators.
162 | while self.check_token(DocumentEndToken):
163 | self.get_token()
164 |
165 | # Parse an explicit document.
166 | if not self.check_token(StreamEndToken):
167 | token = self.peek_token()
168 | start_mark = token.start_mark
169 | version, tags = self.process_directives()
170 | if not self.check_token(DocumentStartToken):
171 | raise ParserError(None, None,
172 | "expected '<document start>', but found %r"
173 | % self.peek_token().id,
174 | self.peek_token().start_mark)
175 | token = self.get_token()
176 | end_mark = token.end_mark
177 | event = DocumentStartEvent(start_mark, end_mark,
178 | explicit=True, version=version, tags=tags)
179 | self.states.append(self.parse_document_end)
180 | self.state = self.parse_document_content
181 | else:
182 | # Parse the end of the stream.
183 | token = self.get_token()
184 | event = StreamEndEvent(token.start_mark, token.end_mark)
185 | assert not self.states
186 | assert not self.marks
187 | self.state = None
188 | return event
189 |
190 | def parse_document_end(self):
191 |
192 | # Parse the document end.
193 | token = self.peek_token()
194 | start_mark = end_mark = token.start_mark
195 | explicit = False
196 | if self.check_token(DocumentEndToken):
197 | token = self.get_token()
198 | end_mark = token.end_mark
199 | explicit = True
200 | event = DocumentEndEvent(start_mark, end_mark,
201 | explicit=explicit)
202 |
203 | # Prepare the next state.
204 | self.state = self.parse_document_start
205 |
206 | return event
207 |
208 | def parse_document_content(self):
209 | if self.check_token(DirectiveToken,
210 | DocumentStartToken, DocumentEndToken, StreamEndToken):
211 | event = self.process_empty_scalar(self.peek_token().start_mark)
212 | self.state = self.states.pop()
213 | return event
214 | else:
215 | return self.parse_block_node()
216 |
217 | def process_directives(self):
218 | self.yaml_version = None
219 | self.tag_handles = {}
220 | while self.check_token(DirectiveToken):
221 | token = self.get_token()
222 | if token.name == u'YAML':
223 | if self.yaml_version is not None:
224 | raise ParserError(None, None,
225 | "found duplicate YAML directive", token.start_mark)
226 | major, minor = token.value
227 | if major != 1:
228 | raise ParserError(None, None,
229 | "found incompatible YAML document (version 1.* is required)",
230 | token.start_mark)
231 | self.yaml_version = token.value
232 | elif token.name == u'TAG':
233 | handle, prefix = token.value
234 | if handle in self.tag_handles:
235 | raise ParserError(None, None,
236 | "duplicate tag handle %r" % handle.encode('utf-8'),
237 | token.start_mark)
238 | self.tag_handles[handle] = prefix
239 | if self.tag_handles:
240 | value = self.yaml_version, self.tag_handles.copy()
241 | else:
242 | value = self.yaml_version, None
243 | for key in self.DEFAULT_TAGS:
244 | if key not in self.tag_handles:
245 | self.tag_handles[key] = self.DEFAULT_TAGS[key]
246 | return value
247 |
248 | # block_node_or_indentless_sequence ::= ALIAS
249 | # | properties (block_content | indentless_block_sequence)?
250 | # | block_content
251 | # | indentless_block_sequence
252 | # block_node ::= ALIAS
253 | # | properties block_content?
254 | # | block_content
255 | # flow_node ::= ALIAS
256 | # | properties flow_content?
257 | # | flow_content
258 | # properties ::= TAG ANCHOR? | ANCHOR TAG?
259 | # block_content ::= block_collection | flow_collection | SCALAR
260 | # flow_content ::= flow_collection | SCALAR
261 | # block_collection ::= block_sequence | block_mapping
262 | # flow_collection ::= flow_sequence | flow_mapping
263 |
264 | def parse_block_node(self):
265 | return self.parse_node(block=True)
266 |
267 | def parse_flow_node(self):
268 | return self.parse_node()
269 |
270 | def parse_block_node_or_indentless_sequence(self):
271 | return self.parse_node(block=True, indentless_sequence=True)
272 |
273 | def parse_node(self, block=False, indentless_sequence=False):
274 | if self.check_token(AliasToken):
275 | token = self.get_token()
276 | event = AliasEvent(token.value, token.start_mark, token.end_mark)
277 | self.state = self.states.pop()
278 | else:
279 | anchor = None
280 | tag = None
281 | start_mark = end_mark = tag_mark = None
282 | if self.check_token(AnchorToken):
283 | token = self.get_token()
284 | start_mark = token.start_mark
285 | end_mark = token.end_mark
286 | anchor = token.value
287 | if self.check_token(TagToken):
288 | token = self.get_token()
289 | tag_mark = token.start_mark
290 | end_mark = token.end_mark
291 | tag = token.value
292 | elif self.check_token(TagToken):
293 | token = self.get_token()
294 | start_mark = tag_mark = token.start_mark
295 | end_mark = token.end_mark
296 | tag = token.value
297 | if self.check_token(AnchorToken):
298 | token = self.get_token()
299 | end_mark = token.end_mark
300 | anchor = token.value
301 | if tag is not None:
302 | handle, suffix = tag
303 | if handle is not None:
304 | if handle not in self.tag_handles:
305 | raise ParserError("while parsing a node", start_mark,
306 | "found undefined tag handle %r" % handle.encode('utf-8'),
307 | tag_mark)
308 | tag = self.tag_handles[handle]+suffix
309 | else:
310 | tag = suffix
311 | #if tag == u'!':
312 | # raise ParserError("while parsing a node", start_mark,
313 | # "found non-specific tag '!'", tag_mark,
314 | # "Please check 'http://pyyaml.org/wiki/YAMLNonSpecificTag' and share your opinion.")
315 | if start_mark is None:
316 | start_mark = end_mark = self.peek_token().start_mark
317 | event = None
318 | implicit = (tag is None or tag == u'!')
319 | if indentless_sequence and self.check_token(BlockEntryToken):
320 | end_mark = self.peek_token().end_mark
321 | event = SequenceStartEvent(anchor, tag, implicit,
322 | start_mark, end_mark)
323 | self.state = self.parse_indentless_sequence_entry
324 | else:
325 | if self.check_token(ScalarToken):
326 | token = self.get_token()
327 | end_mark = token.end_mark
328 | if (token.plain and tag is None) or tag == u'!':
329 | implicit = (True, False)
330 | elif tag is None:
331 | implicit = (False, True)
332 | else:
333 | implicit = (False, False)
334 | event = ScalarEvent(anchor, tag, implicit, token.value,
335 | start_mark, end_mark, style=token.style)
336 | self.state = self.states.pop()
337 | elif self.check_token(FlowSequenceStartToken):
338 | end_mark = self.peek_token().end_mark
339 | event = SequenceStartEvent(anchor, tag, implicit,
340 | start_mark, end_mark, flow_style=True)
341 | self.state = self.parse_flow_sequence_first_entry
342 | elif self.check_token(FlowMappingStartToken):
343 | end_mark = self.peek_token().end_mark
344 | event = MappingStartEvent(anchor, tag, implicit,
345 | start_mark, end_mark, flow_style=True)
346 | self.state = self.parse_flow_mapping_first_key
347 | elif block and self.check_token(BlockSequenceStartToken):
348 | end_mark = self.peek_token().start_mark
349 | event = SequenceStartEvent(anchor, tag, implicit,
350 | start_mark, end_mark, flow_style=False)
351 | self.state = self.parse_block_sequence_first_entry
352 | elif block and self.check_token(BlockMappingStartToken):
353 | end_mark = self.peek_token().start_mark
354 | event = MappingStartEvent(anchor, tag, implicit,
355 | start_mark, end_mark, flow_style=False)
356 | self.state = self.parse_block_mapping_first_key
357 | elif anchor is not None or tag is not None:
358 | # Empty scalars are allowed even if a tag or an anchor is
359 | # specified.
360 | event = ScalarEvent(anchor, tag, (implicit, False), u'',
361 | start_mark, end_mark)
362 | self.state = self.states.pop()
363 | else:
364 | if block:
365 | node = 'block'
366 | else:
367 | node = 'flow'
368 | token = self.peek_token()
369 | raise ParserError("while parsing a %s node" % node, start_mark,
370 | "expected the node content, but found %r" % token.id,
371 | token.start_mark)
372 | return event
373 |
374 | # block_sequence ::= BLOCK-SEQUENCE-START (BLOCK-ENTRY block_node?)* BLOCK-END
375 |
376 | def parse_block_sequence_first_entry(self):
377 | token = self.get_token()
378 | self.marks.append(token.start_mark)
379 | return self.parse_block_sequence_entry()
380 |
381 | def parse_block_sequence_entry(self):
382 | if self.check_token(BlockEntryToken):
383 | token = self.get_token()
384 | if not self.check_token(BlockEntryToken, BlockEndToken):
385 | self.states.append(self.parse_block_sequence_entry)
386 | return self.parse_block_node()
387 | else:
388 | self.state = self.parse_block_sequence_entry
389 | return self.process_empty_scalar(token.end_mark)
390 | if not self.check_token(BlockEndToken):
391 | token = self.peek_token()
392 | raise ParserError("while parsing a block collection", self.marks[-1],
393 | "expected <block end>, but found %r" % token.id, token.start_mark)
394 | token = self.get_token()
395 | event = SequenceEndEvent(token.start_mark, token.end_mark)
396 | self.state = self.states.pop()
397 | self.marks.pop()
398 | return event
399 |
400 | # indentless_sequence ::= (BLOCK-ENTRY block_node?)+
401 |
402 | def parse_indentless_sequence_entry(self):
403 | if self.check_token(BlockEntryToken):
404 | token = self.get_token()
405 | if not self.check_token(BlockEntryToken,
406 | KeyToken, ValueToken, BlockEndToken):
407 | self.states.append(self.parse_indentless_sequence_entry)
408 | return self.parse_block_node()
409 | else:
410 | self.state = self.parse_indentless_sequence_entry
411 | return self.process_empty_scalar(token.end_mark)
412 | token = self.peek_token()
413 | event = SequenceEndEvent(token.start_mark, token.start_mark)
414 | self.state = self.states.pop()
415 | return event
416 |
417 | # block_mapping ::= BLOCK-MAPPING_START
418 | # ((KEY block_node_or_indentless_sequence?)?
419 | # (VALUE block_node_or_indentless_sequence?)?)*
420 | # BLOCK-END
421 |
422 | def parse_block_mapping_first_key(self):
423 | token = self.get_token()
424 | self.marks.append(token.start_mark)
425 | return self.parse_block_mapping_key()
426 |
427 | def parse_block_mapping_key(self):
428 | if self.check_token(KeyToken):
429 | token = self.get_token()
430 | if not self.check_token(KeyToken, ValueToken, BlockEndToken):
431 | self.states.append(self.parse_block_mapping_value)
432 | return self.parse_block_node_or_indentless_sequence()
433 | else:
434 | self.state = self.parse_block_mapping_value
435 | return self.process_empty_scalar(token.end_mark)
436 | if not self.check_token(BlockEndToken):
437 | token = self.peek_token()
438 | raise ParserError("while parsing a block mapping", self.marks[-1],
439 | "expected <block end>, but found %r" % token.id, token.start_mark)
440 | token = self.get_token()
441 | event = MappingEndEvent(token.start_mark, token.end_mark)
442 | self.state = self.states.pop()
443 | self.marks.pop()
444 | return event
445 |
446 | def parse_block_mapping_value(self):
447 | if self.check_token(ValueToken):
448 | token = self.get_token()
449 | if not self.check_token(KeyToken, ValueToken, BlockEndToken):
450 | self.states.append(self.parse_block_mapping_key)
451 | return self.parse_block_node_or_indentless_sequence()
452 | else:
453 | self.state = self.parse_block_mapping_key
454 | return self.process_empty_scalar(token.end_mark)
455 | else:
456 | self.state = self.parse_block_mapping_key
457 | token = self.peek_token()
458 | return self.process_empty_scalar(token.start_mark)
459 |
460 | # flow_sequence ::= FLOW-SEQUENCE-START
461 | # (flow_sequence_entry FLOW-ENTRY)*
462 | # flow_sequence_entry?
463 | # FLOW-SEQUENCE-END
464 | # flow_sequence_entry ::= flow_node | KEY flow_node? (VALUE flow_node?)?
465 | #
466 | # Note that while production rules for both flow_sequence_entry and
467 | # flow_mapping_entry are equal, their interpretations are different.
468 | # For `flow_sequence_entry`, the part `KEY flow_node? (VALUE flow_node?)?`
469 | # generates an inline mapping (set syntax).
470 |
471 | def parse_flow_sequence_first_entry(self):
472 | token = self.get_token()
473 | self.marks.append(token.start_mark)
474 | return self.parse_flow_sequence_entry(first=True)
475 |
476 | def parse_flow_sequence_entry(self, first=False):
477 | if not self.check_token(FlowSequenceEndToken):
478 | if not first:
479 | if self.check_token(FlowEntryToken):
480 | self.get_token()
481 | else:
482 | token = self.peek_token()
483 | raise ParserError("while parsing a flow sequence", self.marks[-1],
484 | "expected ',' or ']', but got %r" % token.id, token.start_mark)
485 |
486 | if self.check_token(KeyToken):
487 | token = self.peek_token()
488 | event = MappingStartEvent(None, None, True,
489 | token.start_mark, token.end_mark,
490 | flow_style=True)
491 | self.state = self.parse_flow_sequence_entry_mapping_key
492 | return event
493 | elif not self.check_token(FlowSequenceEndToken):
494 | self.states.append(self.parse_flow_sequence_entry)
495 | return self.parse_flow_node()
496 | token = self.get_token()
497 | event = SequenceEndEvent(token.start_mark, token.end_mark)
498 | self.state = self.states.pop()
499 | self.marks.pop()
500 | return event
501 |
502 | def parse_flow_sequence_entry_mapping_key(self):
503 | token = self.get_token()
504 | if not self.check_token(ValueToken,
505 | FlowEntryToken, FlowSequenceEndToken):
506 | self.states.append(self.parse_flow_sequence_entry_mapping_value)
507 | return self.parse_flow_node()
508 | else:
509 | self.state = self.parse_flow_sequence_entry_mapping_value
510 | return self.process_empty_scalar(token.end_mark)
511 |
512 | def parse_flow_sequence_entry_mapping_value(self):
513 | if self.check_token(ValueToken):
514 | token = self.get_token()
515 | if not self.check_token(FlowEntryToken, FlowSequenceEndToken):
516 | self.states.append(self.parse_flow_sequence_entry_mapping_end)
517 | return self.parse_flow_node()
518 | else:
519 | self.state = self.parse_flow_sequence_entry_mapping_end
520 | return self.process_empty_scalar(token.end_mark)
521 | else:
522 | self.state = self.parse_flow_sequence_entry_mapping_end
523 | token = self.peek_token()
524 | return self.process_empty_scalar(token.start_mark)
525 |
526 | def parse_flow_sequence_entry_mapping_end(self):
527 | self.state = self.parse_flow_sequence_entry
528 | token = self.peek_token()
529 | return MappingEndEvent(token.start_mark, token.start_mark)
530 |
531 | # flow_mapping ::= FLOW-MAPPING-START
532 | # (flow_mapping_entry FLOW-ENTRY)*
533 | # flow_mapping_entry?
534 | # FLOW-MAPPING-END
535 | # flow_mapping_entry ::= flow_node | KEY flow_node? (VALUE flow_node?)?
536 |
537 | def parse_flow_mapping_first_key(self):
538 | token = self.get_token()
539 | self.marks.append(token.start_mark)
540 | return self.parse_flow_mapping_key(first=True)
541 |
542 | def parse_flow_mapping_key(self, first=False):
543 | if not self.check_token(FlowMappingEndToken):
544 | if not first:
545 | if self.check_token(FlowEntryToken):
546 | self.get_token()
547 | else:
548 | token = self.peek_token()
549 | raise ParserError("while parsing a flow mapping", self.marks[-1],
550 | "expected ',' or '}', but got %r" % token.id, token.start_mark)
551 | if self.check_token(KeyToken):
552 | token = self.get_token()
553 | if not self.check_token(ValueToken,
554 | FlowEntryToken, FlowMappingEndToken):
555 | self.states.append(self.parse_flow_mapping_value)
556 | return self.parse_flow_node()
557 | else:
558 | self.state = self.parse_flow_mapping_value
559 | return self.process_empty_scalar(token.end_mark)
560 | elif not self.check_token(FlowMappingEndToken):
561 | self.states.append(self.parse_flow_mapping_empty_value)
562 | return self.parse_flow_node()
563 | token = self.get_token()
564 | event = MappingEndEvent(token.start_mark, token.end_mark)
565 | self.state = self.states.pop()
566 | self.marks.pop()
567 | return event
568 |
569 | def parse_flow_mapping_value(self):
570 | if self.check_token(ValueToken):
571 | token = self.get_token()
572 | if not self.check_token(FlowEntryToken, FlowMappingEndToken):
573 | self.states.append(self.parse_flow_mapping_key)
574 | return self.parse_flow_node()
575 | else:
576 | self.state = self.parse_flow_mapping_key
577 | return self.process_empty_scalar(token.end_mark)
578 | else:
579 | self.state = self.parse_flow_mapping_key
580 | token = self.peek_token()
581 | return self.process_empty_scalar(token.start_mark)
582 |
583 | def parse_flow_mapping_empty_value(self):
584 | self.state = self.parse_flow_mapping_key
585 | return self.process_empty_scalar(self.peek_token().start_mark)
586 |
587 | def process_empty_scalar(self, mark):
588 | return ScalarEvent(None, None, (True, False), u'', mark, mark)
589 |
590 |
--------------------------------------------------------------------------------
/lib/yaml/parser.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/parser.pyc
--------------------------------------------------------------------------------
/lib/yaml/reader.py:
--------------------------------------------------------------------------------
1 | # This module contains abstractions for the input stream. You don't have to
2 | # look further; there is no pretty code.
3 | #
4 | # We define two classes here.
5 | #
6 | # Mark(source, line, column)
7 | # It's just a record and its only use is producing nice error messages.
8 | # Parser does not use it for any other purposes.
9 | #
10 | # Reader(source, data)
11 | # Reader determines the encoding of `data` and converts it to unicode.
12 | # Reader provides the following methods and attributes:
13 | # reader.peek(length=1) - return the next `length` characters
14 | # reader.forward(length=1) - move the current position `length` characters forward.
15 | # reader.index - the number of the current character.
16 | # reader.line, reader.column - the line and the column of the current character.
17 |
18 | __all__ = ['Reader', 'ReaderError']
19 |
20 | from error import YAMLError, Mark
21 |
22 | import codecs, re
23 |
24 | class ReaderError(YAMLError):
25 |
26 | def __init__(self, name, position, character, encoding, reason):
27 | self.name = name
28 | self.character = character
29 | self.position = position
30 | self.encoding = encoding
31 | self.reason = reason
32 |
33 | def __str__(self):
34 | if isinstance(self.character, str):
35 | return "'%s' codec can't decode byte #x%02x: %s\n" \
36 | " in \"%s\", position %d" \
37 | % (self.encoding, ord(self.character), self.reason,
38 | self.name, self.position)
39 | else:
40 | return "unacceptable character #x%04x: %s\n" \
41 | " in \"%s\", position %d" \
42 | % (self.character, self.reason,
43 | self.name, self.position)
44 |
45 | class Reader(object):
46 | # Reader:
47 | # - determines the data encoding and converts it to unicode,
48 | # - checks if characters are in allowed range,
49 | # - adds '\0' to the end.
50 |
51 | # Reader accepts
52 | # - a `str` object,
53 | # - a `unicode` object,
54 | # - a file-like object with its `read` method returning `str`,
55 | # - a file-like object with its `read` method returning `unicode`.
56 |
57 | # Yeah, it's ugly and slow.
58 |
59 | def __init__(self, stream):
60 | self.name = None
61 | self.stream = None
62 | self.stream_pointer = 0
63 | self.eof = True
64 | self.buffer = u''
65 | self.pointer = 0
66 | self.raw_buffer = None
67 | self.raw_decode = None
68 | self.encoding = None
69 | self.index = 0
70 | self.line = 0
71 | self.column = 0
72 | if isinstance(stream, unicode):
73 | self.name = ""
74 | self.check_printable(stream)
75 | self.buffer = stream+u'\0'
76 | elif isinstance(stream, str):
77 | self.name = ""
78 | self.raw_buffer = stream
79 | self.determine_encoding()
80 | else:
81 | self.stream = stream
82 | self.name = getattr(stream, 'name', "")
83 | self.eof = False
84 | self.raw_buffer = ''
85 | self.determine_encoding()
86 |
87 | def peek(self, index=0):
88 | try:
89 | return self.buffer[self.pointer+index]
90 | except IndexError:
91 | self.update(index+1)
92 | return self.buffer[self.pointer+index]
93 |
94 | def prefix(self, length=1):
95 | if self.pointer+length >= len(self.buffer):
96 | self.update(length)
97 | return self.buffer[self.pointer:self.pointer+length]
98 |
99 | def forward(self, length=1):
100 | if self.pointer+length+1 >= len(self.buffer):
101 | self.update(length+1)
102 | while length:
103 | ch = self.buffer[self.pointer]
104 | self.pointer += 1
105 | self.index += 1
106 | if ch in u'\n\x85\u2028\u2029' \
107 | or (ch == u'\r' and self.buffer[self.pointer] != u'\n'):
108 | self.line += 1
109 | self.column = 0
110 | elif ch != u'\uFEFF':
111 | self.column += 1
112 | length -= 1
113 |
114 | def get_mark(self):
115 | if self.stream is None:
116 | return Mark(self.name, self.index, self.line, self.column,
117 | self.buffer, self.pointer)
118 | else:
119 | return Mark(self.name, self.index, self.line, self.column,
120 | None, None)
121 |
122 | def determine_encoding(self):
123 | while not self.eof and len(self.raw_buffer) < 2:
124 | self.update_raw()
125 | if not isinstance(self.raw_buffer, unicode):
126 | if self.raw_buffer.startswith(codecs.BOM_UTF16_LE):
127 | self.raw_decode = codecs.utf_16_le_decode
128 | self.encoding = 'utf-16-le'
129 | elif self.raw_buffer.startswith(codecs.BOM_UTF16_BE):
130 | self.raw_decode = codecs.utf_16_be_decode
131 | self.encoding = 'utf-16-be'
132 | else:
133 | self.raw_decode = codecs.utf_8_decode
134 | self.encoding = 'utf-8'
135 | self.update(1)
136 |
137 | NON_PRINTABLE = re.compile(u'[^\x09\x0A\x0D\x20-\x7E\x85\xA0-\uD7FF\uE000-\uFFFD]')
138 | def check_printable(self, data):
139 | match = self.NON_PRINTABLE.search(data)
140 | if match:
141 | character = match.group()
142 | position = self.index+(len(self.buffer)-self.pointer)+match.start()
143 | raise ReaderError(self.name, position, ord(character),
144 | 'unicode', "special characters are not allowed")
145 |
146 | def update(self, length):
147 | if self.raw_buffer is None:
148 | return
149 | self.buffer = self.buffer[self.pointer:]
150 | self.pointer = 0
151 | while len(self.buffer) < length:
152 | if not self.eof:
153 | self.update_raw()
154 | if self.raw_decode is not None:
155 | try:
156 | data, converted = self.raw_decode(self.raw_buffer,
157 | 'strict', self.eof)
158 | except UnicodeDecodeError, exc:
159 | character = exc.object[exc.start]
160 | if self.stream is not None:
161 | position = self.stream_pointer-len(self.raw_buffer)+exc.start
162 | else:
163 | position = exc.start
164 | raise ReaderError(self.name, position, character,
165 | exc.encoding, exc.reason)
166 | else:
167 | data = self.raw_buffer
168 | converted = len(data)
169 | self.check_printable(data)
170 | self.buffer += data
171 | self.raw_buffer = self.raw_buffer[converted:]
172 | if self.eof:
173 | self.buffer += u'\0'
174 | self.raw_buffer = None
175 | break
176 |
177 | def update_raw(self, size=1024):
178 | data = self.stream.read(size)
179 | if data:
180 | self.raw_buffer += data
181 | self.stream_pointer += len(data)
182 | else:
183 | self.eof = True
184 |
185 | #try:
186 | # import psyco
187 | # psyco.bind(Reader)
188 | #except ImportError:
189 | # pass
190 |
191 |
--------------------------------------------------------------------------------
/lib/yaml/reader.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/reader.pyc
--------------------------------------------------------------------------------
/lib/yaml/representer.py:
--------------------------------------------------------------------------------
1 |
2 | __all__ = ['BaseRepresenter', 'SafeRepresenter', 'Representer',
3 | 'RepresenterError']
4 |
5 | from error import *
6 | from nodes import *
7 |
8 | import datetime
9 |
10 | import sys, copy_reg, types
11 |
12 | class RepresenterError(YAMLError):
13 | pass
14 |
15 | class BaseRepresenter(object):
16 |
17 | yaml_representers = {}
18 | yaml_multi_representers = {}
19 |
20 | def __init__(self, default_style=None, default_flow_style=None):
21 | self.default_style = default_style
22 | self.default_flow_style = default_flow_style
23 | self.represented_objects = {}
24 | self.object_keeper = []
25 | self.alias_key = None
26 |
27 | def represent(self, data):
28 | node = self.represent_data(data)
29 | self.serialize(node)
30 | self.represented_objects = {}
31 | self.object_keeper = []
32 | self.alias_key = None
33 |
34 | def get_classobj_bases(self, cls):
35 | bases = [cls]
36 | for base in cls.__bases__:
37 | bases.extend(self.get_classobj_bases(base))
38 | return bases
39 |
40 | def represent_data(self, data):
41 | if self.ignore_aliases(data):
42 | self.alias_key = None
43 | else:
44 | self.alias_key = id(data)
45 | if self.alias_key is not None:
46 | if self.alias_key in self.represented_objects:
47 | node = self.represented_objects[self.alias_key]
48 | #if node is None:
49 | # raise RepresenterError("recursive objects are not allowed: %r" % data)
50 | return node
51 | #self.represented_objects[alias_key] = None
52 | self.object_keeper.append(data)
53 | data_types = type(data).__mro__
54 | if type(data) is types.InstanceType:
55 | data_types = self.get_classobj_bases(data.__class__)+list(data_types)
56 | if data_types[0] in self.yaml_representers:
57 | node = self.yaml_representers[data_types[0]](self, data)
58 | else:
59 | for data_type in data_types:
60 | if data_type in self.yaml_multi_representers:
61 | node = self.yaml_multi_representers[data_type](self, data)
62 | break
63 | else:
64 | if None in self.yaml_multi_representers:
65 | node = self.yaml_multi_representers[None](self, data)
66 | elif None in self.yaml_representers:
67 | node = self.yaml_representers[None](self, data)
68 | else:
69 | node = ScalarNode(None, unicode(data))
70 | #if alias_key is not None:
71 | # self.represented_objects[alias_key] = node
72 | return node
73 |
74 | def add_representer(cls, data_type, representer):
75 | if not 'yaml_representers' in cls.__dict__:
76 | cls.yaml_representers = cls.yaml_representers.copy()
77 | cls.yaml_representers[data_type] = representer
78 | add_representer = classmethod(add_representer)
79 |
80 | def add_multi_representer(cls, data_type, representer):
81 | if not 'yaml_multi_representers' in cls.__dict__:
82 | cls.yaml_multi_representers = cls.yaml_multi_representers.copy()
83 | cls.yaml_multi_representers[data_type] = representer
84 | add_multi_representer = classmethod(add_multi_representer)
85 |
86 | def represent_scalar(self, tag, value, style=None):
87 | if style is None:
88 | style = self.default_style
89 | node = ScalarNode(tag, value, style=style)
90 | if self.alias_key is not None:
91 | self.represented_objects[self.alias_key] = node
92 | return node
93 |
94 | def represent_sequence(self, tag, sequence, flow_style=None):
95 | value = []
96 | node = SequenceNode(tag, value, flow_style=flow_style)
97 | if self.alias_key is not None:
98 | self.represented_objects[self.alias_key] = node
99 | best_style = True
100 | for item in sequence:
101 | node_item = self.represent_data(item)
102 | if not (isinstance(node_item, ScalarNode) and not node_item.style):
103 | best_style = False
104 | value.append(node_item)
105 | if flow_style is None:
106 | if self.default_flow_style is not None:
107 | node.flow_style = self.default_flow_style
108 | else:
109 | node.flow_style = best_style
110 | return node
111 |
112 | def represent_mapping(self, tag, mapping, flow_style=None):
113 | value = []
114 | node = MappingNode(tag, value, flow_style=flow_style)
115 | if self.alias_key is not None:
116 | self.represented_objects[self.alias_key] = node
117 | best_style = True
118 | if hasattr(mapping, 'items'):
119 | mapping = mapping.items()
120 | mapping.sort()
121 | for item_key, item_value in mapping:
122 | node_key = self.represent_data(item_key)
123 | node_value = self.represent_data(item_value)
124 | if not (isinstance(node_key, ScalarNode) and not node_key.style):
125 | best_style = False
126 | if not (isinstance(node_value, ScalarNode) and not node_value.style):
127 | best_style = False
128 | value.append((node_key, node_value))
129 | if flow_style is None:
130 | if self.default_flow_style is not None:
131 | node.flow_style = self.default_flow_style
132 | else:
133 | node.flow_style = best_style
134 | return node
135 |
136 | def ignore_aliases(self, data):
137 | return False
138 |
139 | class SafeRepresenter(BaseRepresenter):
140 |
141 | def ignore_aliases(self, data):
142 | if data in [None, ()]:
143 | return True
144 | if isinstance(data, (str, unicode, bool, int, float)):
145 | return True
146 |
147 | def represent_none(self, data):
148 | return self.represent_scalar(u'tag:yaml.org,2002:null',
149 | u'null')
150 |
151 | def represent_str(self, data):
152 | tag = None
153 | style = None
154 | try:
155 | data = unicode(data, 'ascii')
156 | tag = u'tag:yaml.org,2002:str'
157 | except UnicodeDecodeError:
158 | try:
159 | data = unicode(data, 'utf-8')
160 | tag = u'tag:yaml.org,2002:str'
161 | except UnicodeDecodeError:
162 | data = data.encode('base64')
163 | tag = u'tag:yaml.org,2002:binary'
164 | style = '|'
165 | return self.represent_scalar(tag, data, style=style)
166 |
167 | def represent_unicode(self, data):
168 | return self.represent_scalar(u'tag:yaml.org,2002:str', data)
169 |
170 | def represent_bool(self, data):
171 | if data:
172 | value = u'true'
173 | else:
174 | value = u'false'
175 | return self.represent_scalar(u'tag:yaml.org,2002:bool', value)
176 |
177 | def represent_int(self, data):
178 | return self.represent_scalar(u'tag:yaml.org,2002:int', unicode(data))
179 |
180 | def represent_long(self, data):
181 | return self.represent_scalar(u'tag:yaml.org,2002:int', unicode(data))
182 |
183 | inf_value = 1e300
184 | while repr(inf_value) != repr(inf_value*inf_value):
185 | inf_value *= inf_value
186 |
187 | def represent_float(self, data):
188 | if data != data or (data == 0.0 and data == 1.0):
189 | value = u'.nan'
190 | elif data == self.inf_value:
191 | value = u'.inf'
192 | elif data == -self.inf_value:
193 | value = u'-.inf'
194 | else:
195 | value = unicode(repr(data)).lower()
196 | # Note that in some cases `repr(data)` represents a float number
197 | # without the decimal parts. For instance:
198 | # >>> repr(1e17)
199 | # '1e17'
200 | # Unfortunately, this is not a valid float representation according
201 | # to the definition of the `!!float` tag. We fix this by adding
202 | # '.0' before the 'e' symbol.
203 | if u'.' not in value and u'e' in value:
204 | value = value.replace(u'e', u'.0e', 1)
205 | return self.represent_scalar(u'tag:yaml.org,2002:float', value)
206 |
207 | def represent_list(self, data):
208 | #pairs = (len(data) > 0 and isinstance(data, list))
209 | #if pairs:
210 | # for item in data:
211 | # if not isinstance(item, tuple) or len(item) != 2:
212 | # pairs = False
213 | # break
214 | #if not pairs:
215 | return self.represent_sequence(u'tag:yaml.org,2002:seq', data)
216 | #value = []
217 | #for item_key, item_value in data:
218 | # value.append(self.represent_mapping(u'tag:yaml.org,2002:map',
219 | # [(item_key, item_value)]))
220 | #return SequenceNode(u'tag:yaml.org,2002:pairs', value)
221 |
222 | def represent_dict(self, data):
223 | return self.represent_mapping(u'tag:yaml.org,2002:map', data)
224 |
225 | def represent_set(self, data):
226 | value = {}
227 | for key in data:
228 | value[key] = None
229 | return self.represent_mapping(u'tag:yaml.org,2002:set', value)
230 |
231 | def represent_date(self, data):
232 | value = unicode(data.isoformat())
233 | return self.represent_scalar(u'tag:yaml.org,2002:timestamp', value)
234 |
235 | def represent_datetime(self, data):
236 | value = unicode(data.isoformat(' '))
237 | return self.represent_scalar(u'tag:yaml.org,2002:timestamp', value)
238 |
239 | def represent_yaml_object(self, tag, data, cls, flow_style=None):
240 | if hasattr(data, '__getstate__'):
241 | state = data.__getstate__()
242 | else:
243 | state = data.__dict__.copy()
244 | return self.represent_mapping(tag, state, flow_style=flow_style)
245 |
246 | def represent_undefined(self, data):
247 | raise RepresenterError("cannot represent an object: %s" % data)
248 |
249 | SafeRepresenter.add_representer(type(None),
250 | SafeRepresenter.represent_none)
251 |
252 | SafeRepresenter.add_representer(str,
253 | SafeRepresenter.represent_str)
254 |
255 | SafeRepresenter.add_representer(unicode,
256 | SafeRepresenter.represent_unicode)
257 |
258 | SafeRepresenter.add_representer(bool,
259 | SafeRepresenter.represent_bool)
260 |
261 | SafeRepresenter.add_representer(int,
262 | SafeRepresenter.represent_int)
263 |
264 | SafeRepresenter.add_representer(long,
265 | SafeRepresenter.represent_long)
266 |
267 | SafeRepresenter.add_representer(float,
268 | SafeRepresenter.represent_float)
269 |
270 | SafeRepresenter.add_representer(list,
271 | SafeRepresenter.represent_list)
272 |
273 | SafeRepresenter.add_representer(tuple,
274 | SafeRepresenter.represent_list)
275 |
276 | SafeRepresenter.add_representer(dict,
277 | SafeRepresenter.represent_dict)
278 |
279 | SafeRepresenter.add_representer(set,
280 | SafeRepresenter.represent_set)
281 |
282 | SafeRepresenter.add_representer(datetime.date,
283 | SafeRepresenter.represent_date)
284 |
285 | SafeRepresenter.add_representer(datetime.datetime,
286 | SafeRepresenter.represent_datetime)
287 |
288 | SafeRepresenter.add_representer(None,
289 | SafeRepresenter.represent_undefined)
290 |
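The run of `add_representer` calls above populates a type-keyed dispatch table on the class. A minimal stdlib-only sketch of the registry pattern (the names `MiniRepresenter` and `represent` are illustrative, not the actual `BaseRepresenter` API, whose definition precedes this excerpt):

```python
# Illustrative sketch of the type -> representer registry pattern;
# not the vendored BaseRepresenter implementation.
class MiniRepresenter(object):
    yaml_representers = {}

    @classmethod
    def add_representer(cls, data_type, representer):
        # copy-on-first-write so a subclass gets its own table
        # instead of mutating the parent's
        if 'yaml_representers' not in cls.__dict__:
            cls.yaml_representers = cls.yaml_representers.copy()
        cls.yaml_representers[data_type] = representer

    def represent(self, data):
        # dispatch on the exact runtime type of the value
        return self.yaml_representers[type(data)](self, data)

MiniRepresenter.add_representer(bool, lambda self, d: 'true' if d else 'false')
MiniRepresenter.add_representer(int, lambda self, d: str(d))

print(MiniRepresenter().represent(True))  # true
print(MiniRepresenter().represent(42))    # 42
```

Note that `bool` must be registered separately even though it subclasses `int`, because dispatch keys on `type(data)` exactly — the same reason the file above registers `bool` and `int` independently.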
291 | class Representer(SafeRepresenter):
292 |
293 | def represent_str(self, data):
294 | tag = None
295 | style = None
296 | try:
297 | data = unicode(data, 'ascii')
298 | tag = u'tag:yaml.org,2002:str'
299 | except UnicodeDecodeError:
300 | try:
301 | data = unicode(data, 'utf-8')
302 | tag = u'tag:yaml.org,2002:python/str'
303 | except UnicodeDecodeError:
304 | data = data.encode('base64')
305 | tag = u'tag:yaml.org,2002:binary'
306 | style = '|'
307 | return self.represent_scalar(tag, data, style=style)
308 |
309 | def represent_unicode(self, data):
310 | tag = None
311 | try:
312 | data.encode('ascii')
313 | tag = u'tag:yaml.org,2002:python/unicode'
314 | except UnicodeEncodeError:
315 | tag = u'tag:yaml.org,2002:str'
316 | return self.represent_scalar(tag, data)
317 |
318 | def represent_long(self, data):
319 | tag = u'tag:yaml.org,2002:int'
320 | if int(data) is not data:
321 | tag = u'tag:yaml.org,2002:python/long'
322 | return self.represent_scalar(tag, unicode(data))
323 |
324 | def represent_complex(self, data):
325 | if data.imag == 0.0:
326 | data = u'%r' % data.real
327 | elif data.real == 0.0:
328 | data = u'%rj' % data.imag
329 | elif data.imag > 0:
330 | data = u'%r+%rj' % (data.real, data.imag)
331 | else:
332 | data = u'%r%rj' % (data.real, data.imag)
333 | return self.represent_scalar(u'tag:yaml.org,2002:python/complex', data)
334 |
335 | def represent_tuple(self, data):
336 | return self.represent_sequence(u'tag:yaml.org,2002:python/tuple', data)
337 |
338 | def represent_name(self, data):
339 | name = u'%s.%s' % (data.__module__, data.__name__)
340 | return self.represent_scalar(u'tag:yaml.org,2002:python/name:'+name, u'')
341 |
342 | def represent_module(self, data):
343 | return self.represent_scalar(
344 | u'tag:yaml.org,2002:python/module:'+data.__name__, u'')
345 |
346 | def represent_instance(self, data):
347 | # For instances of classic classes, we use __getinitargs__ and
348 | # __getstate__ to serialize the data.
349 |
350 | # If data.__getinitargs__ exists, the object must be reconstructed by
351 |         # calling cls(*args), where args is a tuple returned by
352 | # __getinitargs__. Otherwise, the cls.__init__ method should never be
353 | # called and the class instance is created by instantiating a trivial
354 | # class and assigning to the instance's __class__ variable.
355 |
356 | # If data.__getstate__ exists, it returns the state of the object.
357 | # Otherwise, the state of the object is data.__dict__.
358 |
359 | # We produce either a !!python/object or !!python/object/new node.
360 | # If data.__getinitargs__ does not exist and state is a dictionary, we
361 |         # produce a !!python/object node. Otherwise we produce a
362 | # !!python/object/new node.
363 |
364 | cls = data.__class__
365 | class_name = u'%s.%s' % (cls.__module__, cls.__name__)
366 | args = None
367 | state = None
368 | if hasattr(data, '__getinitargs__'):
369 | args = list(data.__getinitargs__())
370 | if hasattr(data, '__getstate__'):
371 | state = data.__getstate__()
372 | else:
373 | state = data.__dict__
374 | if args is None and isinstance(state, dict):
375 | return self.represent_mapping(
376 | u'tag:yaml.org,2002:python/object:'+class_name, state)
377 | if isinstance(state, dict) and not state:
378 | return self.represent_sequence(
379 | u'tag:yaml.org,2002:python/object/new:'+class_name, args)
380 | value = {}
381 | if args:
382 | value['args'] = args
383 | value['state'] = state
384 | return self.represent_mapping(
385 | u'tag:yaml.org,2002:python/object/new:'+class_name, value)
386 |
387 | def represent_object(self, data):
388 | # We use __reduce__ API to save the data. data.__reduce__ returns
389 | # a tuple of length 2-5:
390 | # (function, args, state, listitems, dictitems)
391 |
392 |         # For reconstructing, we call function(*args), then set its state,
393 | # listitems, and dictitems if they are not None.
394 |
395 | # A special case is when function.__name__ == '__newobj__'. In this
396 | # case we create the object with args[0].__new__(*args).
397 |
398 | # Another special case is when __reduce__ returns a string - we don't
399 | # support it.
400 |
401 | # We produce a !!python/object, !!python/object/new or
402 | # !!python/object/apply node.
403 |
404 | cls = type(data)
405 | if cls in copy_reg.dispatch_table:
406 | reduce = copy_reg.dispatch_table[cls](data)
407 | elif hasattr(data, '__reduce_ex__'):
408 | reduce = data.__reduce_ex__(2)
409 | elif hasattr(data, '__reduce__'):
410 | reduce = data.__reduce__()
411 | else:
412 | raise RepresenterError("cannot represent object: %r" % data)
413 | reduce = (list(reduce)+[None]*5)[:5]
414 | function, args, state, listitems, dictitems = reduce
415 | args = list(args)
416 | if state is None:
417 | state = {}
418 | if listitems is not None:
419 | listitems = list(listitems)
420 | if dictitems is not None:
421 | dictitems = dict(dictitems)
422 | if function.__name__ == '__newobj__':
423 | function = args[0]
424 | args = args[1:]
425 | tag = u'tag:yaml.org,2002:python/object/new:'
426 | newobj = True
427 | else:
428 | tag = u'tag:yaml.org,2002:python/object/apply:'
429 | newobj = False
430 | function_name = u'%s.%s' % (function.__module__, function.__name__)
431 | if not args and not listitems and not dictitems \
432 | and isinstance(state, dict) and newobj:
433 | return self.represent_mapping(
434 | u'tag:yaml.org,2002:python/object:'+function_name, state)
435 | if not listitems and not dictitems \
436 | and isinstance(state, dict) and not state:
437 | return self.represent_sequence(tag+function_name, args)
438 | value = {}
439 | if args:
440 | value['args'] = args
441 | if state or not isinstance(state, dict):
442 | value['state'] = state
443 | if listitems:
444 | value['listitems'] = listitems
445 | if dictitems:
446 | value['dictitems'] = dictitems
447 | return self.represent_mapping(tag+function_name, value)
448 |
449 | Representer.add_representer(str,
450 | Representer.represent_str)
451 |
452 | Representer.add_representer(unicode,
453 | Representer.represent_unicode)
454 |
455 | Representer.add_representer(long,
456 | Representer.represent_long)
457 |
458 | Representer.add_representer(complex,
459 | Representer.represent_complex)
460 |
461 | Representer.add_representer(tuple,
462 | Representer.represent_tuple)
463 |
464 | Representer.add_representer(type,
465 | Representer.represent_name)
466 |
467 | Representer.add_representer(types.ClassType,
468 | Representer.represent_name)
469 |
470 | Representer.add_representer(types.FunctionType,
471 | Representer.represent_name)
472 |
473 | Representer.add_representer(types.BuiltinFunctionType,
474 | Representer.represent_name)
475 |
476 | Representer.add_representer(types.ModuleType,
477 | Representer.represent_module)
478 |
479 | Representer.add_multi_representer(types.InstanceType,
480 | Representer.represent_instance)
481 |
482 | Representer.add_multi_representer(object,
483 | Representer.represent_object)
484 |
485 |
--------------------------------------------------------------------------------
/lib/yaml/representer.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/representer.pyc
--------------------------------------------------------------------------------
/lib/yaml/resolver.py:
--------------------------------------------------------------------------------
1 |
2 | __all__ = ['BaseResolver', 'Resolver']
3 |
4 | from error import *
5 | from nodes import *
6 |
7 | import re
8 |
9 | class ResolverError(YAMLError):
10 | pass
11 |
12 | class BaseResolver(object):
13 |
14 | DEFAULT_SCALAR_TAG = u'tag:yaml.org,2002:str'
15 | DEFAULT_SEQUENCE_TAG = u'tag:yaml.org,2002:seq'
16 | DEFAULT_MAPPING_TAG = u'tag:yaml.org,2002:map'
17 |
18 | yaml_implicit_resolvers = {}
19 | yaml_path_resolvers = {}
20 |
21 | def __init__(self):
22 | self.resolver_exact_paths = []
23 | self.resolver_prefix_paths = []
24 |
25 | def add_implicit_resolver(cls, tag, regexp, first):
26 | if not 'yaml_implicit_resolvers' in cls.__dict__:
27 | cls.yaml_implicit_resolvers = cls.yaml_implicit_resolvers.copy()
28 | if first is None:
29 | first = [None]
30 | for ch in first:
31 | cls.yaml_implicit_resolvers.setdefault(ch, []).append((tag, regexp))
32 | add_implicit_resolver = classmethod(add_implicit_resolver)
33 |
34 | def add_path_resolver(cls, tag, path, kind=None):
35 | # Note: `add_path_resolver` is experimental. The API could be changed.
36 | # `new_path` is a pattern that is matched against the path from the
37 |         # root to the node that is being considered. `new_path` elements are
38 | # tuples `(node_check, index_check)`. `node_check` is a node class:
39 | # `ScalarNode`, `SequenceNode`, `MappingNode` or `None`. `None`
40 | # matches any kind of a node. `index_check` could be `None`, a boolean
41 | # value, a string value, or a number. `None` and `False` match against
42 | # any _value_ of sequence and mapping nodes. `True` matches against
43 | # any _key_ of a mapping node. A string `index_check` matches against
44 |         # a mapping value that corresponds to a scalar key whose content is
45 | # equal to the `index_check` value. An integer `index_check` matches
46 | # against a sequence value with the index equal to `index_check`.
47 | if not 'yaml_path_resolvers' in cls.__dict__:
48 | cls.yaml_path_resolvers = cls.yaml_path_resolvers.copy()
49 | new_path = []
50 | for element in path:
51 | if isinstance(element, (list, tuple)):
52 | if len(element) == 2:
53 | node_check, index_check = element
54 | elif len(element) == 1:
55 | node_check = element[0]
56 | index_check = True
57 | else:
58 | raise ResolverError("Invalid path element: %s" % element)
59 | else:
60 | node_check = None
61 | index_check = element
62 | if node_check is str:
63 | node_check = ScalarNode
64 | elif node_check is list:
65 | node_check = SequenceNode
66 | elif node_check is dict:
67 | node_check = MappingNode
68 | elif node_check not in [ScalarNode, SequenceNode, MappingNode] \
69 | and not isinstance(node_check, basestring) \
70 | and node_check is not None:
71 | raise ResolverError("Invalid node checker: %s" % node_check)
72 | if not isinstance(index_check, (basestring, int)) \
73 | and index_check is not None:
74 | raise ResolverError("Invalid index checker: %s" % index_check)
75 | new_path.append((node_check, index_check))
76 | if kind is str:
77 | kind = ScalarNode
78 | elif kind is list:
79 | kind = SequenceNode
80 | elif kind is dict:
81 | kind = MappingNode
82 | elif kind not in [ScalarNode, SequenceNode, MappingNode] \
83 | and kind is not None:
84 | raise ResolverError("Invalid node kind: %s" % kind)
85 | cls.yaml_path_resolvers[tuple(new_path), kind] = tag
86 | add_path_resolver = classmethod(add_path_resolver)
87 |
88 | def descend_resolver(self, current_node, current_index):
89 | if not self.yaml_path_resolvers:
90 | return
91 | exact_paths = {}
92 | prefix_paths = []
93 | if current_node:
94 | depth = len(self.resolver_prefix_paths)
95 | for path, kind in self.resolver_prefix_paths[-1]:
96 | if self.check_resolver_prefix(depth, path, kind,
97 | current_node, current_index):
98 | if len(path) > depth:
99 | prefix_paths.append((path, kind))
100 | else:
101 | exact_paths[kind] = self.yaml_path_resolvers[path, kind]
102 | else:
103 | for path, kind in self.yaml_path_resolvers:
104 | if not path:
105 | exact_paths[kind] = self.yaml_path_resolvers[path, kind]
106 | else:
107 | prefix_paths.append((path, kind))
108 | self.resolver_exact_paths.append(exact_paths)
109 | self.resolver_prefix_paths.append(prefix_paths)
110 |
111 | def ascend_resolver(self):
112 | if not self.yaml_path_resolvers:
113 | return
114 | self.resolver_exact_paths.pop()
115 | self.resolver_prefix_paths.pop()
116 |
117 | def check_resolver_prefix(self, depth, path, kind,
118 | current_node, current_index):
119 | node_check, index_check = path[depth-1]
120 | if isinstance(node_check, basestring):
121 | if current_node.tag != node_check:
122 | return
123 | elif node_check is not None:
124 | if not isinstance(current_node, node_check):
125 | return
126 | if index_check is True and current_index is not None:
127 | return
128 | if (index_check is False or index_check is None) \
129 | and current_index is None:
130 | return
131 | if isinstance(index_check, basestring):
132 | if not (isinstance(current_index, ScalarNode)
133 | and index_check == current_index.value):
134 | return
135 | elif isinstance(index_check, int) and not isinstance(index_check, bool):
136 | if index_check != current_index:
137 | return
138 | return True
139 |
140 | def resolve(self, kind, value, implicit):
141 | if kind is ScalarNode and implicit[0]:
142 | if value == u'':
143 | resolvers = self.yaml_implicit_resolvers.get(u'', [])
144 | else:
145 | resolvers = self.yaml_implicit_resolvers.get(value[0], [])
146 | resolvers += self.yaml_implicit_resolvers.get(None, [])
147 | for tag, regexp in resolvers:
148 | if regexp.match(value):
149 | return tag
150 | implicit = implicit[1]
151 | if self.yaml_path_resolvers:
152 | exact_paths = self.resolver_exact_paths[-1]
153 | if kind in exact_paths:
154 | return exact_paths[kind]
155 | if None in exact_paths:
156 | return exact_paths[None]
157 | if kind is ScalarNode:
158 | return self.DEFAULT_SCALAR_TAG
159 | elif kind is SequenceNode:
160 | return self.DEFAULT_SEQUENCE_TAG
161 | elif kind is MappingNode:
162 | return self.DEFAULT_MAPPING_TAG
163 |
164 | class Resolver(BaseResolver):
165 | pass
166 |
167 | Resolver.add_implicit_resolver(
168 | u'tag:yaml.org,2002:bool',
169 | re.compile(ur'''^(?:yes|Yes|YES|no|No|NO
170 | |true|True|TRUE|false|False|FALSE
171 | |on|On|ON|off|Off|OFF)$''', re.X),
172 | list(u'yYnNtTfFoO'))
173 |
174 | Resolver.add_implicit_resolver(
175 | u'tag:yaml.org,2002:float',
176 | re.compile(ur'''^(?:[-+]?(?:[0-9][0-9_]*)\.[0-9_]*(?:[eE][-+][0-9]+)?
177 | |\.[0-9_]+(?:[eE][-+][0-9]+)?
178 | |[-+]?[0-9][0-9_]*(?::[0-5]?[0-9])+\.[0-9_]*
179 | |[-+]?\.(?:inf|Inf|INF)
180 | |\.(?:nan|NaN|NAN))$''', re.X),
181 | list(u'-+0123456789.'))
182 |
183 | Resolver.add_implicit_resolver(
184 | u'tag:yaml.org,2002:int',
185 | re.compile(ur'''^(?:[-+]?0b[0-1_]+
186 | |[-+]?0[0-7_]+
187 | |[-+]?(?:0|[1-9][0-9_]*)
188 | |[-+]?0x[0-9a-fA-F_]+
189 | |[-+]?[1-9][0-9_]*(?::[0-5]?[0-9])+)$''', re.X),
190 | list(u'-+0123456789'))
191 |
192 | Resolver.add_implicit_resolver(
193 | u'tag:yaml.org,2002:merge',
194 | re.compile(ur'^(?:<<)$'),
195 | [u'<'])
196 |
197 | Resolver.add_implicit_resolver(
198 | u'tag:yaml.org,2002:null',
199 | re.compile(ur'''^(?: ~
200 | |null|Null|NULL
201 | | )$''', re.X),
202 | [u'~', u'n', u'N', u''])
203 |
204 | Resolver.add_implicit_resolver(
205 | u'tag:yaml.org,2002:timestamp',
206 | re.compile(ur'''^(?:[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]
207 | |[0-9][0-9][0-9][0-9] -[0-9][0-9]? -[0-9][0-9]?
208 | (?:[Tt]|[ \t]+)[0-9][0-9]?
209 | :[0-9][0-9] :[0-9][0-9] (?:\.[0-9]*)?
210 | (?:[ \t]*(?:Z|[-+][0-9][0-9]?(?::[0-9][0-9])?))?)$''', re.X),
211 | list(u'0123456789'))
212 |
213 | Resolver.add_implicit_resolver(
214 | u'tag:yaml.org,2002:value',
215 | re.compile(ur'^(?:=)$'),
216 | [u'='])
217 |
218 | # The following resolver is only for documentation purposes. It cannot work
219 | # because plain scalars cannot start with '!', '&', or '*'.
220 | Resolver.add_implicit_resolver(
221 | u'tag:yaml.org,2002:yaml',
222 | re.compile(ur'^(?:!|&|\*)$'),
223 | list(u'!&*'))
224 |
225 |
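The `add_implicit_resolver` calls above register first-character-keyed regexps that decide a plain scalar's tag. A stdlib-only sketch reusing two of those patterns verbatim (`resolve_scalar` is an illustrative name; the real `BaseResolver.resolve` also dispatches on the scalar's first character and falls through to path resolvers):

```python
import re

# Two of the implicit-resolver patterns registered above, copied verbatim.
INT_RE = re.compile(r'''^(?:[-+]?0b[0-1_]+
                    |[-+]?0[0-7_]+
                    |[-+]?(?:0|[1-9][0-9_]*)
                    |[-+]?0x[0-9a-fA-F_]+
                    |[-+]?[1-9][0-9_]*(?::[0-5]?[0-9])+)$''', re.X)
NULL_RE = re.compile(r'''^(?: ~
                    |null|Null|NULL
                    | )$''', re.X)

def resolve_scalar(value):
    # Try each (tag, regexp) pair in order; fall back to the default
    # scalar tag, mirroring BaseResolver.resolve for plain scalars.
    for tag, regexp in [(u'tag:yaml.org,2002:int', INT_RE),
                        (u'tag:yaml.org,2002:null', NULL_RE)]:
        if regexp.match(value):
            return tag
    return u'tag:yaml.org,2002:str'  # DEFAULT_SCALAR_TAG

print(resolve_scalar('0x1F'))   # tag:yaml.org,2002:int
print(resolve_scalar('~'))      # tag:yaml.org,2002:null
print(resolve_scalar('hello'))  # tag:yaml.org,2002:str
```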
--------------------------------------------------------------------------------
/lib/yaml/resolver.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/resolver.pyc
--------------------------------------------------------------------------------
/lib/yaml/scanner.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/scanner.pyc
--------------------------------------------------------------------------------
/lib/yaml/serializer.py:
--------------------------------------------------------------------------------
1 |
2 | __all__ = ['Serializer', 'SerializerError']
3 |
4 | from error import YAMLError
5 | from events import *
6 | from nodes import *
7 |
8 | class SerializerError(YAMLError):
9 | pass
10 |
11 | class Serializer(object):
12 |
13 | ANCHOR_TEMPLATE = u'id%03d'
14 |
15 | def __init__(self, encoding=None,
16 | explicit_start=None, explicit_end=None, version=None, tags=None):
17 | self.use_encoding = encoding
18 | self.use_explicit_start = explicit_start
19 | self.use_explicit_end = explicit_end
20 | self.use_version = version
21 | self.use_tags = tags
22 | self.serialized_nodes = {}
23 | self.anchors = {}
24 | self.last_anchor_id = 0
25 | self.closed = None
26 |
27 | def open(self):
28 | if self.closed is None:
29 | self.emit(StreamStartEvent(encoding=self.use_encoding))
30 | self.closed = False
31 | elif self.closed:
32 | raise SerializerError("serializer is closed")
33 | else:
34 | raise SerializerError("serializer is already opened")
35 |
36 | def close(self):
37 | if self.closed is None:
38 | raise SerializerError("serializer is not opened")
39 | elif not self.closed:
40 | self.emit(StreamEndEvent())
41 | self.closed = True
42 |
43 | #def __del__(self):
44 | # self.close()
45 |
46 | def serialize(self, node):
47 | if self.closed is None:
48 | raise SerializerError("serializer is not opened")
49 | elif self.closed:
50 | raise SerializerError("serializer is closed")
51 | self.emit(DocumentStartEvent(explicit=self.use_explicit_start,
52 | version=self.use_version, tags=self.use_tags))
53 | self.anchor_node(node)
54 | self.serialize_node(node, None, None)
55 | self.emit(DocumentEndEvent(explicit=self.use_explicit_end))
56 | self.serialized_nodes = {}
57 | self.anchors = {}
58 | self.last_anchor_id = 0
59 |
60 | def anchor_node(self, node):
61 | if node in self.anchors:
62 | if self.anchors[node] is None:
63 | self.anchors[node] = self.generate_anchor(node)
64 | else:
65 | self.anchors[node] = None
66 | if isinstance(node, SequenceNode):
67 | for item in node.value:
68 | self.anchor_node(item)
69 | elif isinstance(node, MappingNode):
70 | for key, value in node.value:
71 | self.anchor_node(key)
72 | self.anchor_node(value)
73 |
74 | def generate_anchor(self, node):
75 | self.last_anchor_id += 1
76 | return self.ANCHOR_TEMPLATE % self.last_anchor_id
77 |
78 | def serialize_node(self, node, parent, index):
79 | alias = self.anchors[node]
80 | if node in self.serialized_nodes:
81 | self.emit(AliasEvent(alias))
82 | else:
83 | self.serialized_nodes[node] = True
84 | self.descend_resolver(parent, index)
85 | if isinstance(node, ScalarNode):
86 | detected_tag = self.resolve(ScalarNode, node.value, (True, False))
87 | default_tag = self.resolve(ScalarNode, node.value, (False, True))
88 | implicit = (node.tag == detected_tag), (node.tag == default_tag)
89 | self.emit(ScalarEvent(alias, node.tag, implicit, node.value,
90 | style=node.style))
91 | elif isinstance(node, SequenceNode):
92 | implicit = (node.tag
93 | == self.resolve(SequenceNode, node.value, True))
94 | self.emit(SequenceStartEvent(alias, node.tag, implicit,
95 | flow_style=node.flow_style))
96 | index = 0
97 | for item in node.value:
98 | self.serialize_node(item, node, index)
99 | index += 1
100 | self.emit(SequenceEndEvent())
101 | elif isinstance(node, MappingNode):
102 | implicit = (node.tag
103 | == self.resolve(MappingNode, node.value, True))
104 | self.emit(MappingStartEvent(alias, node.tag, implicit,
105 | flow_style=node.flow_style))
106 | for key, value in node.value:
107 | self.serialize_node(key, node, None)
108 | self.serialize_node(value, node, key)
109 | self.emit(MappingEndEvent())
110 | self.ascend_resolver()
111 |
112 |
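`anchor_node` above does two-pass bookkeeping over node identity: a first sighting records `None`, and a repeat sighting promotes the node to a generated anchor so later occurrences can be emitted as aliases. A stdlib-only sketch of that logic (the name `assign_anchors` is illustrative, identity is approximated with `id()`, and the recursion into sequence/mapping children is flattened into a simple list walk):

```python
# Two-pass anchor assignment mirroring Serializer.anchor_node and
# generate_anchor: first sighting -> None, repeat sighting -> 'id%03d'.
def assign_anchors(nodes):
    anchors = {}
    last_anchor_id = 0
    for node in nodes:
        key = id(node)
        if key in anchors:
            if anchors[key] is None:
                last_anchor_id += 1
                anchors[key] = 'id%03d' % last_anchor_id  # ANCHOR_TEMPLATE
        else:
            anchors[key] = None
    return anchors

shared = ['x']
anchors = assign_anchors([shared, ['y'], shared])
print(anchors[id(shared)])  # id001
```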
--------------------------------------------------------------------------------
/lib/yaml/serializer.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/serializer.pyc
--------------------------------------------------------------------------------
/lib/yaml/tokens.py:
--------------------------------------------------------------------------------
1 |
2 | class Token(object):
3 | def __init__(self, start_mark, end_mark):
4 | self.start_mark = start_mark
5 | self.end_mark = end_mark
6 | def __repr__(self):
7 | attributes = [key for key in self.__dict__
8 | if not key.endswith('_mark')]
9 | attributes.sort()
10 | arguments = ', '.join(['%s=%r' % (key, getattr(self, key))
11 | for key in attributes])
12 | return '%s(%s)' % (self.__class__.__name__, arguments)
13 |
14 | #class BOMToken(Token):
15 | #    id = '<byte order mark>'
16 |
17 | class DirectiveToken(Token):
18 |     id = '<directive>'
19 | def __init__(self, name, value, start_mark, end_mark):
20 | self.name = name
21 | self.value = value
22 | self.start_mark = start_mark
23 | self.end_mark = end_mark
24 |
25 | class DocumentStartToken(Token):
26 |     id = '<document start>'
27 |
28 | class DocumentEndToken(Token):
29 |     id = '<document end>'
30 |
31 | class StreamStartToken(Token):
32 |     id = '<stream start>'
33 | def __init__(self, start_mark=None, end_mark=None,
34 | encoding=None):
35 | self.start_mark = start_mark
36 | self.end_mark = end_mark
37 | self.encoding = encoding
38 |
39 | class StreamEndToken(Token):
40 |     id = '<stream end>'
41 |
42 | class BlockSequenceStartToken(Token):
43 |     id = '<block sequence start>'
44 |
45 | class BlockMappingStartToken(Token):
46 |     id = '<block mapping start>'
47 |
48 | class BlockEndToken(Token):
49 |     id = '<block end>'
50 |
51 | class FlowSequenceStartToken(Token):
52 | id = '['
53 |
54 | class FlowMappingStartToken(Token):
55 | id = '{'
56 |
57 | class FlowSequenceEndToken(Token):
58 | id = ']'
59 |
60 | class FlowMappingEndToken(Token):
61 | id = '}'
62 |
63 | class KeyToken(Token):
64 | id = '?'
65 |
66 | class ValueToken(Token):
67 | id = ':'
68 |
69 | class BlockEntryToken(Token):
70 | id = '-'
71 |
72 | class FlowEntryToken(Token):
73 | id = ','
74 |
75 | class AliasToken(Token):
76 |     id = '<alias>'
77 | def __init__(self, value, start_mark, end_mark):
78 | self.value = value
79 | self.start_mark = start_mark
80 | self.end_mark = end_mark
81 |
82 | class AnchorToken(Token):
83 |     id = '<anchor>'
84 | def __init__(self, value, start_mark, end_mark):
85 | self.value = value
86 | self.start_mark = start_mark
87 | self.end_mark = end_mark
88 |
89 | class TagToken(Token):
90 |     id = '<tag>'
91 | def __init__(self, value, start_mark, end_mark):
92 | self.value = value
93 | self.start_mark = start_mark
94 | self.end_mark = end_mark
95 |
96 | class ScalarToken(Token):
97 |     id = '<scalar>'
98 | def __init__(self, value, plain, start_mark, end_mark, style=None):
99 | self.value = value
100 | self.plain = plain
101 | self.start_mark = start_mark
102 | self.end_mark = end_mark
103 | self.style = style
104 |
105 |
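`Token.__repr__` above hides the `*_mark` attributes and prints the remaining ones sorted by name. A self-contained copy of the two relevant classes, so the behavior can be seen in isolation:

```python
# Self-contained copy of Token/ScalarToken from the file above, showing
# that __repr__ omits *_mark attributes and sorts the rest alphabetically.
class Token(object):
    def __init__(self, start_mark, end_mark):
        self.start_mark = start_mark
        self.end_mark = end_mark
    def __repr__(self):
        attributes = sorted(key for key in self.__dict__
                            if not key.endswith('_mark'))
        arguments = ', '.join('%s=%r' % (key, getattr(self, key))
                              for key in attributes)
        return '%s(%s)' % (self.__class__.__name__, arguments)

class ScalarToken(Token):
    id = '<scalar>'
    def __init__(self, value, plain, start_mark, end_mark, style=None):
        self.value = value
        self.plain = plain
        self.start_mark = start_mark
        self.end_mark = end_mark
        self.style = style

print(repr(ScalarToken('abc', True, None, None)))
# ScalarToken(plain=True, style=None, value='abc')
```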
--------------------------------------------------------------------------------
/lib/yaml/tokens.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/lib/yaml/tokens.pyc
--------------------------------------------------------------------------------
/pkg/Twisted-12.2.0.tar.bz2:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/pkg/Twisted-12.2.0.tar.bz2
--------------------------------------------------------------------------------
/pkg/virtualenv-1.9.1.tar.gz:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/pkg/virtualenv-1.9.1.tar.gz
--------------------------------------------------------------------------------
/pkg/zope.interface-4.0.1.tar.gz:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/xiaomi-sa/smartdns/d31893894994490e8e933d4dd621ccea458b50c5/pkg/zope.interface-4.0.1.tar.gz
--------------------------------------------------------------------------------
/script/install_smartdns.sh:
--------------------------------------------------------------------------------
1 | #!/bin/sh
2 |
3 | set -e
4 |
5 | filepath=$(cd "$(dirname "$0")"; pwd)
6 |
7 | cd $filepath/../pkg
8 | tar zxf virtualenv-1.9.1.tar.gz
9 | tar zxf zope.interface-4.0.1.tar.gz
10 | tar jxf Twisted-12.2.0.tar.bz2
11 |
12 | cd virtualenv-1.9.1
13 | python virtualenv.py $filepath/../../smartdns_env
14 | . $filepath/../../smartdns_env/bin/activate
15 |
16 | cd ../zope.interface-4.0.1
17 | python setup.py install
18 |
19 | cd ../Twisted-12.2.0
20 | python setup.py install
21 |
22 | cd .. && rm -rf virtualenv-1.9.1 zope.interface-4.0.1 Twisted-12.2.0
23 |
24 | dnsfilenum=`find $filepath/../../smartdns_env/lib/python*/site-packages/Twisted-12.2.0*/twisted/names -name dns.py | wc -l`
25 | if [ 1 -ne $dnsfilenum ]; then
26 | echo "cannot find dns.py"
27 | exit 2
28 | fi
29 | dnsfile=`find $filepath/../../smartdns_env/lib/python*/site-packages/Twisted-12.2.0*/twisted/names -name dns.py`
30 | cp -f $filepath/dns_in_twisted.py $dnsfile
31 |
32 |
--------------------------------------------------------------------------------
/script/run_dns.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | . ../../smartdns_env/bin/activate
3 | PYTHON="python"
4 | TWISTD="../script/twistd"
5 | cd ../bin
6 | $PYTHON checkconfig.py || { echo "[FATAL] config check failed; aborting restart"; exit 1; }
7 | chmod a+r ../conf/sdns.pid &>/dev/null
8 | pid=`cat ../conf/sdns.pid 2>/dev/null`
9 | [ -n "${pid}" ] && kill ${pid}
10 | while [ -n "${pid}" ] && ps -p ${pid} &>/dev/null; do
11 | sleep 1
12 | done
13 | $PYTHON $TWISTD -y sdns.py -l ../log/sdns.log --pidfile=../conf/sdns.pid
14 |
--------------------------------------------------------------------------------
/script/twistd:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | # EASY-INSTALL-SCRIPT: 'Twisted==12.2.0','twistd'
3 | __requires__ = 'Twisted==12.2.0'
4 | import pkg_resources
5 | pkg_resources.run_script('Twisted==12.2.0', 'twistd')
6 |
--------------------------------------------------------------------------------