├── .github
│   └── workflows
│       └── main.yml
├── GeoLite2-City.mmdb
├── LICENSE
├── README.md
├── README_EN.md
├── main.py
├── outputs
│   ├── base64.txt
│   ├── clash_meta.yaml
│   ├── clash_meta_warp.yaml
│   └── proxy_urls.txt
├── requirements.txt
├── templates
│   ├── clash_meta.yaml
│   └── clash_meta_warp.yaml
└── urls
    ├── clash_meta_urls.txt
    ├── hysteria2_urls.txt
    ├── hysteria_urls.txt
    ├── naiverproxy_urls.txt
    ├── singbox_urls.txt
    ├── ss_urls.txt
    └── xray_urls.txt
/.github/workflows/main.yml:
--------------------------------------------------------------------------------
1 | name: Auto Extract
2 |
3 | on:
4 | push:
5 | branches:
6 | - main
7 | schedule:
8 | - cron: "0 0,12 * * *"
9 | workflow_dispatch:
10 | jobs:
11 | Extract:
12 | runs-on: ubuntu-latest
13 |
14 | steps:
15 | - name: Checkout repository
16 | uses: actions/checkout@v4
17 |
18 | - name: Set up Python
19 | uses: actions/setup-python@v5
20 | with:
21 | python-version: "3.11"
22 |
23 | - name: Install dependencies
24 | run: pip install -r requirements.txt
25 |
26 | - name: Run extract script
27 | run: python main.py
28 |
29 | - name: Write time into README file
30 | run: |
31 | echo "DATE=$(date '+%Y-%m-%d %H:%M:%S')" >> $GITHUB_ENV
32 | sed -Ei "s|最近提取于:\`UTC [0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}\`|最近提取于:\`UTC $(date '+%Y-%m-%d %H:%M:%S')\`|g" README.md
33 | sed -Ei "s|recently extracted at: \`UTC [0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}\`|recently extracted at: \`UTC $(date '+%Y-%m-%d %H:%M:%S')\`|g" README_EN.md
34 |
35 | - name: Commit & Push changes
36 | env:
37 | GITHUB_TOKEN: ${{ github.token }}
38 | uses: stefanzweifel/git-auto-commit-action@v4.16.0
39 | with:
40 | commit_message: Auto extract by Github Actions at UTC ${{ env.DATE }}
41 | create_branch: true
42 | branch: main
43 | commit_options: '--allow-empty'
44 | push_options: '--force'
45 |
--------------------------------------------------------------------------------
/GeoLite2-City.mmdb:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/linzjian666/chromego_extractor/44cf5077cb68cf309ea53812c9df1b74cf5a31c3/GeoLite2-City.mmdb
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2024 Linzjian666
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 |
2 |
3 | # ChromeGo Extractor
4 |
5 | A Python script to extract ChromeGo proxies.
6 |
7 | 一个用来提取ChromeGo代理节点的Python脚本
8 |
9 | **中文** | [English](README_EN.md)
10 |
11 |
12 |
13 | > 鸣谢
14 | > - 感谢[ChromeGo](https://github.com/bannedbook/fanqiang)项目
15 | > - 感谢[Alvin9999](https://github.com/Alvin9999/)大佬
16 | > - 感谢[chromegopacs](https://github.com/markbang/chromegopacs)提供的区域代码设置思路
17 |
18 | ## 使用说明
19 | ### 订阅链接:
20 | > 本项目已配置Github Actions自动运行,最近提取于:`UTC 2025-06-01 12:14:36`
21 |
22 | - Clash Meta (不带WARP):
23 |
24 | [https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/clash_meta.yaml](https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/clash_meta.yaml)
25 |
26 | - Clash Meta (带WARP):
27 |
28 | [https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/clash_meta_warp.yaml](https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/clash_meta_warp.yaml)
29 |
30 | - Base64:
31 |
32 | [https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/base64.txt](https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/base64.txt)
33 |
34 | - Proxy urls:
35 |
36 | [https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/proxy_urls.txt](https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/proxy_urls.txt)
37 |
38 |
39 |
40 | (备用)
41 |
42 | - Clash Meta (不带WARP):
43 |
44 | [https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/clash_meta.yaml](https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/clash_meta.yaml)
45 |
46 | - Clash Meta (带WARP):
47 |
48 | [https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/clash_meta_warp.yaml](https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/clash_meta_warp.yaml)
49 |
50 | - Base64:
51 |
52 | [https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/base64.txt](https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/base64.txt)
53 |
54 | - Proxy urls:
55 |
56 | [https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/proxy_urls.txt](https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/proxy_urls.txt)
57 |
58 |
59 |
60 | ### 本地运行:
61 |
62 |
63 | #### 1. 环境要求
64 | 确保你的环境满足以下要求:
65 | - Python 3.x
66 | - 安装所需的依赖:`pip install -r requirements.txt`
67 |
68 | #### 2. 下载脚本
69 | 克隆本项目到本地:
70 | ```bash
71 | git clone https://github.com/linzjian666/chromego_extractor.git
72 | ```
73 |
74 | #### 3. 运行脚本
75 | 1. 进入项目目录:
76 | ```bash
77 | cd chromego_extractor
78 | ```
79 | 2. 运行脚本:
80 | ```bash
81 | python main.py
82 | ```
83 |
84 | #### 4. 获取代理信息
85 | 脚本将提取 ChromeGo 代理节点信息,并保存到`outputs`目录中。
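
提取完成后,可以用下面这段示意脚本(非项目自带,仅作演示)解码 `outputs/base64.txt`,快速确认订阅内容是否正常:

```python
# 示意脚本:解码 outputs/base64.txt 并预览前几条节点 URL
# 假设已在项目根目录成功运行过 main.py
import base64
import os

path = "outputs/base64.txt"
if os.path.exists(path):
    with open(path, "r", encoding="utf-8") as f:
        # base64.txt 是 proxy_urls.txt 整体 Base64 编码后的结果
        urls = base64.b64decode(f.read()).decode("utf-8").splitlines()
    print(f"共解码出 {len(urls)} 条节点 URL")
    for url in urls[:3]:
        print(url)
else:
    print(f"未找到 {path},请先运行 main.py")
```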
86 |
87 | #### 5. 其他
88 | 根据需要,你可以自行修改脚本的一些配置,比如保存文件的路径等。
89 |
90 |
91 |
92 | ## 免责声明
93 |
94 | **本项目仅供学习交流使用,作者不对其在实际使用中产生的任何后果负任何法律或技术责任。**
95 |
96 | 1. **使用风险**:用户在使用本项目时需自行承担风险。作者无法保证生成的配置信息适用于所有使用情境,因此可能会导致潜在的问题或错误。
97 |
98 | 2. **合规性和法律遵守**:用户使用本项目必须遵守部署服务器所在地、所在国家和用户所在国家的法律法规及云服务提供商的政策。作者不对使用者任何不当行为负责。
99 |
100 | 3. **无担保**:作者不提供关于本项目的任何担保或保证。本项目可能会受到外部因素的影响,如云服务提供商政策变更、网络故障等。用户需自行评估和处理这些风险。
101 |
102 | 4. **技术支持**:作者不承诺提供关于本项目的技术支持。用户需自行解决配置信息可能出现的问题。
103 |
104 | 5. **数据隐私**:用户需谨慎处理配置信息中可能包含的个人数据或敏感信息。作者不对因配置信息泄漏或不当使用而导致的数据隐私问题负责。
105 |
106 | **服务对象限定为非中国大陆地区用户。在使用本项目前,请仔细阅读并理解免责声明。如果不同意免责声明中的任何条款,请勿使用本项目!**
107 |
108 | ## 许可协议
109 |
110 | 本项目遵循 MIT 许可协议。有关详细信息,请参阅 [LICENSE](LICENSE) 文件。
111 |
112 | ---
113 | **欢迎提出问题或为本项目的开发做出贡献!**
114 |
115 |
--------------------------------------------------------------------------------
/README_EN.md:
--------------------------------------------------------------------------------
1 |
2 |
3 | # ChromeGo Extractor
4 |
5 | A Python script to extract ChromeGo proxies.
6 |
7 | 一个用来提取ChromeGo代理节点的Python脚本
8 |
9 | [中文](README.md) | **English**
10 |
11 |
12 |
13 | > Acknowledgements
14 | > - Thanks to the [ChromeGo](https://github.com/bannedbook/fanqiang) project
15 | > - Thanks to [Alvin9999](https://github.com/Alvin9999/)
16 | > - Credits to [chromegopacs](https://github.com/markbang/chromegopacs) for providing the region code setting approach
17 |
18 | ## Usage Guide
19 | ### Subscription Links:
20 | > This project has been configured to run automatically with Github Actions, recently extracted at: `UTC 2024-01-27 12:28:04`
21 | >
22 | - Clash Meta (without WARP):
23 |
24 | [https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/clash_meta.yaml](https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/clash_meta.yaml)
25 |
26 | - Clash Meta (with WARP):
27 |
28 | [https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/clash_meta_warp.yaml](https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/clash_meta_warp.yaml)
29 |
30 | - Base64:
31 |
32 | [https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/base64.txt](https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/base64.txt)
33 |
34 | - Proxy urls:
35 |
36 | [https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/proxy_urls.txt](https://raw.githubusercontent.com/linzjian666/chromego_extractor/main/outputs/proxy_urls.txt)
37 |
38 |
39 |
40 | (Alternate)
41 |
42 | - Clash Meta (without WARP):
43 |
44 | [https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/clash_meta.yaml](https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/clash_meta.yaml)
45 |
46 | - Clash Meta (with WARP):
47 |
48 | [https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/clash_meta_warp.yaml](https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/clash_meta_warp.yaml)
49 |
50 | - Base64:
51 |
52 | [https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/base64.txt](https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/base64.txt)
53 |
54 | - Proxy urls:
55 |
56 | [https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/proxy_urls.txt](https://gcore.jsdelivr.net/gh/linzjian666/chromego_extractor@main/outputs/proxy_urls.txt)
57 |
58 |
59 |
60 | ### Local Execution:
61 |
62 |
63 | #### 1. System Requirements
64 | Make sure your environment meets the following requirements:
65 | - Python 3.x
66 | - Install the necessary dependencies: `pip install -r requirements.txt`
67 |
68 | #### 2. Download the Script
69 | Clone this project to your local machine:
70 | ```bash
71 | git clone https://github.com/linzjian666/chromego_extractor.git
72 | ```
73 |
74 | #### 3. Run the Script
75 | 1. Navigate to the project directory:
76 | ```bash
77 | cd chromego_extractor
78 | ```
79 | 2. Execute the script:
80 | ```bash
81 | python main.py
82 | ```
83 |
84 | #### 4. Obtain Proxy Information
85 | The script will extract ChromeGo proxy node information and save it to the `outputs` directory.
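
After a run, the generated `outputs/base64.txt` can be decoded back into plain proxy URLs. The illustrative snippet below (not part of the project) assumes `main.py` has already been run from the project root:

```python
# Illustrative snippet: decode outputs/base64.txt and preview a few proxy URLs
# Assumes main.py has been run successfully from the project root
import base64
import os

path = "outputs/base64.txt"
if os.path.exists(path):
    with open(path, "r", encoding="utf-8") as f:
        # base64.txt is simply proxy_urls.txt encoded as a whole in Base64
        urls = base64.b64decode(f.read()).decode("utf-8").splitlines()
    print(f"Decoded {len(urls)} proxy URLs")
    for url in urls[:3]:
        print(url)
else:
    print(f"{path} not found; run main.py first")
```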
86 |
87 | #### 5. Additional Information
88 | As needed, you can modify certain configurations in the script, such as the file save path.
89 |
90 |
91 |
92 | ## Disclaimer
93 |
94 | **This project is intended for educational and informational purposes only. The author is not responsible for any legal or technical consequences arising from its actual use.**
95 |
96 | 1. **Usage Risk**: Users must assume all risks when using this project. The author cannot guarantee that the generated configuration information is suitable for all scenarios, which may lead to potential issues or errors.
97 |
98 | 2. **Compliance and Legal Compliance**: Users must comply with the laws and regulations of the server deployment location, the country of residence, and the user's home country, as well as the policies of cloud service providers. The author is not responsible for any improper behavior by users.
99 |
100 | 3. **No Warranty**: The author does not provide any guarantees or warranties regarding this project. This project may be affected by external factors, such as changes in cloud service provider policies, network failures, etc. Users must assess and manage these risks on their own.
101 |
102 | 4. **Technical Support**: The author does not commit to providing technical support for this project. Users are responsible for resolving any issues that may arise with the configuration information on their own.
103 |
104 | 5. **Data Privacy**: Users should handle personal data or sensitive information in the configuration information with caution. The author is not responsible for data privacy issues arising from the disclosure or improper use of configuration information.
105 |
106 | **This service is limited to users outside mainland China. Please read and understand the disclaimer carefully before using this project. If you do not agree to any terms in the disclaimer, please do not use this project!**
107 |
108 | ## License Agreement
109 |
110 | This project is licensed under the MIT License. For detailed information, please refer to the [LICENSE](LICENSE) file.
111 |
112 | ---
113 | **Feel free to ask questions or contribute to the development of this project!**
114 |
115 |
--------------------------------------------------------------------------------
/main.py:
--------------------------------------------------------------------------------
1 | # -*- coding: UTF-8 -*-
2 | """
3 | Author: Linzjian666
4 | Date: 2024-01-13 11:29:53
5 | LastEditors: Linzjian666
6 | LastEditTime: 2025-05-01 09:50:00
7 | """
8 | import yaml
9 | import json
10 | import urllib.request
11 | import logging
12 | import geoip2.database
13 | import socket
14 | import re
15 | import base64
16 |
17 | def process_urls(urls_file, method):
18 | try:
19 | with open(urls_file, 'r') as f:
20 | urls = f.read().splitlines()
21 |
22 | for index, url in enumerate(urls):
23 | try:
24 | response = urllib.request.urlopen(url)
25 | data = response.read().decode('utf-8')
26 | # index += 1
27 | method(data, index)
28 | except Exception as e:
29 | logging.error(f"处理{url}时遇到错误: {e}")
30 | except Exception as e:
31 | logging.error(f"读取{urls_file}时遇到错误: {e}")
32 | return
33 |
34 | def process_clash_meta(data, index):
35 | try:
36 | content = yaml.safe_load(data)
37 | try:
38 | proxies = content['proxies']
39 | except (KeyError, TypeError):
40 | proxies = []
41 | for i, proxy in enumerate(proxies):
42 | if("network" in proxy and f"{proxy['network']}" == "ws"):
43 | # host = proxy['ws-opts']['headers']['host']
44 |
45 | try:
46 | host = proxy['ws-opts']['headers']['host']
47 | except KeyError:
48 | try:
49 | host = proxy['ws-opts']['headers']['Host']
50 | except KeyError:
51 | host = ''
52 |
53 | if(f"{proxy['server']}:{proxy['port']}-{host}-ws" not in servers_list):
54 | location = get_physical_location(proxy['server'])
55 | proxy['name'] = f"{location}-{proxy['type']} | {index}-{i+1}"
56 | servers_list.append(f"{proxy['server']}:{proxy['port']}-{host}-ws")
57 | else:
58 | continue
59 | elif(f"{proxy['server']}:{proxy['port']}-{proxy['type']}" not in servers_list):
60 | location = get_physical_location(proxy['server'])
61 | proxy['name'] = f"{location}-{proxy['type']} | {index}-{i+1}"
62 | servers_list.append(f"{proxy['server']}:{proxy['port']}-{proxy['type']}")
63 | else:
64 | continue
65 | extracted_proxies.append(proxy)
66 | # extracted_proxies.extend(proxies)
67 | except Exception as e:
68 | logging.error(f"处理Clash Meta配置{index}时遇到错误: {e}")
69 | return
70 |
71 | def process_hysteria(data, index):
72 | try:
73 | content = json.loads(data)
74 | # print(content)
75 | auth = content['auth_str']
76 | server_ports_slt = content['server'].split(":")
77 | server = server_ports_slt[0]
78 | ports = server_ports_slt[1]
79 | ports_slt = ports.split(',')
80 | server_port = int(ports_slt[0])
81 | if(len(ports_slt) > 1):
82 | mport = ports_slt[1]
83 | else:
84 | mport = server_port
85 | fast_open = content.get('fast_open', True)
86 | # fast_open = True
87 | insecure = content['insecure']
88 | sni = content['server_name']
89 | alpn = content['alpn']
90 | protocol = content['protocol']
91 | location = get_physical_location(server)
92 | name = f"{location}-Hysteria | {index}-0"
93 |
94 | proxy = {
95 | "name": name,
96 | "type": "hysteria",
97 | "server": server,
98 | "port": server_port,
99 | "ports": mport,
100 | "auth-str": auth,
101 | "up": 80,
102 | "down": 100,
103 | "fast-open": fast_open,
104 | "protocol": protocol,
105 | "sni": sni,
106 | "skip-cert-verify": insecure,
107 | "alpn": [alpn]
108 | }
109 | if(f"{proxy['server']}:{proxy['port']}-hysteria" not in servers_list):
110 | extracted_proxies.append(proxy)
111 | servers_list.append(f"{proxy['server']}:{proxy['port']}-hysteria")
112 | else:
113 | return
114 | except Exception as e:
115 | logging.error(f"处理Hysteria配置{index}时遇到错误: {e}")
116 | return
117 |
118 | def process_hysteria2(data, index):
119 | try:
120 | content = json.loads(data)
121 | auth = content['auth']
122 | server_ports_slt = content['server'].split(":")
123 | server = server_ports_slt[0]
124 | ports = server_ports_slt[1]
125 | ports_slt = ports.split(',')
126 | server_port = int(ports_slt[0])
127 | insecure = content['tls']['insecure']
128 | sni = content['tls']['sni']
129 | location = get_physical_location(server)
130 | name = f"{location}-Hysteria2 | {index}-0"
131 |
132 | proxy = {
133 | "name": name,
134 | "type": "hysteria2",
135 | "server": server,
136 | "port": server_port,
137 | "password": auth,
138 | "sni": sni,
139 | "skip-cert-verify": insecure
140 | }
141 | if(f"{proxy['server']}:{proxy['port']}-hysteria2" not in servers_list):
142 | extracted_proxies.append(proxy)
143 | servers_list.append(f"{proxy['server']}:{proxy['port']}-hysteria2")
144 | else:
145 | return
146 | except Exception as e:
147 | logging.error(f"处理Hysteria2配置{index}时遇到错误: {e}")
148 | return
149 |
150 | def process_xray(data, index):
151 | try:
152 | content = json.loads(data)
153 | outbounds = content['outbounds']
154 | pending_proxy = outbounds[0]
155 | type = pending_proxy['protocol']
156 | if(type == "vmess"):
157 | server = pending_proxy['settings']['vnext'][0]['address']
158 | port = pending_proxy['settings']['vnext'][0]['port']
159 | uuid = pending_proxy['settings']['vnext'][0]['users'][0]['id']
160 | alterId = pending_proxy['settings']['vnext'][0]['users'][0]['alterId']
161 | cipher = pending_proxy['settings']['vnext'][0]['users'][0]['security']
162 | network = pending_proxy['streamSettings']['network']
163 | security = pending_proxy['streamSettings'].get('security', "none")
164 | location = get_physical_location(server)
165 | name = f"{location}-{type} | {index}-0"
166 | if(security == "none"):
167 | tls = False
168 | else:
169 | tls = True
170 | sni = pending_proxy['streamSettings'].get('tlsSettings', {}).get('serverName', "")
171 | allowInsecure = pending_proxy['streamSettings'].get('tlsSettings', {}).get('allowInsecure', False)
172 |
173 | if(network in ['tcp','ws','grpc','h2']):
174 | ws_path = pending_proxy['streamSettings'].get('wsSettings', {}).get('path', "")
175 | ws_headers = pending_proxy['streamSettings'].get('wsSettings', {}).get('headers', {})
176 | grpc_serviceName = pending_proxy['streamSettings'].get('grpcSettings', {}).get('serviceName', "/")
177 | h2_path = pending_proxy['streamSettings'].get('httpSettings', {}).get('path', "/")
178 | h2_host = pending_proxy['streamSettings'].get('httpSettings', {}).get('host', [])
179 |
180 | proxy = {
181 | "name": name,
182 | "type": "vmess",
183 | "server": server,
184 | "port": port,
185 | "uuid": uuid,
186 | "alterId": alterId,
187 | "cipher": cipher,
188 | "tls": tls,
189 | "servername": sni,
190 | "skip-cert-verify": allowInsecure,
191 | "network": network,
192 | "ws-opts": {
193 | "path": ws_path,
194 | "headers": ws_headers
195 | },
196 | "grpc-opts": {
197 | "serviceName": grpc_serviceName
198 | },
199 | "h2-opts": {
200 | "path": h2_path,
201 | "host": h2_host
202 | }
203 | }
204 | else:
205 | logging.error(f"处理Xray配置{index}时遇到错误: 不支持的VMess传输协议: {network}")
206 | return
207 | elif(type == "vless"):
208 | server = pending_proxy['settings']['vnext'][0]['address']
209 | port = pending_proxy['settings']['vnext'][0]['port']
210 | uuid = pending_proxy['settings']['vnext'][0]['users'][0]['id']
211 | flow = pending_proxy['settings']['vnext'][0]['users'][0].get('flow', "")
212 | security = pending_proxy['streamSettings'].get('security', "none")
213 | network = pending_proxy['streamSettings']['network']
214 | location = get_physical_location(server)
215 | name = f"{location}-{type} | {index}-0"
216 |
217 | if(security == "none"):
218 | tls = False
219 | else:
220 | tls = True
221 | if(security == "reality"):
222 | realitySettings = pending_proxy['streamSettings'].get('realitySettings', {})
223 | sni = realitySettings.get('serverName', "")
224 | short_id = realitySettings.get('shortId', "")
225 | publicKey = realitySettings['publicKey']
226 | fingerprint = realitySettings['fingerprint']
227 |
228 | grpc_serviceName = pending_proxy['streamSettings'].get('grpcSettings', {}).get('serviceName', "/")
229 | proxy = {
230 | "name": name,
231 | "type": "vless",
232 | "server": server,
233 | "port": port,
234 | "uuid": uuid,
235 | "flow": flow,
236 | "tls": tls,
237 | "servername": sni,
238 | "network": network,
239 | "client-fingerprint": fingerprint,
240 | "grpc-opts": {
241 | "grpc-service-name": grpc_serviceName
242 | },
243 | "reality-opts": {
244 | "public-key": publicKey,
245 | "short-id": short_id,
246 | }
247 | }
248 | else:
249 | if(network in ['tcp','ws','grpc']):
250 | sni = pending_proxy['streamSettings'].get('tlsSettings', {}).get('serverName', "")
251 | allowInsecure = pending_proxy['streamSettings'].get('tlsSettings', {}).get('allowInsecure', False)
252 |
253 | ws_path = pending_proxy['streamSettings'].get('wsSettings', {}).get('path', "")
254 | ws_headers = pending_proxy['streamSettings'].get('wsSettings', {}).get('headers', {})
255 | grpc_serviceName = pending_proxy['streamSettings'].get('grpcSettings', {}).get('serviceName', "/")
256 |
257 | proxy = {
258 | "name": name,
259 | "type": "vless",
260 | "server": server,
261 | "port": port,
262 | "uuid": uuid,
263 | "tls": tls,
264 | "servername": sni,
265 | "skip-cert-verify": allowInsecure,
266 | "network": network,
267 | "ws-opts": {
268 | "path": ws_path,
269 | "headers": ws_headers
270 | },
271 | "grpc-opts": {
272 | "serviceName": grpc_serviceName
273 | }
274 | }
275 | else:
276 | logging.error(f"处理Xray配置{index}时遇到错误: 不支持的VLESS传输协议: {network}")
277 | return
278 | else:
279 | logging.error(f"处理Xray配置{index}时遇到错误: 不支持的传输协议: {type}")
280 | return
281 | if(f"{proxy['server']}:{proxy['port']}-{proxy['type']}" not in servers_list):
282 | extracted_proxies.append(proxy)
283 | servers_list.append(f"{proxy['server']}:{proxy['port']}-{proxy['type']}")
284 | else:
285 | return
286 |
287 | # print(security)
288 | # if(type == "vmess"):
289 | #
290 | # elif(type == "shadowsocks"):
291 | # cipher = pending_proxy['settings']['vnext"][0]['users"][0]['method"]
292 | # else:
293 | # cipher = "none"
294 |
295 | except Exception as e:
296 | logging.error(f"处理Xray配置{index}时遇到错误: {e}")
297 |
298 | def get_physical_location(address):
299 | address = re.sub(":.*", "", address) # 用正则表达式去除端口部分
300 | try:
301 | ip_address = socket.gethostbyname(address)
302 | except socket.gaierror:
303 | ip_address = address
304 |
305 | try:
306 | reader = geoip2.database.Reader("GeoLite2-City.mmdb") # 这里的路径需要指向你自己的数据库文件
307 | response = reader.city(ip_address)
308 | country = response.country.iso_code
309 | # city = response.city.name
310 | flag_emoji = ""
311 | for i in range(len(country)):
312 | flag_emoji += chr(ord(country[i]) + ord("🇦") - ord("A")) #
313 | if(flag_emoji == "🇹🇼"):
314 | flag_emoji = "🇨🇳"
315 | return f"{flag_emoji} {country}"
316 | except Exception as e:
317 | # logging.error(f"区域代码获取失败: {e}")
318 | return "🏳 Unknown"
319 |
320 | def write_clash_meta_profile(template_file, output_file, extracted_proxies):
321 | with open(template_file, 'r', encoding='utf-8') as f:
322 | profile = yaml.safe_load(f)
323 | if("proxies" not in profile or not profile['proxies']):
324 | profile['proxies'] = extracted_proxies
325 | else:
326 | profile['proxies'].extend(extracted_proxies)
327 | for group in profile['proxy-groups']:
328 | if(group['name'] in ['🚀 节点选择','♻️ 自动选择','⚖ 负载均衡','☁ WARP前置节点','📺 巴哈姆特','📺 哔哩哔哩','🌏 国内媒体','🌍 国外媒体','📲 电报信息','Ⓜ️ 微软云盘','Ⓜ️ 微软服务','🍎 苹果服务','📢 谷歌FCM','🤖 OpenAI','🐟 漏网之鱼']):
329 | if("proxies" not in group or not group['proxies']):
330 | group['proxies'] = [proxy['name'] for proxy in extracted_proxies]
331 | else:
332 | group['proxies'].extend(proxy['name'] for proxy in extracted_proxies)
333 | # 写入yaml文件
334 | with open(output_file, 'w', encoding='utf-8') as f:
335 | yaml.dump(profile, f, sort_keys=False, allow_unicode=True)
336 |
337 | def write_proxy_urls_file(output_file, proxies):
338 | proxy_urls = []
339 | for proxy in proxies:
340 | try:
341 | if(proxy['type'] == "vless"):
342 | name = proxy['name']
343 | server = proxy['server']
344 | port = proxy['port']
345 | uuid = proxy['uuid']
346 | tls = int(proxy.get('tls', 0))
347 | network = proxy['network']
348 | flow = proxy.get('flow', "")
349 | grpc_serviceName = proxy.get('grpc-opts', {}).get('grpc-service-name', "")
350 | ws_path = proxy.get('ws-opts', {}).get('path', "")
351 | # `.get` 不会抛出 KeyError,原 try/except 的 Host 分支永远不会执行
352 | # 这里同时兼容小写 host 与大写 Host 两种写法
353 | headers = proxy.get('ws-opts', {}).get('headers', {})
354 | ws_headers_host = headers.get('host') or headers.get('Host', "")
355 |
356 | if(tls == 0):
357 | proxy_url = f"vless://{uuid}@{server}:{port}?encryption=none&flow={flow}&security=none&type={network}&serviceName={grpc_serviceName}&host={ws_headers_host}&path={ws_path}#{name}"
358 | else:
359 | sni = proxy.get('servername', "")
360 | publicKey = proxy.get('reality-opts', {}).get('public-key', "")
361 | short_id = proxy.get('reality-opts', {}).get('short-id', "")
362 | fingerprint = proxy.get('client-fingerprint', "")
363 | if(not publicKey == ""):
364 | proxy_url = f"vless://{uuid}@{server}:{port}?encryption=none&flow={flow}&security=reality&sni={sni}&fp={fingerprint}&pbk={publicKey}&sid={short_id}&type={network}&serviceName={grpc_serviceName}&host={ws_headers_host}&path={ws_path}#{name}"
365 | else:
366 | insecure = int(proxy.get('skip-cert-verify', 0))
367 | proxy_url = f"vless://{uuid}@{server}:{port}?encryption=none&flow={flow}&security=tls&sni={sni}&fp={fingerprint}&insecure={insecure}&type={network}&serviceName={grpc_serviceName}&host={ws_headers_host}&path={ws_path}#{name}"
368 |
369 | elif(proxy['type'] == "vmess"):
370 | name = proxy['name']
371 | server = proxy['server']
372 | port = proxy['port']
373 | uuid = proxy['uuid']
374 | alterId = proxy['alterId']
375 | if(int(proxy.get('tls', 0)) == 1):
376 | tls = "tls"
377 | else:
378 | tls = ""
379 | sni = proxy.get('servername', "")
380 | network = proxy['network']
381 | if(network == "tcp"):
382 | type = "none"
383 | path = ""
384 | host = ""
385 | elif(network == "ws"):
386 | type = "none"
387 | path = proxy.get('ws-opts', {}).get('path', "")
388 | # `.get` 不会抛出 KeyError,原 try/except 的 Host 分支永远不会执行
389 | # 这里同时兼容小写 host 与大写 Host 两种写法
390 | headers = proxy.get('ws-opts', {}).get('headers', {})
391 | host = headers.get('host') or headers.get('Host', "")
392 | elif(network == "grpc"):
393 | type = "gun"
394 | path = proxy.get('grpc-opts', {}).get('grpc-service-name', "")
395 | host = ""
396 | elif(network == "h2"):
397 | type = "none"
398 | path = proxy.get('h2-opts', {}).get('path', "")
399 | # 获取host并将host列表转换为逗号分隔的字符串
400 | host = proxy.get('h2-opts', {}).get('host', [])
401 | host = ','.join(host)
402 | else:
403 | continue
404 | vmess_meta = {
405 | "v": "2",
406 | "ps": name,
407 | "add": server,
408 | "port": port,
409 | "id": uuid,
410 | "aid": alterId,
411 | "net": network,
412 | "type": type,
413 | "host": host,
414 | "path": path,
415 | "tls": tls,
416 | "sni": sni,
417 | "alpn": ""
418 | }
419 | # 将字典`vmess_meta`转换为 JSON 格式字符串并进行 Base64 编码
420 | vmess_meta = base64.b64encode(json.dumps(vmess_meta).encode('utf-8')).decode('utf-8')
421 | # 合并为完整的 `vmess://` URL
422 | proxy_url = "vmess://" + vmess_meta
423 |
424 | elif(proxy['type'] == "ss"):
425 | name = proxy['name']
426 | server = proxy['server']
427 | port = proxy['port']
428 | password = proxy['password']
429 | cipher = proxy['cipher']
430 | ss_meta = base64.b64encode(f"{cipher}:{password}".encode('utf-8')).decode('utf-8')
431 | ss_meta = f"{ss_meta}@{server}:{port}#{name}"
432 | proxy_url = "ss://" + ss_meta
433 |
434 |
435 | elif(proxy['type'] == "hysteria"):
436 | name = proxy['name']
437 | server = proxy['server']
438 | port = proxy['port']
439 | protocol = proxy.get('protocol', "udp")
440 | insecure = int(proxy.get('skip-cert-verify', 0))
441 | peer = proxy.get('sni', "")
442 | try:
443 | auth = proxy['auth-str']
444 | except KeyError:
445 | auth = proxy['auth_str']
446 | upmbps = proxy.get('up', "11")
447 | downmbps = proxy.get('down', "55")
448 | alpn = proxy['alpn']
449 | alpn = ','.join(alpn) # 将 `alpn` 列表转换为逗号分隔的字符串
450 | obfs = proxy.get('obfs', "")
451 | proxy_url = f"hysteria://{server}:{port}/?protocol={protocol}&insecure={insecure}&peer={peer}&auth={auth}&upmbps={upmbps}&downmbps={downmbps}&alpn={alpn}&obfs={obfs}#{name}"
452 |
453 | elif(proxy['type'] == "hysteria2"):
454 | name = proxy['name']
455 | server = proxy['server']
456 | port = proxy['port']
457 | auth = proxy['password']
458 | sni = proxy.get('sni', "")
459 | insecure = int(proxy.get('skip-cert-verify', 0))
460 | # 如果`proxy`包含`obfs`字段,且`obfs`字段不为空
461 | if("obfs" in proxy and proxy['obfs'] != ""):
462 | obfs = proxy['obfs']
463 | obfs_password = proxy['obfs-password']
464 | proxy_url = f"hysteria2://{auth}@{server}:{port}/?sni={sni}&insecure={insecure}&obfs={obfs}&obfs-password={obfs_password}#{name}"
465 | else:
466 | proxy_url = f"hysteria2://{auth}@{server}:{port}/?sni={sni}&insecure={insecure}#{name}"
467 |
468 | elif(proxy['type'] == "tuic"):
469 | name = proxy['name']
470 | server = proxy['server']
471 | port = proxy['port']
472 | uuid = proxy['uuid']
473 | password = proxy.get('password', "")
474 | congestion_controller = proxy.get('congestion-controller', "bbr")
475 | udp_relay_mode = proxy.get('udp-relay-mode', "naive")
476 | sni = proxy.get('sni', "")
477 | alpn = proxy.get('alpn', [])
478 | alpn = ','.join(alpn) # 将 `alpn` 列表转换为逗号分隔的字符串
479 | allowInsecure = int(proxy.get('skip-cert-verify', 1))
480 | disable_sni = int(proxy.get('disable-sni', 0))
481 | proxy_url = f"tuic://{uuid}:{password}@{server}:{port}/?congestion_controller={congestion_controller}&udp_relay_mode={udp_relay_mode}&sni={sni}&alpn={alpn}&allow_insecure={allowInsecure}&disable_sni={disable_sni}#{name}"
482 |
483 | else:
484 | logging.error(f"处理 {proxy['name']} 时遇到问题: 不支持的协议: {proxy['type']}")
485 | continue
486 |
487 | # print(proxy_url)
488 | proxy_urls.append(proxy_url)
489 | except Exception as e:
490 | logging.error(f"处理 {proxy['name']} 时遇到问题: {e}")
491 | continue
492 | # 将`proxy_urls`写入`output_file`
493 | with open(output_file, 'w', encoding='utf-8') as f:
494 | for proxy_url in proxy_urls:
495 | f.write(proxy_url + "\n")
496 |
497 | def write_base64_file(output_file, proxy_urls_file):
498 | with open(proxy_urls_file, 'r', encoding='utf-8') as f:
499 | proxy_urls = f.read()
500 | with open(output_file, 'w', encoding='utf-8') as f:
501 | f.write(base64.b64encode(proxy_urls.encode('utf-8')).decode('utf-8'))
502 |
503 | if __name__ == "__main__":
504 | extracted_proxies = []
505 | servers_list = []
506 |
507 | # 处理clash meta urls
508 | process_urls("./urls/clash_meta_urls.txt", process_clash_meta)
509 |
510 | # 处理hysteria urls
511 | process_urls("./urls/hysteria_urls.txt", process_hysteria)
512 |
513 | # 处理hysteria2 urls
514 | process_urls("./urls/hysteria2_urls.txt", process_hysteria2)
515 |
516 | # 处理Xray urls
517 | process_urls("./urls/xray_urls.txt", process_xray)
518 |
519 | # logging.info(servers_list)
520 |
521 | # # 写入clash meta配置
522 | write_clash_meta_profile("./templates/clash_meta.yaml", "./outputs/clash_meta.yaml", extracted_proxies)
523 | write_clash_meta_profile("./templates/clash_meta_warp.yaml", "./outputs/clash_meta_warp.yaml", extracted_proxies)
524 |
525 | # 写入代理urls
526 | write_proxy_urls_file("./outputs/proxy_urls.txt", extracted_proxies)
527 |
528 | write_base64_file("./outputs/base64.txt", "./outputs/proxy_urls.txt")
529 |
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | PyYAML==6.0.1
2 | geoip2
--------------------------------------------------------------------------------
/urls/clash_meta_urls.txt:
--------------------------------------------------------------------------------
1 | https://raw.githubusercontent.com/Alvin9999/pac2/master/clash.meta2/1/config.yaml
2 | https://raw.githubusercontent.com/Alvin9999/pac2/master/clash.meta2/config.yaml
3 | https://raw.githubusercontent.com/Alvin9999/pac2/master/clash.meta2/13/config.yaml
4 | https://raw.githubusercontent.com/Alvin9999/pac2/master/clash.meta2/15/config.yaml
5 | https://raw.githubusercontent.com/Alvin9999/pac2/master/clash.meta2/2/config.yaml
6 | https://raw.githubusercontent.com/Alvin9999/pac2/master/clash.meta2/3/config.yaml
7 | https://raw.githubusercontent.com/Alvin9999/pac2/master/quick/4/config.yaml
8 | https://raw.githubusercontent.com/Alvin9999/pac2/master/quick/1/config.yaml
9 | https://raw.githubusercontent.com/Alvin9999/pac2/master/quick/config.yaml
10 | https://raw.githubusercontent.com/Alvin9999/pac2/master/quick/3/config.yaml
11 |
--------------------------------------------------------------------------------
/urls/hysteria2_urls.txt:
--------------------------------------------------------------------------------
1 | https://raw.githubusercontent.com/Alvin9999/pac2/master/hysteria2/config.json
2 | https://raw.githubusercontent.com/Alvin9999/pac2/master/hysteria2/1/config.json
3 | https://raw.githubusercontent.com/Alvin9999/pac2/master/hysteria2/2/config.json
4 | https://raw.githubusercontent.com/Alvin9999/pac2/master/hysteria2/13/config.json
--------------------------------------------------------------------------------
/urls/hysteria_urls.txt:
--------------------------------------------------------------------------------
1 | https://raw.githubusercontent.com/Alvin9999/pac2/master/hysteria/1/config.json
2 | https://raw.githubusercontent.com/Alvin9999/pac2/master/hysteria/13/config.json
3 | https://raw.githubusercontent.com/Alvin9999/pac2/master/hysteria/2/config.json
4 | https://raw.githubusercontent.com/Alvin9999/pac2/master/hysteria/config.json
--------------------------------------------------------------------------------
/urls/naiverproxy_urls.txt:
--------------------------------------------------------------------------------
1 | https://raw.githubusercontent.com/Alvin9999/PAC/master/naiveproxy/1/config.json
2 | https://raw.githubusercontent.com/Alvin9999/PAC/master/naiveproxy/config.json
--------------------------------------------------------------------------------
/urls/singbox_urls.txt:
--------------------------------------------------------------------------------
1 | https://raw.githubusercontent.com/Alvin9999/pac2/master/singbox/1/config.json
2 | https://raw.githubusercontent.com/Alvin9999/pac2/master/singbox/config.json
--------------------------------------------------------------------------------
/urls/ss_urls.txt:
--------------------------------------------------------------------------------
1 | https://raw.githubusercontent.com/Alvin9999/pac2/master/ssr-wj/ssconfig.txt
2 | https://raw.githubusercontent.com/Alvin9999/pac2/master/ssr-wj/1/ssconfig.txt
--------------------------------------------------------------------------------
/urls/xray_urls.txt:
--------------------------------------------------------------------------------
1 | https://raw.githubusercontent.com/Alvin9999/pac2/master/xray/1/config.json
2 | https://raw.githubusercontent.com/Alvin9999/pac2/master/xray/2/config.json
3 | https://raw.githubusercontent.com/Alvin9999/pac2/master/xray/3/config.json
4 | https://raw.githubusercontent.com/Alvin9999/pac2/master/xray/config.json
--------------------------------------------------------------------------------