├── .gitignore ├── AUTHORS ├── LICENSE ├── MANIFEST.in ├── README ├── curlish.py ├── docs ├── Makefile ├── _static │ ├── fireteam.png │ └── screenshot.png ├── _themes │ ├── .gitignore │ └── curlish │ │ ├── layout.html │ │ ├── static │ │ ├── MyriadPro-Light.otf │ │ └── curlish.css_t │ │ └── theme.conf ├── conf.py ├── index.rst └── make.bat ├── install.sh ├── setup.cfg ├── setup.py └── setup_freeze.py /.gitignore: -------------------------------------------------------------------------------- 1 | docs/_build 2 | .DS_Store 3 | build 4 | dist 5 | curl.exe 6 | -------------------------------------------------------------------------------- /AUTHORS: -------------------------------------------------------------------------------- 1 | Curlish is written and maintained by Fireteam Ltd. 2 | 3 | - Armin Ronacher 4 | - Arnout van Meer 5 | - Dave Johnston 6 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Copyright (c) 2012 by Fireteam Ltd., see AUTHORS for more details. 2 | 3 | Redistribution and use in source and binary forms, with or without 4 | modification, are permitted provided that the following conditions are 5 | met: 6 | 7 | * Redistributions of source code must retain the above copyright 8 | notice, this list of conditions and the following disclaimer. 9 | 10 | * Redistributions in binary form must reproduce the above 11 | copyright notice, this list of conditions and the following 12 | disclaimer in the documentation and/or other materials provided 13 | with the distribution. 14 | 15 | * The names of the contributors may not be used to endorse or 16 | promote products derived from this software without specific 17 | prior written permission. 18 | 19 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS 20 | "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT 21 | LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR 22 | A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT 23 | OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, 24 | SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT 25 | LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, 26 | DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY 27 | THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT 28 | (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE 29 | OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 30 | -------------------------------------------------------------------------------- /MANIFEST.in: -------------------------------------------------------------------------------- 1 | include README 2 | -------------------------------------------------------------------------------- /README: -------------------------------------------------------------------------------- 1 | 2 | // curlish - curl with flames on top 3 | 4 | .(' 5 | /%/\\' 6 | (%(%))' 7 | curl' 8 | 9 | Ever had to speak to an OAuth 2.0 protected 10 | resource for debugging purposes? curl is a 11 | nice tool, but it totally lacks helpers for 12 | dealing with oauth. 13 | 14 | curlish comes for the rescue. It is able to 15 | remember access tokens for you and inject it 16 | into requests. 
17 | 18 | Facebook is preconfigured so that you can dive 19 | into testing it: 20 | 21 | $ curlish https://graph.facebook.com/me 22 | 23 | To add more sites you can directly modify the 24 | config file which is located, conveniently 25 | in ~/.ftcurlish.json 26 | 27 | Requirements: Python 2.6 or higher. 28 | 29 | Full automated installation: 30 | 31 | $ curl -L http://bit.ly/curlish | bash 32 | 33 | Installs curlish into ~/.bin for you. 34 | 35 | (If you want to know what it executes, have a 36 | look at the install.sh file in this repo) 37 | 38 | For advanced use see the ~/.ftcurlish.json file 39 | which can be used to automatically add extra 40 | headers to all requests and more. You can 41 | also use the thing with non OAuth endpoints 42 | by just not configuring OAuth. Then it just 43 | injects extra headers and colorizes output. 44 | 45 | -------------------------------------------------------------------------------- /curlish.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | """ 4 | .(' 5 | /%/\\' 6 | (%(%))' 7 | curl' 8 | 9 | Before you can use curlish you need to register the site with the curlish 10 | client. For that you can use the --add-site parameter which will walk you 11 | through the process. 12 | 13 | example: 14 | 15 | $ curlish https://graph.facebook.com/me 16 | 17 | Notes on the authorization_code grant type: curlish spawns an HTTP server 18 | that handles a single request on http://127.0.0.1:62231/ which acts as a 19 | valid redirect target. If you need the authorization_code grant, let it 20 | redirect there. 21 | 22 | common curl options: 23 | -v verbose mode 24 | -i prints the headers 25 | -X METHOD specifies the method 26 | -H "Header: value" emits a header with a value 27 | -d "key=value" emits a pair of form data 28 | 29 | curl extension options: 30 | METHOD shortcut for -XMETHOD if it's one of the known 31 | HTTP methods. 32 | -J key=value transmits a JSON string value. 33 | -J key:=value transmits raw JSON data for a key (bool int etc.) 34 | -J @/path/to/file transmits JSON data loaded from a file. 35 | -J key=@value transmits JSON data loaded from a file for a key. 36 | --ajax Sends an X-Requested-With header with the value 37 | set to XMLHttpRequest. 38 | --cookies Enables simple cookie handling for this request. 39 | It will store cookies in ~/.ftcurlish-cookies as 40 | individual text files for each site. Use 41 | --clear-cookies to remove them. 42 | --hide-jsonp If curlish detects a JSONP response it will by 43 | default keep the wrapper function call around. 44 | If this is set it will appear as if it was a 45 | regular JSON response. 
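example of combining the -J shortcuts (a sketch; the URL and the field
names are placeholders, dotted keys build nested objects and numeric
key parts build lists):

    $ curlish POST https://graph.facebook.com/me -J name=John \
          -J friend.0.name=Jane -J friend.0.blocked:=false

sends a request with Content-Type: application/json and the body
{"name": "John", "friend": [{"name": "Jane", "blocked": false}]}.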
46 | """ 47 | from __future__ import with_statement 48 | import os 49 | import re 50 | import sys 51 | import cgi 52 | import webbrowser 53 | import argparse 54 | try: 55 | import json 56 | from json.encoder import JSONEncoder 57 | except ImportError: 58 | import simplejson as json 59 | from simplejson.encoder import JSONEncoder 60 | import urllib 61 | import urlparse 62 | import subprocess 63 | import base64 64 | from copy import deepcopy 65 | from httplib import HTTPConnection, HTTPSConnection 66 | from BaseHTTPServer import HTTPServer, BaseHTTPRequestHandler 67 | from getpass import getpass 68 | from uuid import UUID 69 | 70 | 71 | # Not set when frozen 72 | globals().setdefault('__file__', 'curlish.py') 73 | 74 | 75 | def str_to_uuid(s): 76 | try: 77 | UUID(s) 78 | return s 79 | except: 80 | print "%s is not a valid UUID" % s 81 | sys.exit(1) 82 | 83 | 84 | KNOWN_HTTP_METHODS = set(['GET', 'POST', 'HEAD', 'PUT', 'OPTIONS', 85 | 'TRACE', 'DELETE', 'PATCH']) 86 | 87 | DEFAULT_SETTINGS = { 88 | 'curl_path': None, 89 | 'http_port': 62231, 90 | 'json_indent': 2, 91 | 'sort_keys': True, 92 | 'colors': { 93 | 'statusline_ok': 'green', 94 | 'statusline_error': 'red', 95 | 'header': 'teal', 96 | 'brace': 'teal', 97 | 'operator': None, 98 | 'constant': 'blue', 99 | 'number': 'purple', 100 | 'string': 'yellow', 101 | 'objstring': 'green', 102 | 'jsonpfunc': None 103 | }, 104 | 'sites': { 105 | "facebook": { 106 | "extra_headers": {}, 107 | "request_token_params": { 108 | "scope": "email" 109 | }, 110 | "authorize_url": "https://www.facebook.com/dialog/oauth", 111 | "base_url": "https://graph.facebook.com/", 112 | "client_id": "384088028278656", 113 | "client_secret": "14c75a494cda2e11e8760095ec972915", 114 | "grant_type": "authorization_code", 115 | "access_token_url": "/oauth/access_token" 116 | } 117 | }, 118 | 'token_cache': {} 119 | } 120 | ANSI_CODES = { 121 | 'black': '\x1b[30m', 122 | 'blink': '\x1b[05m', 123 | 'blue': '\x1b[34m', 124 | 'bold': '\x1b[01m', 125 | 'faint': '\x1b[02m', 126 | 'green': '\x1b[32m', 127 | 'purple': '\x1b[35m', 128 | 'red': '\x1b[31m', 129 | 'reset': '\x1b[39;49;00m', 130 | 'standout': '\x1b[03m', 131 | 'teal': '\x1b[36m', 132 | 'underline': '\x1b[04m', 133 | 'white': '\x1b[37m', 134 | 'yellow': '\x1b[33m' 135 | } 136 | 137 | 138 | _list_marker = object() 139 | _value_marker = object() 140 | _jsonp_re = re.compile(r'^(.*?)\s*\((.+?)\);?\s*$(?ms)') 141 | 142 | 143 | def decode_flat_data(pairiter): 144 | def _split_key(name): 145 | result = name.split('.') 146 | for idx, part in enumerate(result): 147 | if part.isdigit(): 148 | result[idx] = int(part) 149 | return result 150 | 151 | def _enter_container(container, key): 152 | if key not in container: 153 | return container.setdefault(key, {_list_marker: False}) 154 | return container[key] 155 | 156 | def _convert(container): 157 | if _value_marker in container: 158 | force_list = False 159 | values = container.pop(_value_marker) 160 | if container.pop(_list_marker): 161 | force_list = True 162 | values.extend(_convert(x[1]) for x in 163 | sorted(container.items())) 164 | if not force_list and len(values) == 1: 165 | values = values[0] 166 | return values 167 | elif container.pop(_list_marker): 168 | return [_convert(x[1]) for x in sorted(container.items())] 169 | return dict((k, _convert(v)) for k, v in container.iteritems()) 170 | 171 | result = {_list_marker: False} 172 | for key, value in pairiter: 173 | parts = _split_key(key) 174 | if not parts: 175 | continue 176 | container = result 177 | for part in parts: 
178 | last_container = container 179 | container = _enter_container(container, part) 180 | last_container[_list_marker] = isinstance(part, (int, long)) 181 | container[_value_marker] = [value] 182 | 183 | return _convert(result) 184 | 185 | 186 | def get_color(element): 187 | user_colors = settings.values['colors'] 188 | name = user_colors.get(element) 189 | if name is None and element not in user_colors: 190 | name = DEFAULT_SETTINGS['colors'].get(element) 191 | if name is not None: 192 | return ANSI_CODES.get(name, '') 193 | return '' 194 | 195 | 196 | def isatty(stream): 197 | """Is stdout connected to a terminal or a file?""" 198 | if not hasattr(stream, 'isatty'): 199 | return False 200 | if not stream.isatty(): 201 | return False 202 | return True 203 | 204 | 205 | def is_color_terminal(stream=None): 206 | """Returns `True` if this terminal has colors.""" 207 | if stream is None: 208 | stream = sys.stdout 209 | if not isatty(stream): 210 | return False 211 | if 'COLORTERM' in os.environ: 212 | return True 213 | term = os.environ.get('TERM', 'dumb').lower() 214 | if term in ('xterm', 'linux') or 'color' in term: 215 | return True 216 | return False 217 | 218 | 219 | def fail(message): 220 | """Fails with an error message.""" 221 | print >> sys.stderr, 'error:', message 222 | sys.exit(1) 223 | 224 | 225 | def find_url_arg(arguments): 226 | """Finds the URL argument in a curl argument list.""" 227 | for idx, arg in enumerate(arguments): 228 | if arg.startswith(('http:', 'https:')): 229 | return idx 230 | 231 | 232 | class AuthorizationHandler(BaseHTTPRequestHandler): 233 | """Callback handler for the code based authorization""" 234 | 235 | def do_GET(self): 236 | self.send_response(200, 'OK') 237 | self.send_header('Content-Type', 'text/html') 238 | self.end_headers() 239 | self.server.token_response = dict((k, v[-1]) for k, v in 240 | cgi.parse_qs(self.path.split('?')[-1]).iteritems()) 241 | if 'code' in self.server.token_response: 242 | title = 'Tokens Received' 243 | text = 'The tokens were transmitted successfully to curlish.' 244 | else: 245 | title = 'Error on Token Exchange' 246 | text = 'Could not exchange tokens :-(' 247 | self.wfile.write(''' 248 | 249 | %(title)s 250 | 255 |

%(title)s

256 |

%(text)s 257 |

You can now close this window, it's no longer needed. 258 | ''' % locals()) 259 | self.wfile.close() 260 | 261 | def log_message(self, *args, **kwargs): 262 | pass 263 | 264 | 265 | def get_cookie_path(): 266 | if os.name == 'nt': 267 | return os.path.expandvars(r'%APPDATA%\\FireteamCurlish\\Cookies') 268 | return os.path.expanduser(r'~/.ftcurlish-cookies') 269 | 270 | 271 | class Settings(object): 272 | """Wrapper around the settings file""" 273 | 274 | def __init__(self): 275 | if os.name == 'nt': 276 | self.filename = os.path.expandvars(r'%APPDATA%\\FireteamCurlish\\config.json') 277 | else: 278 | self.filename = os.path.expanduser(r'~/.ftcurlish.json') 279 | 280 | rv = deepcopy(DEFAULT_SETTINGS) 281 | if os.path.isfile(self.filename): 282 | with open(self.filename) as f: 283 | try: 284 | rv.update(json.load(f)) 285 | except Exception, e: 286 | fail('Error: JSON error in config file: %s' % e) 287 | if not rv['curl_path']: 288 | rv['curl_path'] = get_default_curl_path() 289 | self.values = rv 290 | 291 | def save(self): 292 | dirname = os.path.dirname(self.filename) 293 | try: 294 | os.makedirs(dirname) 295 | except OSError: 296 | pass 297 | with open(self.filename, 'w') as f: 298 | json.dump(self.values, f, indent=2) 299 | 300 | 301 | class Site(object): 302 | """Represents a single site.""" 303 | 304 | def __init__(self, name, values): 305 | def _full_url(url): 306 | if self.base_url is not None: 307 | return urlparse.urljoin(self.base_url, url) 308 | return url 309 | self.name = name 310 | self.base_url = values.get('base_url') 311 | self.grant_type = values.get('grant_type', 'authorization_code') 312 | self.access_token_url = _full_url(values.get('access_token_url')) 313 | self.authorize_url = _full_url(values.get('authorize_url')) 314 | self.client_id = values.get('client_id') 315 | self.client_secret = values.get('client_secret') 316 | self.request_token_params = values.get('request_token_params') or {} 317 | self.extra_headers = values.get('extra_headers') or {} 318 | self.bearer_transmission = values.get('bearer_transmission', 'query') 319 | self.default = values.get('default', False) 320 | self.access_token = None 321 | 322 | def make_request(self, method, url, headers=None, data=None): 323 | """Makes an HTTP request to the site.""" 324 | u = urlparse.urlparse(url) 325 | pieces = u.netloc.rsplit(':', 1) 326 | secure = u.scheme == 'https' 327 | host = pieces[0].strip('[]') 328 | if len(pieces) == 2 and pieces[-1].isdigit(): 329 | port = int(pieces[-1]) 330 | else: 331 | port = secure and 443 or 80 332 | conncls = secure and HTTPSConnection or HTTPConnection 333 | conn = conncls(host, port) 334 | if isinstance(data, dict): 335 | data = urllib.urlencode(data) 336 | 337 | real_headers = self.extra_headers.copy() 338 | real_headers.update(headers or ()) 339 | 340 | conn.request(method, u.path, data, real_headers) 341 | resp = conn.getresponse() 342 | 343 | ct = resp.getheader('Content-Type') 344 | if ct.startswith('application/json') or ct.startswith('text/javascript'): 345 | resp_data = json.loads(resp.read()) 346 | elif ct.startswith('text/html'): 347 | fail('Invalid response from server: ' + resp.read()) 348 | else: 349 | resp_data = dict((k, v[-1]) for k, v in 350 | cgi.parse_qs(resp.read()).iteritems()) 351 | 352 | return resp.status, resp_data 353 | 354 | def get_access_token(self, params): 355 | """Tries to load tokens with the given parameters.""" 356 | data = params.copy() 357 | 358 | # Provide the credentials both as a basic authorization header as well as 359 | # the 
parameters in the URL. Should make everybody happy. At least I hope so. 360 | data['client_id'] = self.client_id 361 | data['client_secret'] = self.client_secret 362 | creds = self.client_id + ':' + self.client_secret 363 | headers = {'Content-Type': 'application/x-www-form-urlencoded; charset=utf-8', 364 | 'Authorization': 'Basic ' + base64.b64encode(creds)} 365 | 366 | status, data = self.make_request('POST', 367 | self.access_token_url, data=data, headers=headers) 368 | if status in (200, 201): 369 | return data['access_token'] 370 | error = data.get('error') or 'unknown_error' 371 | if error in ('invalid_grant', 'access_denied'): 372 | return None 373 | error_msg = data.get('error_description') or 'no description' 374 | fail("Couldn't authorize: %s: %s" % (error, error_msg)) 375 | 376 | def request_password_grant(self): 377 | while 1: 378 | params = {'grant_type': 'password'} 379 | params['username'] = raw_input('Username: ') 380 | params['password'] = getpass() 381 | params.update(self.request_token_params) 382 | rv = self.get_access_token(params) 383 | if rv is None: 384 | print 'Error: invalid credentials' 385 | continue 386 | settings.values['token_cache'][self.name] = rv 387 | return 388 | 389 | def request_client_credentials_grant(self): 390 | params = {'grant_type': 'client_credentials'} 391 | params.update(self.request_token_params) 392 | rv = self.get_access_token(params) 393 | if rv is None: 394 | print 'Error: client_credentials token request failed' 395 | else: 396 | settings.values['token_cache'][self.name] = rv 397 | 398 | def request_authorization_code_grant(self): 399 | redirect_uri = 'http://127.0.0.1:%d/' % settings.values['http_port'] 400 | params = { 401 | 'client_id': self.client_id, 402 | 'redirect_uri': redirect_uri, 403 | 'response_type': 'code' 404 | } 405 | params.update(self.request_token_params) 406 | browser_url = '%s?%s' % ( 407 | self.authorize_url, 408 | urllib.urlencode(params) 409 | ) 410 | webbrowser.open(browser_url) 411 | server_address = ('127.0.0.1', settings.values['http_port']) 412 | httpd = HTTPServer(server_address, AuthorizationHandler) 413 | httpd.token_response = None 414 | httpd.handle_request() 415 | if 'code' in httpd.token_response: 416 | return self.exchange_code_for_token(httpd.token_response['code'], 417 | redirect_uri) 418 | print 'Could not sign in: grant cancelled' 419 | for key, value in httpd.token_response.iteritems(): 420 | print ' %s: %s' % (key, value) 421 | sys.exit(1) 422 | 423 | def exchange_code_for_token(self, code, redirect_uri): 424 | settings.values['token_cache'][self.name] = self.get_access_token({ 425 | 'code': code, 426 | 'grant_type': 'authorization_code', 427 | 'redirect_uri': redirect_uri 428 | }) 429 | 430 | def request_tokens(self): 431 | if self.grant_type == 'password': 432 | self.request_password_grant() 433 | elif self.grant_type == 'authorization_code': 434 | self.request_authorization_code_grant() 435 | elif self.grant_type == 'client_credentials': 436 | self.request_client_credentials_grant() 437 | else: 438 | fail('Invalid grant configured: %s' % self.grant_type) 439 | 440 | def fetch_token_if_necessarys(self): 441 | token_cache = settings.values['token_cache'] 442 | if token_cache.get(self.name) is None: 443 | self.request_tokens() 444 | self.access_token = token_cache[self.name] 445 | 446 | 447 | def get_site_by_name(name): 448 | """Finds a site by its name.""" 449 | rv = settings.values['sites'].get(name) 450 | if rv is not None: 451 | return Site(name, rv) 452 | 453 | 454 | def 
get_site(site_name, url_arg): 455 | """Tries to look up a site from the config or automatically.""" 456 | if site_name is not None: 457 | site = get_site_by_name(site_name) 458 | if site is not None: 459 | return site 460 | fail('Site %s does not exist' % site_name) 461 | 462 | matches = [] 463 | for name, site in settings.values['sites'].iteritems(): 464 | base_url = site.get('base_url') 465 | if base_url and url_arg.startswith(base_url): 466 | matches.append(Site(name, site)) 467 | if len(matches) == 1: 468 | return matches[0] 469 | for match in matches: 470 | if match.default: 471 | return match 472 | if len(matches) > 1: 473 | fail('Too many matches. Please specificy an application ' 474 | 'explicitly or set a default') 475 | 476 | 477 | def get_default_curl_path(): 478 | """Tries to find curl and returns the path to it.""" 479 | def tryrun(path): 480 | try: 481 | subprocess.call([path, '--version'], stdout=subprocess.PIPE, 482 | stdin=subprocess.PIPE) 483 | except OSError: 484 | return False 485 | return True 486 | if tryrun('curl'): 487 | return 'curl' 488 | base = os.path.abspath(os.path.dirname(__file__)) 489 | for name in 'curl', 'curl.exe': 490 | fullpath = os.path.join(base, name) 491 | print fullpath 492 | if tryrun(fullpath): 493 | return fullpath 494 | 495 | 496 | def colorize_json_stream(iterator): 497 | """Adds colors to a JSON event stream.""" 498 | for event in iterator: 499 | color = None 500 | e = event.strip() 501 | if e in '[]{}': 502 | color = get_color('brace') 503 | elif e in ',:': 504 | color = get_color('operator') 505 | elif e[:1] == '"': 506 | color = get_color('string') 507 | elif e in ('true', 'false', 'null'): 508 | color = get_color('constant') 509 | else: 510 | color = get_color('number') 511 | if color is not None: 512 | event = color + event + ANSI_CODES['reset'] 513 | yield event 514 | 515 | 516 | def print_formatted_json(json_data, jsonp_func=None, stream=None): 517 | """Reindents JSON and colorizes if wanted. We use our own wrapper 518 | around json.dumps because we want to inject colors and the simplejson 519 | iterator encoder does some buffering between separate events that makes 520 | it really hard to inject colors. 
521 | """ 522 | if stream is None: 523 | stream = sys.stdout 524 | if is_color_terminal(stream): 525 | def colorize(colorname, text): 526 | color = get_color(colorname) 527 | reset = ANSI_CODES['reset'] 528 | return color + text + reset 529 | else: 530 | colorize = lambda x, t: t 531 | 532 | def _walk(obj, indentation, inline=False, w=stream.write): 533 | i = ' ' * (indentation * settings.values['json_indent']) 534 | if not inline: 535 | w(i) 536 | if isinstance(obj, basestring): 537 | w(colorize('string', json.dumps(obj))) 538 | elif isinstance(obj, (int, long, float)): 539 | w(colorize('number', json.dumps(obj))) 540 | elif obj in (True, False, None): 541 | w(colorize('constant', json.dumps(obj))) 542 | elif isinstance(obj, list): 543 | if not obj: 544 | w(colorize('brace', '[]')) 545 | else: 546 | w(colorize('brace', '[\n')) 547 | for idx, item in enumerate(obj): 548 | if idx: 549 | w(colorize('operator', ',\n')) 550 | _walk(item, indentation + 1) 551 | w(colorize('brace', '\n' + i + ']')) 552 | elif isinstance(obj, dict): 553 | if not obj: 554 | w(colorize('brace', '{}')) 555 | else: 556 | w(colorize('brace', '{\n')) 557 | if settings.values['sort_keys']: 558 | items = sorted(obj.items(), key=lambda x: x[0].lower()) 559 | else: 560 | items = obj.iteritems() 561 | for idx, (key, value) in enumerate(items): 562 | if idx: 563 | w(colorize('operator', ',\n')) 564 | ki = i + ' ' * settings.values['json_indent'] 565 | w(ki + colorize('objstring', json.dumps(key))) 566 | w(colorize('operator', ': ')) 567 | _walk(value, indentation + 1, inline=True) 568 | w(i + colorize('brace', '\n' + i + '}')) 569 | else: 570 | # hmm. should not happen, but let's just assume it might 571 | # because of json changes 572 | w(json.dumps(obj)) 573 | 574 | 575 | if jsonp_func is not None: 576 | stream.write(colorize('jsonpfunc', jsonp_func)) 577 | stream.write(colorize('brace', '(')) 578 | _walk(json_data, 0) 579 | stream.write(colorize('brace', ')')) 580 | stream.write(colorize('operator', ';')) 581 | else: 582 | _walk(json_data, 0) 583 | stream.write('\n') 584 | stream.flush() 585 | 586 | 587 | def beautify_curl_output(p, hide_headers, hide_jsonp=False, 588 | stream=None, json_stream=False): 589 | """Parses curl output and adds colors and reindents as necessary.""" 590 | if stream is None: 591 | stream = sys.stdout 592 | json_body = False 593 | might_be_javascript = False 594 | has_colors = is_color_terminal() 595 | 596 | # Headers 597 | while 1: 598 | line = p.stdout.readline() 599 | if not line: 600 | break 601 | if has_colors and re.search(r'^HTTP/', line): 602 | if re.search('HTTP/\d+.\d+ [45]\d+', line): 603 | color = get_color('statusline_error') 604 | else: 605 | color = get_color('statusline_ok') 606 | if not hide_headers: 607 | stream.write(color + line + ANSI_CODES['reset']) 608 | continue 609 | if re.search(r'^Content-Type:\s*(text/javascript|application/(.+?\+)?json)\s*(?i)', line): 610 | json_body = True 611 | if 'javascript' in line: 612 | might_be_javascript = True 613 | if not hide_headers: 614 | # Nicer headers if we detect them 615 | if not line.startswith(' ') and ':' in line: 616 | key, value = line.split(':', 1) 617 | else: 618 | key = None 619 | if has_colors and key is not None: 620 | stream.write(get_color('header') + key + ANSI_CODES['reset'] 621 | + ': ' + value.lstrip()) 622 | else: 623 | stream.write(line) 624 | stream.flush() 625 | if line == '\r\n': 626 | break 627 | 628 | # JSON streams 629 | if json_stream: 630 | while 1: 631 | line = p.stdout.readline() 632 | if not line: 633 | 
break 634 | line = line.strip() 635 | if line: 636 | try: 637 | data = json.loads(line) 638 | except Exception: 639 | print 'invalid json:', line 640 | else: 641 | print_formatted_json(data, stream=stream) 642 | return 643 | 644 | iterable = p.stdout 645 | 646 | # JSON Body. Do not reindent if we have headers and are piping 647 | # into a file because of changing content length. 648 | if json_body and (hide_headers or isatty(stream)): 649 | body = ''.join(iterable) 650 | json_body = body 651 | jsonp_func = None 652 | if might_be_javascript: 653 | jsonp_match = _jsonp_re.match(body) 654 | if jsonp_match is not None: 655 | if not hide_jsonp: 656 | jsonp_func = jsonp_match.group(1) 657 | json_body = jsonp_match.group(2) 658 | try: 659 | data = json.loads(json_body) 660 | except Exception: 661 | # Something went wrong, it's malformed. Just make it an 662 | # iterable again and print it normally; 663 | iterable = body.splitlines(True) 664 | else: 665 | print_formatted_json(data, jsonp_func, stream) 666 | return 667 | 668 | # Regular body 669 | for line in iterable: 670 | stream.write(line) 671 | stream.flush() 672 | 673 | 674 | def clear_token_cache(site_name): 675 | """Delets all tokens or the token of a site.""" 676 | site = None 677 | if site_name is not None: 678 | site = get_site_by_name(site_name) 679 | if site is None: 680 | fail('Site %s does not exist' % site_name) 681 | if site is None: 682 | settings.values['token_cache'] = {} 683 | print 'Cleared the token cache' 684 | else: 685 | settings.values['token_cache'].pop(site.name, None) 686 | print 'Cleared the token cache for %s' % site.name 687 | settings.save() 688 | 689 | 690 | def init_config(): 691 | """Initializes the config""" 692 | print 'Initialized the config in %s' % settings.filename 693 | settings.save() 694 | 695 | 696 | def add_site(site_name): 697 | """Registers a new site with the config.""" 698 | def prompt(prompt, one_of=None, default=None): 699 | if default is not None: 700 | prompt += ' [%s]' % default 701 | if one_of: 702 | prompt += ' (options=%s)' % ', '.join(sorted(one_of)) 703 | while 1: 704 | value = raw_input(prompt + ': ') 705 | if value: 706 | if one_of and value not in one_of: 707 | print 'error: invalid value' 708 | continue 709 | return value 710 | if default is not None: 711 | return default 712 | 713 | authorize_url = None 714 | base_url = prompt('base_url') 715 | if prompt('Configure OAuth 2.0?', ['yes', 'no'], 'yes') == 'yes': 716 | grant_type = prompt('grant_type', 717 | one_of=['password', 'authorization_code'], 718 | default='authorization_code') 719 | access_token_url = prompt('access_token_url') 720 | if grant_type == 'authorization_code': 721 | authorize_url = prompt('authorize_url') 722 | client_id = prompt('client_id') 723 | client_secret = prompt('client_secret') 724 | bearer_transmission = prompt('bearer_transmission', 725 | one_of=['header', 'query'], default='query') 726 | else: 727 | grant_type = None 728 | access_token_url = None 729 | client_id = None 730 | client_secret = None 731 | bearer_transmission = None 732 | 733 | settings.values['sites'][site_name] = { 734 | 'extra_headers': {}, 735 | 'request_token_params': {}, 736 | 'base_url': base_url, 737 | 'grant_type': grant_type, 738 | 'base_url': base_url, 739 | 'access_token_url': access_token_url, 740 | 'authorize_url': authorize_url, 741 | 'client_id': client_id, 742 | 'client_secret': client_secret, 743 | 'grant_type': grant_type, 744 | 'bearer_transmission': bearer_transmission, 745 | 'default': False 746 | } 747 | 
settings.values['token_cache'].pop(site_name, None) 748 | settings.save() 749 | print 'Site %s added' % site_name 750 | 751 | 752 | def remove_site(site_name): 753 | """Removes a site from the config.""" 754 | try: 755 | settings.values['sites'].pop(site_name) 756 | except KeyError: 757 | fail('Site %s does not exist' % site_name) 758 | settings.save() 759 | print 'Site %s removed' % site_name 760 | 761 | 762 | def list_sites(): 763 | """Prints a list of all sites.""" 764 | print 'Registered sites:' 765 | print 766 | for name, site in sorted(settings.values['sites'].items()): 767 | print ' %s' % name 768 | for key, value in sorted(site.items()): 769 | if isinstance(value, dict): 770 | print ' %s:%s' % (key, not value and ' -' or '') 771 | for key, value in sorted(value.items()): 772 | print ' %s: %s' % (key, value) 773 | else: 774 | print ' %s: %s' % (key, value) 775 | print 776 | 777 | 778 | def clear_cookies(site_name): 779 | if site_name is None: 780 | import shutil 781 | try: 782 | shutil.rmtree(get_cookie_path()) 783 | except (OSError, IOError): 784 | pass 785 | print 'Deleted all cookies' 786 | return 787 | if site_name not in settings.values['sites']: 788 | fail('Site %s does not exist' % site_name) 789 | try: 790 | os.remove(os.path.join(get_cookie_path(), site_name + '.txt')) 791 | except OSError: 792 | pass 793 | print 'Cookies for %s deleted' % site_name 794 | 795 | 796 | def add_content_type_if_missing(args, content_type): 797 | """Very basic hack that adds a content type if no content type 798 | was mentioned so far. 799 | """ 800 | was_h = False 801 | for arg in args: 802 | iarg = arg.lower() 803 | if iarg.startswith('-hcontent-type'): 804 | return 805 | elif iarg == '-h': 806 | was_h = True 807 | elif was_h: 808 | if iarg.startswith('content-type'): 809 | return 810 | was_h = False 811 | args.append('-H') 812 | args.append('Content-Type: ' + content_type) 813 | 814 | 815 | def handle_curlish_arguments(site, args): 816 | new_args = [] 817 | json_pairs = [] 818 | use_cookies = False 819 | hide_jsonp = False 820 | 821 | argiter = iter(args) 822 | def _get_next_arg(error): 823 | try: 824 | return argiter.next() 825 | except StopIteration: 826 | fail('Error: ' + error) 827 | 828 | def handle_json_value(value): 829 | dkey = '' 830 | def _load_json_value(filename): 831 | try: 832 | with open(filename) as f: 833 | return json.load(f) 834 | except IOError as e: 835 | fail('Error: could not read from file: %s' % e) 836 | except Exception as e: 837 | fail('Error: invalid JSON data for "%s"' % dkey) 838 | 839 | if value[:1] == '@': 840 | value = _load_json_value(value[1:]) 841 | else: 842 | args = value.split('=', 1) 843 | if len(args) < 2: 844 | fail('Error: wrong argument count for -J') 845 | dkey, value = args 846 | if dkey.endswith(':'): 847 | dkey = dkey[:-1] 848 | try: 849 | value = json.loads(value) 850 | except Exception: 851 | fail('Error: invalid JSON data for "%s"' % dkey) 852 | elif value[:1] == '@': 853 | value = _load_json_value(value[1:]) 854 | json_pairs.append((dkey, value)) 855 | 856 | last_arg_was_x = False 857 | for idx, arg in enumerate(argiter): 858 | # Automatic -X in front of known http method names 859 | if arg in KNOWN_HTTP_METHODS and not last_arg_was_x: 860 | new_args.append('-X' + arg) 861 | # Shortcut for X-Requested-With 862 | elif arg == '--ajax': 863 | new_args.append('-H') 864 | new_args.append('X-Requested-With: XMLHttpRequest') 865 | # Cookie support 866 | elif arg == '--cookies': 867 | use_cookies = True 868 | # Hide JSONP function name? 
869 | elif arg == '--hide-jsonp': 870 | hide_jsonp = True 871 | # JSON data 872 | elif arg == '-J': 873 | handle_json_value(_get_next_arg('-J requires an argument')) 874 | elif arg.startswith('-J'): 875 | handle_json_value(arg[2:]) 876 | # Regular argument 877 | else: 878 | new_args.append(arg) 879 | last_arg_was_x = arg == '-X' 880 | 881 | json_data = decode_flat_data(json_pairs) 882 | need_json = bool(json_data) 883 | if len(json_data) == 1 and '' in json_data: 884 | json_data = json_data[''] 885 | 886 | if need_json: 887 | add_content_type_if_missing(new_args, 'application/json') 888 | new_args.append('--data-binary') 889 | new_args.append(json.dumps(json_data)) 890 | 891 | if use_cookies: 892 | cookie_path = get_cookie_path() 893 | if not os.path.isdir(cookie_path): 894 | os.makedirs(cookie_path) 895 | if site is None: 896 | cookie_filename = os.path.join(cookie_path, '_default.txt') 897 | else: 898 | cookie_filename = os.path.join(cookie_path, site.name + '.txt') 899 | new_args.extend(( 900 | '-c', cookie_filename, 901 | '-b', cookie_filename 902 | )) 903 | 904 | return new_args, {'hide_jsonp': hide_jsonp} 905 | 906 | 907 | def invoke_curl(site, curl_path, args, url_arg, dump_args=False, 908 | dump_response=None, json_stream=False): 909 | if args[0] == '--': 910 | args.pop(0) 911 | 912 | if not curl_path: 913 | fail('Could not find curl. Put it into your config') 914 | 915 | url = args[url_arg] 916 | if site is not None and site.bearer_transmission is not None: 917 | if site.bearer_transmission == 'header': 918 | args += ['-H', 'Authorization: Bearer %s' % site.access_token] 919 | elif site.bearer_transmission == 'query': 920 | url += ('?' in url and '&' or '?') + 'access_token=' + \ 921 | urllib.quote(site.access_token) 922 | else: 923 | fail('Bearer transmission %s is unknown.' % site.bearer_transmission) 924 | 925 | args[url_arg] = url 926 | 927 | if site is not None: 928 | for key, value in site.extra_headers.iteritems(): 929 | args += ['-H', '%s: %s' % (key, value)] 930 | 931 | # Force response headers 932 | hide_headers = False 933 | if '-i' not in args: 934 | args.append('-i') 935 | hide_headers = True 936 | 937 | # Hide stats but keep errors 938 | args.append('-sS') 939 | 940 | # Unbuffered 941 | args.append('-N') 942 | 943 | # Disable expect by default 944 | args.append('-HExpect:') 945 | 946 | # Handle curlish specific argument shortcuts 947 | args, options = handle_curlish_arguments(site, args) 948 | 949 | if dump_args: 950 | print ' '.join('"%s"' % x.replace('"', '\\"') if 951 | any(y.isspace() for y in x) else x for x in args) 952 | return 953 | 954 | p = subprocess.Popen([curl_path] + args, stdout=subprocess.PIPE, 955 | bufsize=1) 956 | if dump_response is not None: 957 | f = open(dump_response, 'w') 958 | else: 959 | f = sys.stdout 960 | beautify_curl_output(p, hide_headers, hide_jsonp=options['hide_jsonp'], 961 | stream=f, json_stream=json_stream) 962 | if f is not sys.stdout: 963 | f.close() 964 | sys.exit(p.wait()) 965 | 966 | 967 | # Load the settings once before we start up 968 | settings = Settings() 969 | 970 | 971 | def main(): 972 | parser = argparse.ArgumentParser(description="curl, with flames on top", 973 | add_help=False) 974 | parser.add_argument('-h', '--help', action='store_true', 975 | help='Prints this help.') 976 | parser.add_argument('--init-config', action='store_true', 977 | help='Adds an empty config if it is currently ' 978 | 'missing.') 979 | parser.add_argument('--site', help='The site to use. 
By default it will ' 980 | 'guess the site from the URL of the request preferring ' 981 | 'sites with default set to True.') 982 | parser.add_argument('--clear-token-cache', action='store_true', 983 | help='Clears the token cache. By default of all the ' 984 | 'sites, can be limited to one site with --site.') 985 | parser.add_argument('--add-site', help='Registers a new site with curlish.', 986 | metavar='NAME') 987 | parser.add_argument('--remove-site', help='Unregisters a site from curlish.', 988 | metavar='NAME') 989 | parser.add_argument('--list-sites', help='Lists all known sites', 990 | action='store_true') 991 | parser.add_argument('--clear-cookies', action='store_true', 992 | help='Deletes all the cookies or cookies that belong ' 993 | 'to one specific site only.') 994 | parser.add_argument('--dump-curl-args', action='store_true', 995 | help='Instead of executing dump the curl command line ' 996 | 'arguments for this call') 997 | parser.add_argument('--dump-response', help='Instead of writing the response ' 998 | 'to stdout, write the response into a file instead') 999 | parser.add_argument('--json-stream', action='store_true', 1000 | default=False, 1001 | help='Assumes that the response from the server is a JSON ' 1002 | 'response stream and colorizes each element ' 1003 | 'individually and skips past empty chunks.') 1004 | 1005 | try: 1006 | args, extra_args = parser.parse_known_args() 1007 | except Exception, e: 1008 | print e 1009 | sys.exit(1) 1010 | 1011 | if args.help: 1012 | parser.print_help() 1013 | print __doc__.rstrip() 1014 | return 1015 | 1016 | # Custom commands 1017 | if args.clear_token_cache: 1018 | clear_token_cache(args.site) 1019 | return 1020 | if args.init_config: 1021 | init_config() 1022 | return 1023 | if args.add_site: 1024 | add_site(args.add_site) 1025 | return 1026 | if args.remove_site: 1027 | remove_site(args.remove_site) 1028 | return 1029 | if args.list_sites: 1030 | list_sites() 1031 | return 1032 | if args.clear_cookies: 1033 | clear_cookies(args.site) 1034 | return 1035 | 1036 | # Redirect everything else to curl via the site 1037 | url_arg = find_url_arg(extra_args) 1038 | if url_arg is None: 1039 | parser.print_usage() 1040 | return 1041 | site = get_site(args.site, extra_args[url_arg]) 1042 | if site is not None and site.grant_type is not None: 1043 | site.fetch_token_if_necessarys() 1044 | settings.save() 1045 | invoke_curl(site, settings.values['curl_path'], extra_args, url_arg, 1046 | dump_args=args.dump_curl_args, 1047 | dump_response=args.dump_response, 1048 | json_stream=args.json_stream) 1049 | 1050 | 1051 | if __name__ == '__main__': 1052 | try: 1053 | main() 1054 | except KeyboardInterrupt: 1055 | pass 1056 | -------------------------------------------------------------------------------- /docs/Makefile: -------------------------------------------------------------------------------- 1 | # Makefile for Sphinx documentation 2 | # 3 | 4 | # You can set these variables from the command line. 5 | SPHINXOPTS = 6 | SPHINXBUILD = sphinx-build 7 | PAPER = 8 | BUILDDIR = _build 9 | 10 | # Internal variables. 11 | PAPEROPT_a4 = -D latex_paper_size=a4 12 | PAPEROPT_letter = -D latex_paper_size=letter 13 | ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . 14 | # the i18n builder cannot share the environment and doctrees with the others 15 | I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . 
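# Example (a sketch): the variables above can be overridden per invocation
# from the command line, e.g. a different sphinx-build binary and A4 paper
# for the LaTeX targets (the binary path is only a placeholder):
#
#   make latexpdf SPHINXBUILD=/usr/local/bin/sphinx-build PAPER=a4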
16 | 17 | .PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext 18 | 19 | help: 20 | @echo "Please use \`make ' where is one of" 21 | @echo " html to make standalone HTML files" 22 | @echo " dirhtml to make HTML files named index.html in directories" 23 | @echo " singlehtml to make a single large HTML file" 24 | @echo " pickle to make pickle files" 25 | @echo " json to make JSON files" 26 | @echo " htmlhelp to make HTML files and a HTML help project" 27 | @echo " qthelp to make HTML files and a qthelp project" 28 | @echo " devhelp to make HTML files and a Devhelp project" 29 | @echo " epub to make an epub" 30 | @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" 31 | @echo " latexpdf to make LaTeX files and run them through pdflatex" 32 | @echo " text to make text files" 33 | @echo " man to make manual pages" 34 | @echo " texinfo to make Texinfo files" 35 | @echo " info to make Texinfo files and run them through makeinfo" 36 | @echo " gettext to make PO message catalogs" 37 | @echo " changes to make an overview of all changed/added/deprecated items" 38 | @echo " linkcheck to check all external links for integrity" 39 | @echo " doctest to run all doctests embedded in the documentation (if enabled)" 40 | 41 | clean: 42 | -rm -rf $(BUILDDIR)/* 43 | 44 | html: 45 | $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html 46 | @echo 47 | @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." 48 | 49 | dirhtml: 50 | $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml 51 | @echo 52 | @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." 53 | 54 | singlehtml: 55 | $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml 56 | @echo 57 | @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." 58 | 59 | pickle: 60 | $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle 61 | @echo 62 | @echo "Build finished; now you can process the pickle files." 63 | 64 | json: 65 | $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json 66 | @echo 67 | @echo "Build finished; now you can process the JSON files." 68 | 69 | htmlhelp: 70 | $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp 71 | @echo 72 | @echo "Build finished; now you can run HTML Help Workshop with the" \ 73 | ".hhp project file in $(BUILDDIR)/htmlhelp." 74 | 75 | qthelp: 76 | $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp 77 | @echo 78 | @echo "Build finished; now you can run "qcollectiongenerator" with the" \ 79 | ".qhcp project file in $(BUILDDIR)/qthelp, like this:" 80 | @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/Curlish.qhcp" 81 | @echo "To view the help file:" 82 | @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/Curlish.qhc" 83 | 84 | devhelp: 85 | $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp 86 | @echo 87 | @echo "Build finished." 88 | @echo "To view the help file:" 89 | @echo "# mkdir -p $$HOME/.local/share/devhelp/Curlish" 90 | @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/Curlish" 91 | @echo "# devhelp" 92 | 93 | epub: 94 | $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub 95 | @echo 96 | @echo "Build finished. The epub file is in $(BUILDDIR)/epub." 97 | 98 | latex: 99 | $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex 100 | @echo 101 | @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." 
102 | @echo "Run \`make' in that directory to run these through (pdf)latex" \ 103 | "(use \`make latexpdf' here to do that automatically)." 104 | 105 | latexpdf: 106 | $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex 107 | @echo "Running LaTeX files through pdflatex..." 108 | $(MAKE) -C $(BUILDDIR)/latex all-pdf 109 | @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." 110 | 111 | text: 112 | $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text 113 | @echo 114 | @echo "Build finished. The text files are in $(BUILDDIR)/text." 115 | 116 | man: 117 | $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man 118 | @echo 119 | @echo "Build finished. The manual pages are in $(BUILDDIR)/man." 120 | 121 | texinfo: 122 | $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo 123 | @echo 124 | @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo." 125 | @echo "Run \`make' in that directory to run these through makeinfo" \ 126 | "(use \`make info' here to do that automatically)." 127 | 128 | info: 129 | $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo 130 | @echo "Running Texinfo files through makeinfo..." 131 | make -C $(BUILDDIR)/texinfo info 132 | @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo." 133 | 134 | gettext: 135 | $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale 136 | @echo 137 | @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale." 138 | 139 | changes: 140 | $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes 141 | @echo 142 | @echo "The overview file is in $(BUILDDIR)/changes." 143 | 144 | linkcheck: 145 | $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck 146 | @echo 147 | @echo "Link check complete; look for any errors in the above output " \ 148 | "or in $(BUILDDIR)/linkcheck/output.txt." 149 | 150 | doctest: 151 | $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest 152 | @echo "Testing of doctests in the sources finished, look at the " \ 153 | "results in $(BUILDDIR)/doctest/output.txt." 154 | -------------------------------------------------------------------------------- /docs/_static/fireteam.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/fireteam/curlish/50ace8238b0f6eeb137842f8dd6f6c6d13bad3ab/docs/_static/fireteam.png -------------------------------------------------------------------------------- /docs/_static/screenshot.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/fireteam/curlish/50ace8238b0f6eeb137842f8dd6f6c6d13bad3ab/docs/_static/screenshot.png -------------------------------------------------------------------------------- /docs/_themes/.gitignore: -------------------------------------------------------------------------------- 1 | *.pyc 2 | *.pyo 3 | .DS_Store 4 | -------------------------------------------------------------------------------- /docs/_themes/curlish/layout.html: -------------------------------------------------------------------------------- 1 | {% extends "basic/layout.html" %} 2 | {% block header %} 3 | {{ super() }} 4 | {% if pagename == 'index' %} 5 |

6 | {% endif %} 7 | {% endblock %} 8 | {% block footer %} 9 | {% if pagename == 'index' %} 10 |
11 |
12 | Curlish was conjured by the fine folks from Fireteam and brought to you 14 | under the very liberal three clause BSD license. 15 |
16 | {% endif %} 17 | {% endblock %} 18 | {# do not display relbars #} 19 | {% block relbar1 %}{% endblock %} 20 | {% block relbar2 %} 21 | {% if theme_github_fork %} 22 | Fork me on GitHub 24 | {% endif %} 25 | {% endblock %} 26 | {% block sidebar1 %}{% endblock %} 27 | {% block sidebar2 %}{% endblock %} 28 | -------------------------------------------------------------------------------- /docs/_themes/curlish/static/MyriadPro-Light.otf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/fireteam/curlish/50ace8238b0f6eeb137842f8dd6f6c6d13bad3ab/docs/_themes/curlish/static/MyriadPro-Light.otf -------------------------------------------------------------------------------- /docs/_themes/curlish/static/curlish.css_t: -------------------------------------------------------------------------------- 1 | /* 2 | * curlish.css_t 3 | * ~~~~~~~~~~~~~ 4 | * 5 | * Sphinx stylesheet -- curlish theme based on flask ones. 6 | * 7 | * :copyright: Copyright 2007-2012 by the Sphinx team, see AUTHORS. 8 | * :license: BSD, see LICENSE for details. 9 | * 10 | */ 11 | 12 | @import url("basic.css"); 13 | @import url(http://fonts.googleapis.com/css?family=Lato:light,regular); 14 | 15 | /* -- page layout ----------------------------------------------------------- */ 16 | 17 | body { 18 | font-family: 'Ubuntu', sans-serif; 19 | font-weight: 300; 20 | font-size: 17px; 21 | color: #000; 22 | background: white; 23 | margin: 0; 24 | padding: 0; 25 | } 26 | 27 | div.documentwrapper { 28 | float: left; 29 | width: 100%; 30 | } 31 | 32 | div.bodywrapper { 33 | margin: 40px auto 0 auto; 34 | width: 700px; 35 | } 36 | 37 | hr { 38 | border: 1px solid #B1B4B6; 39 | } 40 | 41 | div.body { 42 | background-color: #ffffff; 43 | color: #3E4349; 44 | padding: 0 30px 30px 30px; 45 | } 46 | 47 | img.floatingflask { 48 | padding: 0 0 10px 10px; 49 | float: right; 50 | } 51 | 52 | div.footer { 53 | text-align: right; 54 | color: #888; 55 | padding: 10px; 56 | font-size: 14px; 57 | width: 650px; 58 | margin: 0 auto 40px auto; 59 | } 60 | 61 | div.footer a { 62 | color: #888; 63 | text-decoration: underline; 64 | } 65 | 66 | div.related { 67 | line-height: 32px; 68 | color: #888; 69 | } 70 | 71 | div.related ul { 72 | padding: 0 0 0 10px; 73 | } 74 | 75 | div.related a { 76 | color: #444; 77 | } 78 | 79 | /* -- body styles ----------------------------------------------------------- */ 80 | 81 | a { 82 | color: #215974; 83 | text-decoration: underline; 84 | } 85 | 86 | a:hover { 87 | color: #888; 88 | text-decoration: underline; 89 | } 90 | 91 | div.body { 92 | padding-bottom: 40px; /* saved for footer */ 93 | } 94 | 95 | div.body h1, 96 | div.body h2, 97 | div.body h3, 98 | div.body h4, 99 | div.body h5, 100 | div.body h6 { 101 | font-family: 'Lato', sans-serif; 102 | font-weight: 300; 103 | margin: 30px 0px 10px 0px; 104 | padding: 0; 105 | color: black; 106 | } 107 | 108 | div.indexwrapper h1 { 109 | font-weight: 60; 110 | font-size: 50px; 111 | margin: 0 0 10px -20px; 112 | } 113 | 114 | div.body h2 { font-size: 180%; } 115 | div.body h3 { font-size: 150%; } 116 | div.body h4 { font-size: 130%; } 117 | div.body h5 { font-size: 100%; } 118 | div.body h6 { font-size: 100%; } 119 | 120 | a.headerlink { 121 | color: white; 122 | padding: 0 4px; 123 | text-decoration: none; 124 | } 125 | 126 | a.headerlink:hover { 127 | color: #444; 128 | background: #eaeaea; 129 | } 130 | 131 | div.body p, div.body dd, div.body li { 132 | line-height: 1.4em; 133 | } 134 | 135 | 
div.admonition { 136 | background: #fafafa; 137 | margin: 20px -30px; 138 | padding: 10px 30px; 139 | border-top: 1px solid #ccc; 140 | border-bottom: 1px solid #ccc; 141 | } 142 | 143 | div.admonition p.admonition-title { 144 | font-family: 'Garamond', 'Georgia', serif; 145 | font-weight: normal; 146 | font-size: 24px; 147 | margin: 0 0 10px 0; 148 | padding: 0; 149 | line-height: 1; 150 | } 151 | 152 | div.admonition p.last { 153 | margin-bottom: 0; 154 | } 155 | 156 | div.highlight{ 157 | background-color: white; 158 | } 159 | 160 | dt:target, .highlight { 161 | background: #FAF3E8; 162 | } 163 | 164 | div.note { 165 | background-color: #eee; 166 | border: 1px solid #ccc; 167 | } 168 | 169 | div.seealso { 170 | background-color: #ffc; 171 | border: 1px solid #ff6; 172 | } 173 | 174 | div.topic { 175 | background-color: #eee; 176 | } 177 | 178 | div.warning { 179 | background-color: #ffe4e4; 180 | border: 1px solid #f66; 181 | } 182 | 183 | p.admonition-title { 184 | display: inline; 185 | } 186 | 187 | p.admonition-title:after { 188 | content: ":"; 189 | } 190 | 191 | pre, tt { 192 | font-family: 'Consolas', 'Menlo', 'Deja Vu Sans Mono', 'Bitstream Vera Sans Mono', monospace; 193 | font-size: 0.85em; 194 | } 195 | 196 | img.screenshot { 197 | } 198 | 199 | tt.descname, tt.descclassname { 200 | font-size: 0.95em; 201 | } 202 | 203 | tt.descname { 204 | padding-right: 0.08em; 205 | } 206 | 207 | img.screenshot { 208 | -moz-box-shadow: 2px 2px 4px #eee; 209 | -webkit-box-shadow: 2px 2px 4px #eee; 210 | box-shadow: 2px 2px 4px #eee; 211 | } 212 | 213 | table.docutils { 214 | border: 1px solid #888; 215 | -moz-box-shadow: 2px 2px 4px #eee; 216 | -webkit-box-shadow: 2px 2px 4px #eee; 217 | box-shadow: 2px 2px 4px #eee; 218 | } 219 | 220 | table.docutils td, table.docutils th { 221 | border: 1px solid #888; 222 | padding: 0.25em 0.7em; 223 | } 224 | 225 | table.field-list, table.footnote { 226 | border: none; 227 | -moz-box-shadow: none; 228 | -webkit-box-shadow: none; 229 | box-shadow: none; 230 | } 231 | 232 | table.footnote { 233 | margin: 15px 0; 234 | width: 100%; 235 | border: 1px solid #eee; 236 | } 237 | 238 | table.field-list th { 239 | padding: 0 0.8em 0 0; 240 | } 241 | 242 | table.field-list td { 243 | padding: 0; 244 | } 245 | 246 | table.footnote td { 247 | padding: 0.5em; 248 | } 249 | 250 | dl { 251 | margin: 0; 252 | padding: 0; 253 | } 254 | 255 | dl dd { 256 | margin-left: 30px; 257 | } 258 | 259 | pre { 260 | padding: 0; 261 | margin: 15px -30px; 262 | padding: 8px; 263 | line-height: 1.3em; 264 | padding: 7px 30px; 265 | background: #eee; 266 | border-radius: 2px; 267 | -moz-border-radius: 2px; 268 | -webkit-border-radius: 2px; 269 | } 270 | 271 | dl pre { 272 | margin-left: -60px; 273 | padding-left: 60px; 274 | } 275 | 276 | tt { 277 | background-color: #ecf0f3; 278 | color: #222; 279 | /* padding: 1px 2px; */ 280 | } 281 | 282 | tt.xref, a tt { 283 | background-color: #FBFBFB; 284 | } 285 | 286 | a:hover tt { 287 | background: #EEE; 288 | } 289 | 290 | div.behindthis { 291 | margin: 0 auto; 292 | padding: 0 0 60px 0; 293 | width: 500px; 294 | color: #555; 295 | text-align: center; 296 | font-size: 15px; 297 | background: url(fireteam.png) center 60px no-repeat; 298 | height: 170px; 299 | } 300 | -------------------------------------------------------------------------------- /docs/_themes/curlish/theme.conf: -------------------------------------------------------------------------------- 1 | [theme] 2 | inherit = basic 3 | stylesheet = curlish.css 4 | nosidebar = true 5 
| pygments_style = tango 6 | 7 | [options] 8 | github_fork = fireteam/curlish 9 | -------------------------------------------------------------------------------- /docs/conf.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # 3 | # Curlish documentation build configuration file, created by 4 | # sphinx-quickstart on Sat Mar 31 13:08:00 2012. 5 | # 6 | # This file is execfile()d with the current directory set to its containing dir. 7 | # 8 | # Note that not all possible configuration values are present in this 9 | # autogenerated file. 10 | # 11 | # All configuration values have a default; values that are commented out 12 | # serve to show the default. 13 | 14 | import sys, os 15 | 16 | # If extensions (or modules to document with autodoc) are in another directory, 17 | # add these directories to sys.path here. If the directory is relative to the 18 | # documentation root, use os.path.abspath to make it absolute, like shown here. 19 | #sys.path.insert(0, os.path.abspath('.')) 20 | 21 | # -- General configuration ----------------------------------------------------- 22 | 23 | # If your documentation needs a minimal Sphinx version, state it here. 24 | #needs_sphinx = '1.0' 25 | 26 | # Add any Sphinx extension module names here, as strings. They can be extensions 27 | # coming with Sphinx (named 'sphinx.ext.*') or your custom ones. 28 | extensions = [] 29 | 30 | # Add any paths that contain templates here, relative to this directory. 31 | templates_path = ['_templates'] 32 | 33 | # The suffix of source filenames. 34 | source_suffix = '.rst' 35 | 36 | # The encoding of source files. 37 | #source_encoding = 'utf-8-sig' 38 | 39 | # The master toctree document. 40 | master_doc = 'index' 41 | 42 | # General information about the project. 43 | project = u'Curlish' 44 | copyright = u'2012, Fireteam Ltd.' 45 | 46 | # The version info for the project you're documenting, acts as replacement for 47 | # |version| and |release|, also used in various other places throughout the 48 | # built documents. 49 | # 50 | # The short X.Y version. 51 | import subprocess 52 | version = subprocess.Popen([sys.executable, 'setup.py', '--version'], 53 | cwd='..', stdout=subprocess.PIPE).communicate()[0].strip() 54 | # The full version, including alpha/beta/rc tags. 55 | release = version 56 | 57 | # The language for content autogenerated by Sphinx. Refer to documentation 58 | # for a list of supported languages. 59 | #language = None 60 | 61 | # There are two options for replacing |today|: either, you set today to some 62 | # non-false value, then it is used: 63 | #today = '' 64 | # Else, today_fmt is used as the format for a strftime call. 65 | #today_fmt = '%B %d, %Y' 66 | 67 | # List of patterns, relative to source directory, that match files and 68 | # directories to ignore when looking for source files. 69 | exclude_patterns = ['_build'] 70 | 71 | # The reST default role (used for this markup: `text`) to use for all documents. 72 | #default_role = None 73 | 74 | # If true, '()' will be appended to :func: etc. cross-reference text. 75 | #add_function_parentheses = True 76 | 77 | # If true, the current module name will be prepended to all description 78 | # unit titles (such as .. function::). 79 | #add_module_names = True 80 | 81 | # If true, sectionauthor and moduleauthor directives will be shown in the 82 | # output. They are ignored by default. 83 | #show_authors = False 84 | 85 | # The name of the Pygments (syntax highlighting) style to use. 
86 | pygments_style = 'sphinx' 87 | 88 | # A list of ignored prefixes for module index sorting. 89 | #modindex_common_prefix = [] 90 | 91 | 92 | # -- Options for HTML output --------------------------------------------------- 93 | 94 | # The theme to use for HTML and HTML Help pages. See the documentation for 95 | # a list of builtin themes. 96 | html_theme = 'curlish' 97 | 98 | # Theme options are theme-specific and customize the look and feel of a theme 99 | # further. For a list of options available for each theme, see the 100 | # documentation. 101 | #html_theme_options = {} 102 | 103 | # Add any paths that contain custom themes here, relative to this directory. 104 | html_theme_path = ['_themes'] 105 | 106 | # The name for this set of Sphinx documents. If None, it defaults to 107 | # " v documentation". 108 | #html_title = None 109 | 110 | # A shorter title for the navigation bar. Default is the same as html_title. 111 | #html_short_title = None 112 | 113 | # The name of an image file (relative to this directory) to place at the top 114 | # of the sidebar. 115 | #html_logo = None 116 | 117 | # The name of an image file (within the static path) to use as favicon of the 118 | # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 119 | # pixels large. 120 | #html_favicon = None 121 | 122 | # Add any paths that contain custom static files (such as style sheets) here, 123 | # relative to this directory. They are copied after the builtin static files, 124 | # so a file named "default.css" will overwrite the builtin "default.css". 125 | html_static_path = ['_static'] 126 | 127 | # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, 128 | # using the given strftime format. 129 | #html_last_updated_fmt = '%b %d, %Y' 130 | 131 | # If true, SmartyPants will be used to convert quotes and dashes to 132 | # typographically correct entities. 133 | #html_use_smartypants = True 134 | 135 | # Custom sidebar templates, maps document names to template names. 136 | #html_sidebars = {} 137 | 138 | # Additional templates that should be rendered to pages, maps page names to 139 | # template names. 140 | #html_additional_pages = {} 141 | 142 | # If false, no module index is generated. 143 | #html_domain_indices = True 144 | 145 | # If false, no index is generated. 146 | #html_use_index = True 147 | 148 | # If true, the index is split into individual pages for each letter. 149 | #html_split_index = False 150 | 151 | # If true, links to the reST sources are added to the pages. 152 | #html_show_sourcelink = True 153 | 154 | # If true, "Created using Sphinx" is shown in the HTML footer. Default is True. 155 | #html_show_sphinx = True 156 | 157 | # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. 158 | #html_show_copyright = True 159 | 160 | # If true, an OpenSearch description file will be output, and all pages will 161 | # contain a tag referring to it. The value of this option must be the 162 | # base URL from which the finished HTML is served. 163 | #html_use_opensearch = '' 164 | 165 | # This is the file name suffix for HTML files (e.g. ".xhtml"). 166 | #html_file_suffix = None 167 | 168 | # Output file base name for HTML help builder. 169 | htmlhelp_basename = 'Curlishdoc' 170 | 171 | 172 | # -- Options for LaTeX output -------------------------------------------------- 173 | 174 | latex_elements = { 175 | # The paper size ('letterpaper' or 'a4paper'). 
176 | #'papersize': 'letterpaper', 177 | 178 | # The font size ('10pt', '11pt' or '12pt'). 179 | #'pointsize': '10pt', 180 | 181 | # Additional stuff for the LaTeX preamble. 182 | #'preamble': '', 183 | } 184 | 185 | # Grouping the document tree into LaTeX files. List of tuples 186 | # (source start file, target name, title, author, documentclass [howto/manual]). 187 | latex_documents = [ 188 | ('index', 'Curlish.tex', u'Curlish Documentation', 189 | u'Fireteam Ltd.', 'manual'), 190 | ] 191 | 192 | # The name of an image file (relative to this directory) to place at the top of 193 | # the title page. 194 | #latex_logo = None 195 | 196 | # For "manual" documents, if this is true, then toplevel headings are parts, 197 | # not chapters. 198 | #latex_use_parts = False 199 | 200 | # If true, show page references after internal links. 201 | #latex_show_pagerefs = False 202 | 203 | # If true, show URL addresses after external links. 204 | #latex_show_urls = False 205 | 206 | # Documents to append as an appendix to all manuals. 207 | #latex_appendices = [] 208 | 209 | # If false, no module index is generated. 210 | #latex_domain_indices = True 211 | 212 | 213 | # -- Options for manual page output -------------------------------------------- 214 | 215 | # One entry per manual page. List of tuples 216 | # (source start file, name, description, authors, manual section). 217 | man_pages = [ 218 | ('index', 'curlish', u'Curlish Documentation', 219 | [u'Fireteam Ltd.'], 1) 220 | ] 221 | 222 | # If true, show URL addresses after external links. 223 | #man_show_urls = False 224 | 225 | 226 | # -- Options for Texinfo output ------------------------------------------------ 227 | 228 | # Grouping the document tree into Texinfo files. List of tuples 229 | # (source start file, target name, title, author, 230 | # dir menu entry, description, category) 231 | texinfo_documents = [ 232 | ('index', 'Curlish', u'Curlish Documentation', 233 | u'Fireteam Ltd.', 'Curlish', 'A wrapper for curl that adds OAuth support.', 234 | 'Miscellaneous'), 235 | ] 236 | 237 | # Documents to append as an appendix to all manuals. 238 | #texinfo_appendices = [] 239 | 240 | # If false, no module index is generated. 241 | #texinfo_domain_indices = True 242 | 243 | # How to display URL addresses: 'footnote', 'no', or 'inline'. 244 | #texinfo_show_urls = 'footnote' 245 | -------------------------------------------------------------------------------- /docs/index.rst: -------------------------------------------------------------------------------- 1 | Curlish 2 | ======= 3 | 4 | **curl with flames on top** 5 | 6 | Ever had to speak to an OAuth 2.0 protected resource for debugging 7 | purposes? curl is a nice tool, but it lacks helpers for dealing 8 | with OAuth. 9 | 10 | curlish comes to the rescue. It remembers access tokens for 11 | you and injects them into requests. Facebook comes preconfigured so you can 12 | start using it right away. 13 | 14 | How it Looks 15 | ------------ 16 | 17 | .. image:: _static/screenshot.png 18 | :alt: Curlish in action 19 | 20 | Installation 21 | ------------ 22 | 23 | Curlish is a small script written in Python with no dependencies 24 | beyond what ships with Python 2.7. If you are running an older 25 | Python version you will also need to install simplejson. 26 | 27 | Quick installation:: 28 | 29 | $ curl -L http://bit.ly/curlish | bash 30 | 31 | This will download the current version of curlish and put it into 32 | ``~/.bin``. Make sure that directory is on your ``PATH``. 
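If ``~/.bin`` is not already on your ``PATH``, a one-liner along these lines adds it and reloads your shell configuration (a minimal sketch assuming a bash shell; the installer prints the equivalent line for the directory it actually used)::

    $ echo 'export PATH="$PATH:$HOME/.bin"' >> ~/.bashrc
    $ source ~/.bashrc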
33 | 34 | Basic Usage 35 | ----------- 36 | 37 | Out of the box curlish forwards all arguments to the ``curl`` executable, 38 | except for the ones that control curlish itself. The result from curl is 39 | processed and nicely colorized if your terminal supports it. We also 40 | reindent JSON for you so that it's more readable. 41 | 42 | To get more out of it you need to register a site with it. This teaches 43 | curlish how to do OAuth for a specific API. By default Facebook is 44 | already preconfigured for you. 45 | 46 | Just use ``curlish`` as if it were ``curl`` and enjoy. 47 | 48 | Configuration 49 | ------------- 50 | 51 | To add a site you can either use ``--add-site NAME`` or just edit the 52 | ``~/.ftcurlish.json`` file. You will find that it looks something like 53 | this: 54 | 55 | .. sourcecode:: javascript 56 | 57 | { 58 | "http_port": 62231, 59 | "sites": { 60 | "facebook": { 61 | "grant_type": "authorization_code", 62 | "extra_headers": {}, 63 | "request_token_params": { 64 | "scope": "email" 65 | }, 66 | "authorize_url": "https://www.facebook.com/dialog/oauth", 67 | "base_url": "https://graph.facebook.com/", 68 | "client_id": "384088028278656", 69 | "client_secret": "14c75a494cda2e11e8760095ec972915", 70 | "access_token_url": "/oauth/access_token" 71 | } 72 | } 73 | } 74 | 75 | These values are all copy/pasted from the application configuration page 76 | on Facebook. Adjust them to whatever website you want to talk to. Some 77 | important keys and values: 78 | 79 | ``grant_type`` 80 | The type of grant that the API supports. The default is 81 | ``authorization_code`` which means that a browser based flow is used. 82 | This is the most common one. For some services you can switch to the 83 | ``password`` grant which means that we will prompt you for username 84 | and password and exchange that information for an authorization token. 85 | 86 | Note that very few services support password-based logins. 87 | 88 | You can also set the grant to ``null`` in which case the OAuth 89 | features are disabled. This is useful if you want to speak 90 | to APIs protected with other schemes. In that case only the 91 | ``extra_headers`` and ``base_url`` parameters are really used. 92 | 93 | ``extra_headers`` 94 | A dictionary of extra headers that are sent with **all** HTTP 95 | requests to the service. You can use this to send custom 96 | authorization headers or similar things. 97 | 98 | ``request_token_params`` 99 | Extra parameters sent with the authorization request. For instance 100 | you can set the ``scope`` for the token here. 101 | 102 | ``base_url`` 103 | The base URL of the API. This site is automatically selected for all 104 | requests that start with this base URL. It's also the base URL for 105 | ``access_token_url`` and ``authorize_url`` if those are not absolute. 106 | 107 | ``authorize_url`` 108 | The authorization URL for the ``authorization_code`` flow. 109 | 110 | ``client_id`` 111 | The client ID from the application configuration. 112 | 113 | ``client_secret`` 114 | The client secret from the application configuration. 115 | 116 | ``access_token_url`` 117 | The URL of the token endpoint where access tokens are obtained. 118 | 119 | Browser Based Flow 120 | ------------------ 121 | 122 | Curlish by default opens an HTTP server on ``127.0.0.1:62231`` that 123 | handles exactly one request: the response from the authorization 124 | dialog. If you need to register an application, make sure the redirect 125 | URI is ``http://127.0.0.1:62231``. If you cannot use that port for 126 | whatever reason you can change it in the config, as shown below.
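Only the top-level ``http_port`` key in ``~/.ftcurlish.json`` needs to change for that (a minimal sketch; the port number is just an example, and the redirect URI registered with the provider has to be updated to match):

.. sourcecode:: javascript

    {
        "http_port": 8000,
        "sites": { ... }
    }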
127 | 128 | Clearing Tokens 129 | --------------- 130 | 131 | Because detecting stale tokens is specific to each individual service, 132 | curlish does not attempt to detect expired tokens. As such, if the API 133 | notifies you that a token has expired you need to remove it 134 | from the token cache:: 135 | 136 | $ curlish --clear-token-cache --site facebook 137 | 138 | If you don't specify a site, all cached tokens are removed. 139 | 140 | Common Curl Arguments 141 | --------------------- 142 | 143 | ``-v`` 144 | Enables verbose mode. 145 | 146 | ``-i`` 147 | Prints response headers. 148 | 149 | ``-X METHOD`` 150 | Specifies the HTTP method instead of letting curl pick it automatically. For 151 | known HTTP methods you can also leave off the ``-X`` prefix as a 152 | ``curlish``-specific feature. 153 | 154 | ``-H "Header: value"`` 155 | Emits a request header with a specific value. 156 | 157 | ``-d "key=value"`` 158 | Emits a pair of form data. 159 | 160 | Curl Extension Arguments 161 | ------------------------ 162 | 163 | In addition to the curl arguments, ``curlish`` supports a few other ones as 164 | shortcuts for common tasks: 165 | 166 | ``-Jkey=value`` 167 | Sends a JSON string value for the given key. If the key is empty 168 | the whole JSON body will just be that string 169 | value. The key can be in dotted notation to construct nested objects. 170 | See below. 171 | 172 | ``-Jkey:=value`` 173 | Like ``-Jkey=value`` but the value part has to be raw JSON -- no 174 | conversion to string takes place. You can use this to send integers 175 | and boolean values. 176 | 177 | ``-J@filename`` 178 | Sends a file as JSON body to the server. 179 | 180 | ``-Jkey=@filename`` 181 | Sends a JSON body where a key is loaded from a JSON file. 182 | 183 | ``--ajax`` 184 | Sends an ``X-Requested-With: XMLHttpRequest`` header. 185 | 186 | ``GET``, ``POST``, etc. 187 | If it's one of the common HTTP methods the ``-X`` prefix is implicit. 188 | 189 | ``--cookies`` 190 | Enables simple cookie handling for this request. See :ref:`cookies` 191 | for more information. 192 | 193 | ``--hide-jsonp`` 194 | If curlish detects a JSONP response it will by default keep the 195 | wrapper function call around. If this is set it will appear as if it 196 | were a regular JSON response. 197 | 198 | Sending JSON Objects 199 | -------------------- 200 | 201 | Since dotted notation is supported you can send complex JSON objects 202 | and arrays: write the key in dotted notation and curlish figures 203 | out the rest:: 204 | 205 | curlish -Jfoo.int:=1 -Jfoo.string=42 206 | 207 | This results in the following JSON data: 208 | 209 | .. sourcecode:: javascript 210 | 211 | { 212 | "foo": { 213 | "int": 1, 214 | "string": "42" 215 | } 216 | } 217 | 218 | .. _cookies: 219 | 220 | Automatic Cookie Management 221 | --------------------------- 222 | 223 | Curlish also simplifies cookie handling compared to plain old curl. If 224 | you pass ``--cookies`` to curlish it will create a file in 225 | ``~/.ftcurlish-cookies`` for each site which stores the cookies in plain 226 | text. To delete those cookies again you can either delete that file by 227 | hand or pass ``--clear-cookies`` to curlish. 
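For example, a cookie-backed request against the preconfigured Facebook site followed by a cleanup could look like this (the URL is just the example used throughout these docs)::

    $ curlish --cookies https://graph.facebook.com/me
    $ curlish --clear-cookies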
228 | -------------------------------------------------------------------------------- /docs/make.bat: -------------------------------------------------------------------------------- 1 | @ECHO OFF 2 | 3 | REM Command file for Sphinx documentation 4 | 5 | if "%SPHINXBUILD%" == "" ( 6 | set SPHINXBUILD=sphinx-build 7 | ) 8 | set BUILDDIR=_build 9 | set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% . 10 | set I18NSPHINXOPTS=%SPHINXOPTS% . 11 | if NOT "%PAPER%" == "" ( 12 | set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% 13 | set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS% 14 | ) 15 | 16 | if "%1" == "" goto help 17 | 18 | if "%1" == "help" ( 19 | :help 20 | echo.Please use `make ^` where ^ is one of 21 | echo. html to make standalone HTML files 22 | echo. dirhtml to make HTML files named index.html in directories 23 | echo. singlehtml to make a single large HTML file 24 | echo. pickle to make pickle files 25 | echo. json to make JSON files 26 | echo. htmlhelp to make HTML files and a HTML help project 27 | echo. qthelp to make HTML files and a qthelp project 28 | echo. devhelp to make HTML files and a Devhelp project 29 | echo. epub to make an epub 30 | echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter 31 | echo. text to make text files 32 | echo. man to make manual pages 33 | echo. texinfo to make Texinfo files 34 | echo. gettext to make PO message catalogs 35 | echo. changes to make an overview over all changed/added/deprecated items 36 | echo. linkcheck to check all external links for integrity 37 | echo. doctest to run all doctests embedded in the documentation if enabled 38 | goto end 39 | ) 40 | 41 | if "%1" == "clean" ( 42 | for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i 43 | del /q /s %BUILDDIR%\* 44 | goto end 45 | ) 46 | 47 | if "%1" == "html" ( 48 | %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html 49 | if errorlevel 1 exit /b 1 50 | echo. 51 | echo.Build finished. The HTML pages are in %BUILDDIR%/html. 52 | goto end 53 | ) 54 | 55 | if "%1" == "dirhtml" ( 56 | %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml 57 | if errorlevel 1 exit /b 1 58 | echo. 59 | echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. 60 | goto end 61 | ) 62 | 63 | if "%1" == "singlehtml" ( 64 | %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml 65 | if errorlevel 1 exit /b 1 66 | echo. 67 | echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml. 68 | goto end 69 | ) 70 | 71 | if "%1" == "pickle" ( 72 | %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle 73 | if errorlevel 1 exit /b 1 74 | echo. 75 | echo.Build finished; now you can process the pickle files. 76 | goto end 77 | ) 78 | 79 | if "%1" == "json" ( 80 | %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json 81 | if errorlevel 1 exit /b 1 82 | echo. 83 | echo.Build finished; now you can process the JSON files. 84 | goto end 85 | ) 86 | 87 | if "%1" == "htmlhelp" ( 88 | %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp 89 | if errorlevel 1 exit /b 1 90 | echo. 91 | echo.Build finished; now you can run HTML Help Workshop with the ^ 92 | .hhp project file in %BUILDDIR%/htmlhelp. 93 | goto end 94 | ) 95 | 96 | if "%1" == "qthelp" ( 97 | %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp 98 | if errorlevel 1 exit /b 1 99 | echo. 
100 | echo.Build finished; now you can run "qcollectiongenerator" with the ^ 101 | .qhcp project file in %BUILDDIR%/qthelp, like this: 102 | echo.^> qcollectiongenerator %BUILDDIR%\qthelp\Curlish.qhcp 103 | echo.To view the help file: 104 | echo.^> assistant -collectionFile %BUILDDIR%\qthelp\Curlish.ghc 105 | goto end 106 | ) 107 | 108 | if "%1" == "devhelp" ( 109 | %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp 110 | if errorlevel 1 exit /b 1 111 | echo. 112 | echo.Build finished. 113 | goto end 114 | ) 115 | 116 | if "%1" == "epub" ( 117 | %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub 118 | if errorlevel 1 exit /b 1 119 | echo. 120 | echo.Build finished. The epub file is in %BUILDDIR%/epub. 121 | goto end 122 | ) 123 | 124 | if "%1" == "latex" ( 125 | %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex 126 | if errorlevel 1 exit /b 1 127 | echo. 128 | echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. 129 | goto end 130 | ) 131 | 132 | if "%1" == "text" ( 133 | %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text 134 | if errorlevel 1 exit /b 1 135 | echo. 136 | echo.Build finished. The text files are in %BUILDDIR%/text. 137 | goto end 138 | ) 139 | 140 | if "%1" == "man" ( 141 | %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man 142 | if errorlevel 1 exit /b 1 143 | echo. 144 | echo.Build finished. The manual pages are in %BUILDDIR%/man. 145 | goto end 146 | ) 147 | 148 | if "%1" == "texinfo" ( 149 | %SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo 150 | if errorlevel 1 exit /b 1 151 | echo. 152 | echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo. 153 | goto end 154 | ) 155 | 156 | if "%1" == "gettext" ( 157 | %SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale 158 | if errorlevel 1 exit /b 1 159 | echo. 160 | echo.Build finished. The message catalogs are in %BUILDDIR%/locale. 161 | goto end 162 | ) 163 | 164 | if "%1" == "changes" ( 165 | %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes 166 | if errorlevel 1 exit /b 1 167 | echo. 168 | echo.The overview file is in %BUILDDIR%/changes. 169 | goto end 170 | ) 171 | 172 | if "%1" == "linkcheck" ( 173 | %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck 174 | if errorlevel 1 exit /b 1 175 | echo. 176 | echo.Link check complete; look for any errors in the above output ^ 177 | or in %BUILDDIR%/linkcheck/output.txt. 178 | goto end 179 | ) 180 | 181 | if "%1" == "doctest" ( 182 | %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest 183 | if errorlevel 1 exit /b 1 184 | echo. 185 | echo.Testing of doctests in the sources finished, look at the ^ 186 | results in %BUILDDIR%/doctest/output.txt. 187 | goto end 188 | ) 189 | 190 | :end 191 | -------------------------------------------------------------------------------- /install.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | BIN_DIR=`dirname ~/.bin/dummy`; 4 | if [ -n "$1" ]; then 5 | BIN_DIR="$1"; 6 | fi 7 | 8 | CURL=curl 9 | 10 | if [ -z $(which "$CURL") ]; then 11 | echo 'error: curl required to use curlish. Please install it first.' 12 | exit 1 13 | fi 14 | 15 | echo 'Downloading curlish...' 
16 | mkdir -p "$BIN_DIR" 17 | curl -s https://raw.github.com/fireteam/curlish/master/curlish.py > "$BIN_DIR/curlish" 18 | chmod +x "$BIN_DIR/curlish" 19 | echo 20 | echo "Curlish installed successfully to $BIN_DIR/curlish" 21 | echo "Add $BIN_DIR to your PATH if you haven't done so already:" 22 | echo 23 | echo -n $' echo \'export PATH="$PATH:'; 24 | echo "$BIN_DIR\"' >> ~/.bashrc"; 25 | -------------------------------------------------------------------------------- /setup.cfg: -------------------------------------------------------------------------------- 1 | [upload_docs] 2 | upload-dir = docs/_build/html 3 | -------------------------------------------------------------------------------- /setup.py: -------------------------------------------------------------------------------- 1 | import os 2 | from setuptools import setup 3 | 4 | readme = open(os.path.join(os.path.dirname(__file__), 'README'), 'r').read() 5 | 6 | setup( 7 | name='curlish', 8 | author='Fireteam Ltd.', 9 | author_email='support@fireteam.net', 10 | version='1.22', 11 | url='http://github.com/fireteam/curlish', 12 | py_modules=['curlish'], 13 | description='A wrapper for curl that adds OAuth support', 14 | long_description=readme, 15 | entry_points={ 16 | 'console_scripts': [ 17 | 'curlish = curlish:main' 18 | ] 19 | }, 20 | zip_safe=False, 21 | classifiers=[ 22 | 'License :: OSI Approved :: BSD License', 23 | 'Programming Language :: Python' 24 | ] 25 | ) 26 | -------------------------------------------------------------------------------- /setup_freeze.py: -------------------------------------------------------------------------------- 1 | import os 2 | from cx_Freeze import setup, Executable 3 | 4 | 5 | build_exe_options = { 6 | 'include_files': ['curl.exe'] 7 | } 8 | 9 | 10 | setup(name="curlish", 11 | version="1.22", 12 | description="Curlish", 13 | options={"build_exe": build_exe_options}, 14 | executables=[Executable("curlish.py")]) 15 | --------------------------------------------------------------------------------
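A note on the freeze setup: ``setup_freeze.py`` produces the standalone Windows build with ``curl.exe`` bundled alongside it (that is what the ``include_files`` option does). A minimal sketch of the usual invocation, assuming cx_Freeze is installed and a ``curl.exe`` sits next to the script:

    $ python setup_freeze.py build

The frozen executable then typically ends up in a ``build/exe.<platform>-<pyversion>`` directory created by cx_Freeze.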