├── src
│   ├── main
│   │   ├── webapp
│   │   │   ├── assets
│   │   │   │   ├── img
│   │   │   │   │   ├── flash.png
│   │   │   │   │   ├── up-arrow.png
│   │   │   │   │   ├── glyphicons-halflings.png
│   │   │   │   │   └── glyphicons-halflings-white.png
│   │   │   │   ├── js
│   │   │   │   │   └── custom.js
│   │   │   │   └── css
│   │   │   │       ├── github.css
│   │   │   │       └── custom.less
│   │   │   ├── crontab
│   │   │   │   ├── create.vm
│   │   │   │   ├── update.vm
│   │   │   │   ├── list.vm
│   │   │   │   ├── info.vm
│   │   │   │   └── edit_form.vm
│   │   │   ├── schema
│   │   │   │   ├── dbs.vm
│   │   │   │   ├── db.vm
│   │   │   │   └── table.vm
│   │   │   ├── index.vm
│   │   │   ├── query
│   │   │   │   ├── result.vm
│   │   │   │   ├── create.vm
│   │   │   │   ├── info.vm
│   │   │   │   └── list.vm
│   │   │   ├── user
│   │   │   │   └── set.vm
│   │   │   ├── WEB-INF
│   │   │   │   ├── velocity.properties
│   │   │   │   └── web.xml
│   │   │   ├── diagnostics
│   │   │   │   └── info.vm
│   │   │   └── layout
│   │   │       ├── VM_global_library.vm
│   │   │       └── Default.vm
│   │   ├── java
│   │   │   └── org
│   │   │       └── apache
│   │   │           └── hadoop
│   │   │               └── hive
│   │   │                   └── hwi
│   │   │                       ├── servlet
│   │   │                       │   ├── RIndex.java
│   │   │                       │   ├── RDiagnostics.java
│   │   │                       │   ├── RUser.java
│   │   │                       │   ├── VelocityViewProcessor.java
│   │   │                       │   ├── RBase.java
│   │   │                       │   ├── RSchema.java
│   │   │                       │   ├── Velocity.java
│   │   │                       │   ├── RQuery.java
│   │   │                       │   └── RCrontab.java
│   │   │                       ├── util
│   │   │                       │   ├── APIResult.java
│   │   │                       │   ├── HadoopUtil.java
│   │   │                       │   ├── HWIHiveHistory.java
│   │   │                       │   ├── QueryUtil.java
│   │   │                       │   └── HWIHiveHistoryViewer.java
│   │   │                       ├── HWIException.java
│   │   │                       ├── query
│   │   │                       │   ├── QueryCrontab.java
│   │   │                       │   ├── RunningRunner.java
│   │   │                       │   ├── QueryManager.java
│   │   │                       │   ├── QueryStore.java
│   │   │                       │   └── QueryRunner.java
│   │   │                       ├── HWIContextListener.java
│   │   │                       ├── model
│   │   │                       │   ├── Pagination.java
│   │   │                       │   ├── MCrontab.java
│   │   │                       │   └── MQuery.java
│   │   │                       └── HWIServer.java
│   │   └── resources
│   │       ├── hive-site.xml.template
│   │       └── package.jdo
│   └── test
│       └── java
│           └── org
│               └── apache
│                   └── hadoop
│                       └── hive
│                           └── hwi
│                               ├── TestRunHWIServer.java
│                               └── TestEverything.java
├── .gitignore
├── NOTICE
├── header.txt
├── README.md
├── pom.xml
└── LICENSE
/src/main/webapp/assets/img/flash.png: --------------------------------------------------------------------------------
https://raw.githubusercontent.com/anjuke/hwi/HEAD/src/main/webapp/assets/img/flash.png -------------------------------------------------------------------------------- /src/main/webapp/assets/img/up-arrow.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anjuke/hwi/HEAD/src/main/webapp/assets/img/up-arrow.png -------------------------------------------------------------------------------- /src/main/webapp/assets/img/glyphicons-halflings.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anjuke/hwi/HEAD/src/main/webapp/assets/img/glyphicons-halflings.png -------------------------------------------------------------------------------- /src/main/webapp/assets/img/glyphicons-halflings-white.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anjuke/hwi/HEAD/src/main/webapp/assets/img/glyphicons-halflings-white.png -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | *.log 2 | *.out 3 | .classpath 4 | .project 5 | .settings/ 6 | target/ 7 | *.swp 8 | src/main/resources/hive-site.xml 9 | work/ 10 | metastore_db/ 11 | -------------------------------------------------------------------------------- /src/main/webapp/crontab/create.vm: -------------------------------------------------------------------------------- 1 | #set($title='crontab create') 2 | #set($nav='crontab.create') 3 | 4 |

Create a Hive Crontab

5 |
6 | 7 | #parse('/crontab/edit_form.vm') 8 | -------------------------------------------------------------------------------- /src/main/webapp/schema/dbs.vm: -------------------------------------------------------------------------------- 1 | #set($nav='schema') 2 | 3 |

Database

4 |
5 | 6 | 7 | #foreach($db in $dbs) 8 | 9 | 12 | 13 | #end 14 | 15 |
10 | $db 11 |
-------------------------------------------------------------------------------- /src/main/webapp/index.vm: -------------------------------------------------------------------------------- 1 | #set($title='hwi') 2 | 3 |
4 |

Hive Web Interface

5 |

The Hive Web Interface (HWI) offers an alternative to the 6 | command line interface (CLI). Once authenticated, users can start 7 | HWIWebSessions. A HWIWebSession lives on the server; users can 8 | submit queries and return later to view the status of the query and 9 | view any results it produced.

10 |
11 | -------------------------------------------------------------------------------- /src/main/webapp/crontab/update.vm: -------------------------------------------------------------------------------- 1 | #set($title='crontab update') 2 | #set($nav='crontab.list') 3 | 4 |

Update a Hive Crontab

5 |
6 | 7 | #set($name=$crontab.getName()) 8 | #set($query=$crontab.getQuery()) 9 | #set($tokens=$crontab.getCrontab().split(" ")) 10 | #set($hour=$tokens[2]) 11 | #set($day=$tokens[3]) 12 | #set($month=$tokens[4]) 13 | #set($week=$tokens[5]) 14 | 15 | #parse('/crontab/edit_form.vm') 16 | -------------------------------------------------------------------------------- /NOTICE: -------------------------------------------------------------------------------- 1 | This product includes software developed by The Apache Software 2 | Foundation (http://www.apache.org/). 3 | 4 | This product includes/uses jersey (http://jersey.java.net/), 5 | Copyright (c) 2013, Oracle and/or its affiliates. 6 | 7 | This product includes/uses quartz-scheduler(http://quartz-scheduler.org/), 8 | Copyright (c) 2001-2010 Terracotta, Inc. 9 | 10 | This product includes/uses Bootstrap (http://twitter.github.com/bootstrap/), 11 | Copyright (c) 2012 Twitter, Inc. 12 | 13 | -------------------------------------------------------------------------------- /src/main/webapp/query/result.vm: -------------------------------------------------------------------------------- 1 | #set($nav='query.list') 2 | 3 |
4 | $query.getName() 5 | / 6 |

Result

7 |
8 | 9 |
10 | 11 |
12 |
Location
13 |
$query.getResultLocation()
14 |
Download
15 |
16 |
17 | 18 |
19 | 20 | 21 | #foreach($r in $partialResult)$r
#end -------------------------------------------------------------------------------- /header.txt: -------------------------------------------------------------------------------- 1 | Copyright (C) [2013] [Anjuke Inc] 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 14 | -------------------------------------------------------------------------------- /src/main/webapp/user/set.vm: -------------------------------------------------------------------------------- 1 | #set($nav='user.set') 2 | 3 | #set($title='set user') 4 | 5 |

Set User

6 |
7 | 8 |
9 | 10 |
11 | 12 |
13 | 14 |
15 |
16 | 17 |
18 | 19 |
20 |
-------------------------------------------------------------------------------- /src/test/java/org/apache/hadoop/hive/hwi/TestRunHWIServer.java: -------------------------------------------------------------------------------- 1 | package org.apache.hadoop.hive.hwi; 2 | 3 | import org.apache.hadoop.hive.shims.JettyShims; 4 | import org.apache.hadoop.hive.shims.ShimLoader; 5 | 6 | public class TestRunHWIServer { 7 | 8 | /** 9 | * @param args 10 | * @throws Exception 11 | */ 12 | public static void main(String[] args) throws Exception { 13 | 14 | JettyShims.Server webServer; 15 | webServer = ShimLoader.getJettyShims().startServer("0.0.0.0", 9999); 16 | 17 | webServer.addWar("src/main/webapp/", "/hwi"); 18 | 19 | webServer.start(); 20 | webServer.join(); 21 | 22 | } 23 | 24 | } -------------------------------------------------------------------------------- /src/main/webapp/schema/db.vm: -------------------------------------------------------------------------------- 1 | #set($nav='schema') 2 | 3 |
4 | Schema 5 | / 6 |

$db.getName()

7 |
8 |
9 | 10 |
11 |
Name
12 |
$db.getName()
13 |
Description
14 |
#alt($db.getDescription() '--')
15 |
16 | 17 |
18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | #foreach ($table in $tables) 27 | 28 | 29 | 30 | #end 31 | 32 |
Table
$table
33 | -------------------------------------------------------------------------------- /src/main/webapp/WEB-INF/velocity.properties: -------------------------------------------------------------------------------- 1 | # Filepath for error template, 2 | # relative to web application root directory 3 | tools.view.servlet.error.template = Error.vm 4 | 5 | # Directory for layout templates, 6 | # relative to web application root directory 7 | tools.view.servlet.layout.directory = layout/ 8 | 9 | # Filepath of the default layout template 10 | # relative to the layout directory 11 | # NOT relative to the root directory of the webapp! 12 | tools.view.servlet.layout.default.template = Default.vm 13 | 14 | # Filepath of the Velocimacro library 15 | # relative to web application root directory 16 | velocimacro.library = layout/VM_global_library.vm 17 | 18 | # Controls Velocimacro library autoloading 19 | # for test 20 | file.resource.loader.cache = false 21 | velocimacro.library.autoreload = true -------------------------------------------------------------------------------- /src/main/webapp/assets/js/custom.js: -------------------------------------------------------------------------------- 1 | $(function(){ 2 | 3 | /* back to top */ 4 | $(window).scroll(function () { 5 | if ($(this).scrollTop() > 400) { 6 | $('#back-top').fadeIn() 7 | } else { 8 | $('#back-top').fadeOut() 9 | } 10 | }); 11 | 12 | $('#back-top').click(function () { 13 | $('body,html').animate({ 14 | scrollTop: 0 15 | }, 800) 16 | return false 17 | }); 18 | 19 | /* code syntax highlight */ 20 | hljs.initHighlightingOnLoad() 21 | 22 | /* tip */ 23 | $('.tipper').tooltip({title:function(){ 24 | return $(this).siblings('.tip').text() 25 | }}) 26 | 27 | /* toggle */ 28 | $('.toggle').click(function(){ 29 | return $(this).siblings('.togglable').toggle() 30 | }) 31 | 32 | }); 33 | -------------------------------------------------------------------------------- /src/main/webapp/query/create.vm: 
-------------------------------------------------------------------------------- 1 | #set($title='query create') 2 | #set($nav='query.create') 3 | 4 |

Create a Hive Query

5 |
6 | 7 |
8 | 9 |
10 | 11 |
12 | 13 |
14 |
15 | 16 |
17 | 18 |
19 | 20 |
21 |
22 | 23 |
24 | 25 |
26 |
27 | -------------------------------------------------------------------------------- /src/main/webapp/diagnostics/info.vm: -------------------------------------------------------------------------------- 1 |

Diagnostics

2 |
3 | 4 |

System.getProperties()

5 |
6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | #foreach ($key in $p.keySet()) 15 | 16 | 17 | 18 | 19 | #end 20 | 21 |
NameValue
$key$p.getProperty($key)
22 | 23 |
24 | 25 |

System.getenv()

26 |
27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | 35 | #foreach ($key in $env.keySet()) 36 | 37 | 38 | 39 | 40 | #end 41 | 42 |
NameValue
$key$env.get($key)
43 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/servlet/RIndex.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | package org.apache.hadoop.hive.hwi.servlet; 17 | 18 | import javax.ws.rs.GET; 19 | import javax.ws.rs.Path; 20 | import javax.ws.rs.Produces; 21 | 22 | import org.apache.commons.logging.Log; 23 | import org.apache.commons.logging.LogFactory; 24 | 25 | import com.sun.jersey.api.view.Viewable; 26 | 27 | @Path("/") 28 | public class RIndex extends RBase { 29 | protected static final Log l4j = LogFactory.getLog(RIndex.class 30 | .getName()); 31 | 32 | @GET 33 | @Produces("text/html") 34 | public Viewable index() { 35 | return new Viewable("/index.vm"); 36 | } 37 | 38 | } 39 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/servlet/RDiagnostics.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 
6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | package org.apache.hadoop.hive.hwi.servlet; 17 | 18 | import javax.ws.rs.GET; 19 | import javax.ws.rs.Path; 20 | import javax.ws.rs.Produces; 21 | 22 | import org.apache.commons.logging.Log; 23 | import org.apache.commons.logging.LogFactory; 24 | 25 | import com.sun.jersey.api.view.Viewable; 26 | 27 | @Path("/diagnostics") 28 | public class RDiagnostics extends RBase { 29 | protected static final Log l4j = LogFactory.getLog(RDiagnostics.class.getName()); 30 | 31 | @GET 32 | @Produces("text/html") 33 | public Viewable dbs() { 34 | request.setAttribute("p", System.getProperties()); 35 | request.setAttribute("env", System.getenv()); 36 | 37 | return new Viewable("/diagnostics/info.vm"); 38 | } 39 | 40 | } 41 | -------------------------------------------------------------------------------- /src/main/resources/hive-site.xml.template: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | javax.jdo.option.ConnectionURL 8 | jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true&charset=latin1 9 | 10 | 11 | 12 | javax.jdo.option.ConnectionDriverName 13 | com.mysql.jdbc.Driver 14 | 15 | 16 | 17 | javax.jdo.option.ConnectionUserName 18 | hive 19 | 20 | 21 | 22 | javax.jdo.option.ConnectionPassword 23 | hive 24 | 25 | 26 | 27 | fs.default.name 28 | hdfs://127.0.0.1:9000 29 | 30 | 31 | 32 | mapred.job.tracker 33 | 127.0.0.1:9001 34 | 35 | 36 | 37 | hive.hwi.war.file 38 | /lib/hive-hwi-1.0.war 39 | This is the WAR file for Hive Web Interface 40 | 41 | 42 | 43 | 
hive.hwi.result 44 | /user/hive/result 45 | this is storage directory for hive sql results 46 | 47 | 48 | 49 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/util/APIResult.java: -------------------------------------------------------------------------------- 1 | package org.apache.hadoop.hive.hwi.util; 2 | 3 | import com.google.gson.Gson; 4 | import java.util.HashMap; 5 | 6 | public class APIResult { 7 | 8 | private String result; 9 | 10 | private String msg; 11 | 12 | private Integer id; 13 | 14 | public static final String ERR = "error"; 15 | 16 | public static final String OK = "ok"; 17 | 18 | public APIResult (String ret, String message) { 19 | result = ret; 20 | msg = message; 21 | } 22 | 23 | public APIResult (String ret, String message, Integer id) { 24 | result = ret; 25 | msg = message; 26 | this.id = id; 27 | } 28 | 29 | public String getResult() { 30 | return result; 31 | } 32 | 33 | public void setResult(String result) { 34 | this.result = result; 35 | } 36 | 37 | public String getMsg() { 38 | return msg; 39 | } 40 | 41 | public void setMsg(String msg) { 42 | this.msg = msg; 43 | } 44 | 45 | public String toJson() { 46 | HashMap map = new HashMap(); 47 | map.put("result", this.result); 48 | map.put("msg", this.msg); 49 | if (this.result.equals(OK)) { 50 | map.put("id", this.id); 51 | } 52 | Gson gson = new Gson(); 53 | String json = gson.toJson(map); 54 | return json; 55 | } 56 | 57 | public Integer getId() { 58 | return id; 59 | } 60 | 61 | public void setId(Integer id) { 62 | this.id = id; 63 | } 64 | 65 | } 66 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/servlet/RUser.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use 
this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | package org.apache.hadoop.hive.hwi.servlet; 17 | 18 | import javax.ws.rs.FormParam; 19 | import javax.ws.rs.GET; 20 | import javax.ws.rs.POST; 21 | import javax.ws.rs.Path; 22 | import javax.ws.rs.Produces; 23 | 24 | import com.sun.jersey.api.view.Viewable; 25 | 26 | @Path("/users") 27 | public class RUser extends RBase { 28 | 29 | @Path("set") 30 | @Produces("text/html;charset=ISO-8859-1") 31 | @GET 32 | public Viewable set() { 33 | request.setAttribute("user", getUser()); 34 | return new Viewable("/user/set.vm"); 35 | } 36 | 37 | @Path("set") 38 | @Produces("text/html;charset=ISO-8859-1") 39 | @POST 40 | public Viewable set(@FormParam(value = "user") String user) { 41 | setUser(user); 42 | request.setAttribute("msg", "set user successful!"); 43 | request.setAttribute("msg-type", "success"); 44 | return new Viewable("/user/set.vm"); 45 | } 46 | 47 | } 48 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/HWIException.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Licensed to the Apache Software Foundation (ASF) under one 3 | * or more contributor license agreements. See the NOTICE file 4 | * distributed with this work for additional information 5 | * regarding copyright ownership. 
The ASF licenses this file 6 | * to you under the Apache License, Version 2.0 (the 7 | * "License"); you may not use this file except in compliance 8 | * with the License. You may obtain a copy of the License at 9 | * 10 | * http://www.apache.org/licenses/LICENSE-2.0 11 | * 12 | * Unless required by applicable law or agreed to in writing, software 13 | * distributed under the License is distributed on an "AS IS" BASIS, 14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 15 | * See the License for the specific language governing permissions and 16 | * limitations under the License. 17 | */ 18 | 19 | package org.apache.hadoop.hive.hwi; 20 | 21 | /** 22 | * HWIException. 23 | * 24 | */ 25 | public class HWIException extends Exception { 26 | 27 | private static final long serialVersionUID = 1L; 28 | 29 | public HWIException() { 30 | super(); 31 | } 32 | 33 | /** Specify an error String with the Exception. */ 34 | public HWIException(String arg0) { 35 | super(arg0); 36 | } 37 | 38 | /** Wrap an Exception in HWIException. */ 39 | public HWIException(Throwable arg0) { 40 | super(arg0); 41 | } 42 | 43 | /** Specify an error String and wrap an Exception in HWIException. */ 44 | public HWIException(String arg0, Throwable arg1) { 45 | super(arg0, arg1); 46 | } 47 | 48 | } 49 | -------------------------------------------------------------------------------- /src/main/webapp/crontab/list.vm: -------------------------------------------------------------------------------- 1 | #set($nav='crontab.list') 2 | 3 | #set($urlQuery = "") 4 | 5 | #if($crontabName) 6 | #set($urlQuery = $urlQuery + "crontabName=$crontabName") 7 | #end 8 | 9 |
10 |

Crontabs

11 | 12 | 18 |
19 |
20 | 21 |
22 | 23 | #pagination( $pagination $urlQuery ) 24 | 25 | 26 | 27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | 35 | 36 | 37 | #foreach($crontab in $pagination.getItems()) 38 | 39 | 40 | 41 | 46 | 47 | 48 | 49 | 50 | #end 51 | 52 |
#NameQueryCrontabStatusHistory
$crontab.getId()$crontab.getName() 42 | #shortQuery($crontab.getQuery()) 43 |
$shortQuery
44 |
$crontab.getQuery()
45 |
$crontab.getCrontab()$crontab.getStatus()
53 | 54 | #if($pagination.getItems().size() > 10) 55 | #pagination( $pagination $urlQuery ) 56 | #end 57 | 58 |
-------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/servlet/VelocityViewProcessor.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | package org.apache.hadoop.hive.hwi.servlet; 17 | 18 | import com.sun.jersey.api.core.ResourceConfig; 19 | import com.sun.jersey.api.view.Viewable; 20 | import com.sun.jersey.spi.template.ViewProcessor; 21 | 22 | import javax.servlet.ServletConfig; 23 | import javax.servlet.http.HttpServletRequest; 24 | import javax.ws.rs.core.Context; 25 | import javax.ws.rs.ext.Provider; 26 | 27 | 28 | import java.io.IOException; 29 | import java.io.OutputStream; 30 | import java.io.OutputStreamWriter; 31 | 32 | /** 33 | * @author qiangwang@anjuke.com 34 | */ 35 | @Provider 36 | public class VelocityViewProcessor implements ViewProcessor { 37 | 38 | @Context 39 | private ThreadLocal request; 40 | 41 | private final Velocity v; 42 | 43 | public VelocityViewProcessor(@Context ResourceConfig resourceConfig, @Context ServletConfig sc) { 44 | v = new Velocity(sc); 45 | } 46 | 47 | @Override 48 | public String resolve(String path) { 49 | if(v.templateExists(path)){ 50 | return path; 51 | }else{ 52 | return null; 53 | } 54 | } 55 | 56 | @Override 57 | public void writeTo(String resolvedPath, Viewable 
viewable, OutputStream out) 58 | throws IOException { 59 | 60 | // Commit the status and headers to the HttpServletResponse 61 | out.flush(); 62 | 63 | v.render(resolvedPath, request.get(), new OutputStreamWriter(out)); 64 | } 65 | } 66 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/query/QueryCrontab.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
15 | */ 16 | package org.apache.hadoop.hive.hwi.query; 17 | 18 | import java.text.SimpleDateFormat; 19 | import java.util.Calendar; 20 | import java.util.Date; 21 | import java.util.TimeZone; 22 | 23 | import org.apache.commons.logging.Log; 24 | import org.apache.commons.logging.LogFactory; 25 | import org.apache.hadoop.hive.hwi.model.MCrontab; 26 | import org.apache.hadoop.hive.hwi.model.MQuery; 27 | import org.quartz.Job; 28 | import org.quartz.JobDataMap; 29 | import org.quartz.JobExecutionContext; 30 | import org.quartz.JobExecutionException; 31 | 32 | public class QueryCrontab implements Job { 33 | protected static final Log l4j = LogFactory.getLog(QueryManager.class 34 | .getName()); 35 | 36 | @Override 37 | public void execute(JobExecutionContext context) 38 | throws JobExecutionException { 39 | JobDataMap map = context.getJobDetail().getJobDataMap(); 40 | 41 | int crontabId = map.getIntValue("crontabId"); 42 | 43 | MCrontab crontab = QueryStore.getInstance().getCrontabById(crontabId); 44 | 45 | Date created = Calendar.getInstance(TimeZone.getDefault()).getTime(); 46 | SimpleDateFormat sf = new SimpleDateFormat("yyyy-MM-dd_HH:mm:ss"); 47 | String name = "[" + crontab.getName() + "] " + sf.format(created); 48 | 49 | MQuery query = new MQuery(name, crontab.getQuery(), 50 | crontab.getCallback(), crontab.getUserId(), 51 | crontab.getGroupId()); 52 | 53 | query.setCrontabId(crontabId); 54 | QueryStore.getInstance().insertQuery(query); 55 | 56 | QueryManager.getInstance().submit(query); 57 | } 58 | } 59 | -------------------------------------------------------------------------------- /src/main/webapp/WEB-INF/web.xml: -------------------------------------------------------------------------------- 1 | 2 | 18 | 19 | 20 | 21 | 22 | Used to manage Hive Sessions 23 | org.apache.hadoop.hive.hwi.HWIContextListener 24 | 25 | 26 | 27 | 28 | 30 29 | 30 | 31 | 32 | index.jsp 33 | 34 | 35 | 36 | ServletAdaptor 37 | com.sun.jersey.spi.container.servlet.ServletContainer 38 | 
39 | com.sun.jersey.config.property.packages 40 | org.apache.hadoop.hive.hwi 41 | 42 | 1 43 | 44 | 45 | 46 | default 47 | /assets/* 48 | 49 | 50 | 51 | ServletAdaptor 52 | /* 53 | 54 | 55 | 56 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/servlet/RBase.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
15 | */ 16 | package org.apache.hadoop.hive.hwi.servlet; 17 | 18 | import java.util.HashMap; 19 | 20 | import javax.servlet.http.Cookie; 21 | import javax.servlet.http.HttpServletRequest; 22 | import javax.servlet.http.HttpServletResponse; 23 | import javax.servlet.http.HttpSession; 24 | import javax.ws.rs.core.Context; 25 | 26 | import org.apache.commons.collections.map.HashedMap; 27 | 28 | public abstract class RBase { 29 | 30 | public static final String USER_COOKIE_NAME = "user"; 31 | 32 | @Context 33 | protected HttpServletRequest request; 34 | 35 | @Context 36 | protected HttpServletResponse response; 37 | 38 | /** 39 | * set current user 40 | * @param user 41 | */ 42 | protected void setUser(String user) { 43 | Cookie cookie = new Cookie(USER_COOKIE_NAME, user); 44 | cookie.setMaxAge(365 * 24 * 60 * 60); 45 | cookie.setPath("/hwi"); 46 | response.addCookie(cookie); 47 | } 48 | 49 | /** 50 | * return current user 51 | * 52 | * @return 53 | */ 54 | protected String getUser() { 55 | try { 56 | String user = getCookies().get(USER_COOKIE_NAME); 57 | return user; 58 | } catch (Exception e) { 59 | return null; 60 | } 61 | } 62 | 63 | /** 64 | * return user cookies 65 | * 66 | * @return user cookies hashmap 67 | */ 68 | protected HashMap getCookies() { 69 | Cookie _cookies[] = request.getCookies(); 70 | 71 | HashMap cookies = new HashMap(); 72 | for (int i=0; i<_cookies.length; i++) { 73 | cookies.put(_cookies[i].getName(), _cookies[i].getValue()); 74 | } 75 | 76 | return cookies; 77 | } 78 | 79 | } 80 | -------------------------------------------------------------------------------- /src/main/webapp/assets/css/github.css: -------------------------------------------------------------------------------- 1 | /* 2 | 3 | github.com style (c) Vasily Polovnyov 4 | 5 | */ 6 | 7 | pre code { 8 | display: block; padding: 0.5em; 9 | color: #000; 10 | /*background: #f8f8ff*/ 11 | } 12 | 13 | pre .comment, 14 | pre .template_comment, 15 | pre .diff .header, 16 | pre .javadoc { 
17 | color: #998; 18 | font-style: italic 19 | } 20 | 21 | pre .keyword, 22 | pre .css .rule .keyword, 23 | pre .winutils, 24 | pre .javascript .title, 25 | pre .nginx .title, 26 | pre .subst, 27 | pre .request, 28 | pre .status { 29 | color: #000; 30 | font-weight: bold 31 | } 32 | 33 | pre .number, 34 | pre .hexcolor { 35 | color: #40a070 36 | } 37 | 38 | pre .string, 39 | pre .tag .value, 40 | pre .phpdoc, 41 | pre .tex .formula { 42 | color: #d14 43 | } 44 | 45 | pre .title, 46 | pre .id { 47 | color: #900; 48 | font-weight: bold 49 | } 50 | 51 | pre .javascript .title, 52 | pre .lisp .title, 53 | pre .clojure .title, 54 | pre .subst { 55 | font-weight: normal 56 | } 57 | 58 | pre .class .title, 59 | pre .haskell .type, 60 | pre .vhdl .literal, 61 | pre .tex .command { 62 | color: #458; 63 | font-weight: bold 64 | } 65 | 66 | pre .tag, 67 | pre .tag .title, 68 | pre .rules .property, 69 | pre .django .tag .keyword { 70 | color: #000080; 71 | font-weight: normal 72 | } 73 | 74 | pre .attribute, 75 | pre .variable, 76 | pre .instancevar, 77 | pre .lisp .body { 78 | color: #008080 79 | } 80 | 81 | pre .regexp { 82 | color: #009926 83 | } 84 | 85 | pre .class { 86 | color: #458; 87 | font-weight: bold 88 | } 89 | 90 | pre .symbol, 91 | pre .ruby .symbol .string, 92 | pre .ruby .symbol .keyword, 93 | pre .ruby .symbol .keymethods, 94 | pre .lisp .keyword, 95 | pre .tex .special, 96 | pre .input_number { 97 | color: #990073 98 | } 99 | 100 | pre .built_in, 101 | pre .lisp .title, 102 | pre .clojure .built_in { 103 | color: #0086b3 104 | } 105 | 106 | pre .preprocessor, 107 | pre .pi, 108 | pre .doctype, 109 | pre .shebang, 110 | pre .cdata { 111 | color: #999; 112 | font-weight: bold 113 | } 114 | 115 | pre .deletion { 116 | background: #fdd 117 | } 118 | 119 | pre .addition { 120 | background: #dfd 121 | } 122 | 123 | pre .diff .change { 124 | background: #0086b3 125 | } 126 | 127 | pre .chunk { 128 | color: #aaa 129 | } 130 | 131 | pre .tex .formula { 132 | 
opacity: 0.5; 133 | } 134 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/HWIContextListener.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Licensed to the Apache Software Foundation (ASF) under one 3 | * or more contributor license agreements. See the NOTICE file 4 | * distributed with this work for additional information 5 | * regarding copyright ownership. The ASF licenses this file 6 | * to you under the Apache License, Version 2.0 (the 7 | * "License"); you may not use this file except in compliance 8 | * with the License. You may obtain a copy of the License at 9 | * 10 | * http://www.apache.org/licenses/LICENSE-2.0 11 | * 12 | * Unless required by applicable law or agreed to in writing, software 13 | * distributed under the License is distributed on an "AS IS" BASIS, 14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 15 | * See the License for the specific language governing permissions and 16 | * limitations under the License. 17 | */ 18 | 19 | package org.apache.hadoop.hive.hwi; 20 | 21 | import javax.servlet.ServletContextEvent; 22 | 23 | import org.apache.commons.logging.Log; 24 | import org.apache.commons.logging.LogFactory; 25 | import org.apache.hadoop.hive.hwi.query.QueryManager; 26 | 27 | /** 28 | * After getting a contextInitialized event this component starts an instance of 29 | * the HiveSessionManager. 30 | * 31 | */ 32 | public class HWIContextListener implements javax.servlet.ServletContextListener { 33 | 34 | protected static final Log l4j = LogFactory.getLog(HWIContextListener.class 35 | .getName()); 36 | 37 | /** 38 | * The Hive Web Interface manages multiple hive sessions. This event is used 39 | * to start a Runnable, QueryManager as a thread inside the servlet 40 | * container. 
41 | * 42 | * @param sce 43 | * An event fired by the servlet context on startup 44 | */ 45 | public void contextInitialized(ServletContextEvent sce) { 46 | 47 | QueryManager.getInstance().start(); 48 | 49 | } 50 | 51 | /** 52 | * When the Hive Web Interface is closing, we locate the Runnable 53 | * QueryManager and shut it down, which should allow the application 54 | * to exit gracefully. 55 | * 56 | * @param sce 57 | * An event fired by the servlet context on context shutdown 58 | */ 59 | public void contextDestroyed(ServletContextEvent sce) { 60 | 61 | QueryManager.getInstance().shutdown(); 62 | 63 | } 64 | } 65 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/model/Pagination.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License.
15 | */ 16 | package org.apache.hadoop.hive.hwi.model; 17 | 18 | import java.util.List; 19 | import java.util.Map; 20 | 21 | import javax.jdo.Query; 22 | 23 | public class Pagination { 24 | 25 | private final Query query; 26 | private final Map map; 27 | private final int page; 28 | private final int pageSize; 29 | private Long total; 30 | private List items; 31 | 32 | public Pagination(Query query, Map map, int page, 33 | int pageSize) { 34 | this.query = query; 35 | this.map = map; 36 | this.page = page; 37 | this.pageSize = pageSize; 38 | } 39 | 40 | public Query getQuery() { 41 | return query; 42 | } 43 | 44 | public Map getMap() { 45 | return map; 46 | } 47 | 48 | public int getPage() { 49 | return page; 50 | } 51 | 52 | public int getPageSize() { 53 | return pageSize; 54 | } 55 | 56 | public Long getTotal() { 57 | if (total == null) { 58 | Query newQuery = query.getPersistenceManager().newQuery(query); 59 | newQuery.setOrdering(null); 60 | newQuery.setResult("COUNT(id)"); 61 | total = (Long) newQuery.executeWithMap(map); 62 | } 63 | return total; 64 | } 65 | 66 | @SuppressWarnings("unchecked") 67 | public List getItems() { 68 | if (items == null) { 69 | Query newQuery = query.getPersistenceManager().newQuery(query); 70 | int offset = (page - 1) * pageSize; 71 | newQuery.setRange(offset, offset + pageSize); 72 | items = (List) newQuery.executeWithMap(map); 73 | } 74 | return items; 75 | } 76 | 77 | public int getPages() { 78 | return (int) Math.ceil((double) getTotal() / pageSize); 79 | } 80 | } 81 | -------------------------------------------------------------------------------- /src/main/webapp/crontab/info.vm: -------------------------------------------------------------------------------- 1 | #set($nav='crontab.list') 2 | 3 |
4 | Crontabs 5 | / 6 |

7 | $crontab.getName() 8 | 9 | 10 |

11 |
12 |
13 | 14 |
15 |
$crontab.getStatus()
16 |
17 | 18 | 19 | #if ( $crontab.getStatus() == 'RUNNING' || $crontab.getStatus() == 'DELETED') 20 | 21 | #end 22 | 23 | #if ( $crontab.getStatus() == 'PAUSED') 24 | 25 | #end 26 | 27 | #if ( $crontab.getStatus() == 'RUNNING' || $crontab.getStatus() == 'PAUSED') 28 | 29 | 30 | #end 31 | 32 |
33 |
34 | 35 | 47 | 48 | 49 |
50 |

Basic

51 |
52 |
53 |
Crontab
54 |
$crontab.getCrontab()
55 |
Query
56 |
57 | #shortQuery($crontab.getQuery()) 58 |
$shortQuery
59 |
$crontab.getQuery()
60 |
61 |
Callback
62 |
#alt($crontab.getCallback() '--')
63 |
Created
64 |
#alt($createdTime "--")
65 |
Updated
66 |
#alt($updatedTime "--")
67 |
68 | -------------------------------------------------------------------------------- /src/main/webapp/layout/VM_global_library.vm: -------------------------------------------------------------------------------- 1 | #macro(alt $a $b)#if("$!a"!="")$a#{else}$b#end#end 2 | 3 | #macro(query_status $status) 4 | #set($map = { 5 | "INITED": "", 6 | "FINISHED": "label-success", 7 | "FAILED": "label-important", 8 | "SYNTAXERROR": "label-warning", 9 | "RUNNING": "label-info"})$map["$status"]#end 10 | 11 | #macro(crontab_status $status) 12 | #set($map = { 13 | "RUNNING": "label-info", 14 | "PAUSED": "label-warning", 15 | "DELETED": "label-important"})$map["$status"]#end 16 | 17 | #macro (pagination $pagination $urlQuery) 18 | 72 | #end 73 | 74 | #macro (shortQuery $query) 75 | #if($query.length() > 30) 76 | #set($shortQuery=$query.substring(0,30)) 77 | #else 78 | #set($shortQuery=$query) 79 | #end 80 | 81 | #set($shortQuery=$shortQuery.replaceAll("\r|\n", " ")) 82 | #end 83 | -------------------------------------------------------------------------------- /src/main/webapp/query/info.vm: -------------------------------------------------------------------------------- 1 | #set($nav='query.list') 2 | 3 |
4 | Queries 5 | / 6 |

7 | $query.getName() 8 | #if ( $query.getCrontabId() ) 9 | 10 | 11 | 12 | #else 13 | 14 | 15 | 16 | #end 17 | 18 | 19 | 20 |

21 |
22 |
23 | 24 |
25 |
$query.getStatus()
26 |
27 | #if($jobInfos) 28 | 33 | #end 34 |
35 |
36 | 37 |
38 |

Basic

39 |
40 |
41 |
Query
42 |
43 | #shortQuery($query.getQuery()) 44 |
$shortQuery
45 |
$query.getQuery()
46 |
47 |
Callback
48 |
#alt($query.getCallback() "--")
49 |
Result location
50 |
51 | #if ($query.getStatus() == "FINISHED") 52 | View Result 53 | #else 54 | -- 55 | #end 56 |
57 |
Error message
58 |
59 | #if($query.getErrorMsg()) 60 | $query.getErrorMsg() 61 | #else 62 | -- 63 | #end 64 |
65 |
Error code
66 |
#alt($query.getErrorCode() "--")
67 |
Created
68 |
#alt($createdTime "--")
69 |
Updated
70 |
#alt($updatedTime "--")
71 |
72 |
73 |

Stats

74 |
75 |
76 |
Cpu Time
77 |
#alt($cpuTime "--")
78 |
Total Time
79 |
#alt($totalTime "--")
80 | #if($savedTime) 81 |
Saved Time
82 |
83 | $savedTime 84 |
85 | #end 86 |
87 | -------------------------------------------------------------------------------- /src/main/webapp/query/list.vm: -------------------------------------------------------------------------------- 1 | 2 | #if($extra == 'my') 3 | #set($nav='query.my') 4 | #else 5 | #set($nav='query.list') 6 | #end 7 | 8 | #set($urlQuery = "") 9 | 10 | #if($crontabId) 11 | #set($urlQuery = $urlQuery + "crontabId=$crontabId") 12 | #end 13 | 14 | #if($queryName) 15 | #if($urlQuery != '') 16 | #set($urlQuery = $urlQuery + "&") 17 | #end 18 | #set($urlQuery = $urlQuery + "queryName=$queryName") 19 | #end 20 | 21 | #if($extra) 22 | #if($urlQuery != '') 23 | #set($urlQuery = $urlQuery + "&") 24 | #end 25 | #set($urlQuery = $urlQuery + "extra=$extra") 26 | #end 27 | 28 |
29 |

#if($extra == "my") My #end Queries

30 | 31 | 37 |
38 |
39 | 40 |
41 | 42 | #pagination( $pagination $urlQuery ) 43 | 44 | 45 | 46 | 47 | 48 | 49 | 50 | 51 | 52 | 53 | 54 | 55 | #foreach($query in $pagination.getItems()) 56 | 57 | 58 | 59 | 64 | 65 | 74 | 75 | #end 76 | 77 |
#NameQueryStatusResult
$query.getId()$query.getName() 60 | #shortQuery($query.getQuery()) 61 |
$shortQuery
62 |
$query.getQuery()
63 |
$query.getStatus() 66 | #if ($query.getStatus() == "FINISHED") 67 | 68 | #end 69 | 70 | #if ($query.getErrorMsg()) 71 | 72 | #end 73 |
78 | 79 | #if($pagination.getItems().size() > 10) 80 | #pagination( $pagination $urlQuery ) 81 | #end 82 | 83 |
-------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/query/RunningRunner.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | package org.apache.hadoop.hive.hwi.query; 17 | 18 | import java.util.concurrent.LinkedBlockingQueue; 19 | 20 | import org.apache.commons.logging.Log; 21 | import org.apache.commons.logging.LogFactory; 22 | 23 | public class RunningRunner implements Runnable { 24 | protected static final Log l4j = LogFactory.getLog(RunningRunner.class 25 | .getName()); 26 | 27 | public static enum Progress { 28 | CONTINUE, EXIT 29 | }; 30 | 31 | public static interface Running { 32 | public Progress running(); 33 | } 34 | 35 | private LinkedBlockingQueue runnings; 36 | 37 | private Thread t; 38 | 39 | public RunningRunner() { 40 | runnings = new LinkedBlockingQueue(); 41 | } 42 | 43 | public void start() { 44 | t = new Thread(this); 45 | t.start(); 46 | } 47 | 48 | public void run() { 49 | l4j.info("RunningRunner started."); 50 | 51 | while (true) { 52 | try { 53 | // block if no more running 54 | Running running = runnings.take(); 55 | 56 | Progress progress = running.running(); 57 | switch (progress) { 58 | case CONTINUE: 59 | runnings.put(running); 60 | break; 61 | case EXIT: 62 | break; 63 | } 64 | 65 | 
l4j.debug("go to sleep..."); 66 | Thread.sleep(100); 67 | } catch (InterruptedException e) { 68 | return; 69 | } 70 | } 71 | 72 | } 73 | 74 | public boolean add(Running running) { 75 | return runnings.offer(running); 76 | } 77 | 78 | public void shutdown() { 79 | l4j.info("RunningRunner shutting down."); 80 | try { 81 | t.interrupt(); 82 | t.join(100); 83 | } catch (InterruptedException e) { 84 | e.printStackTrace(); 85 | } 86 | l4j.info("RunningRunner shutdown complete."); 87 | } 88 | } 89 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # HWI 2 | 3 | A New Hive Web Interface 4 | 5 | ## Features 6 | 7 | **Query** 8 | 9 | * Seamless integration with existing Hive deployments 10 | * Save user queries in a database 11 | * Save Hive query results in HDFS 12 | * Show the progress of queries 13 | * Summary info, such as CPU time, total time, and saved time 14 | * Callback when a Hive query completes 15 | * Download the full result 16 | 17 | **Crontab** 18 | 19 | * Create crontabs to run queries automatically 20 | 21 | ## Implementation 22 | 23 | The following frameworks are used: 24 | 25 | * [Twitter Bootstrap](http://twitter.github.com/bootstrap/) for HTML, CSS, and JS 26 | * [Velocity](http://velocity.apache.org/) for view templating 27 | * [Jersey](http://jersey.java.net/) for URL routing 28 | * [Quartz](http://quartz-scheduler.org/) for query scheduling 29 | 30 | ## How to use 31 | 32 | **Download** 33 | 34 | $ git clone git@github.com:anjuke/hwi.git 35 | 36 | **Package** 37 | 38 | $ cd hwi 39 | $ mvn clean package war:war 40 | 41 | **Deploy** 42 | 43 | $ mv ${HIVE_HOME}/lib/hive-hwi-x.jar ${HIVE_HOME}/lib/hive-hwi-x.jar.bak 44 | $ mv ${HIVE_HOME}/lib/hive-hwi-x.war ${HIVE_HOME}/lib/hive-hwi-x.war.bak 45 | 46 | $ cp target/hive-hwi-1.0.jar ${HIVE_HOME}/lib/hive-hwi-1.0.jar 47 | $ cp target/hive-hwi-1.0.war ${HIVE_HOME}/lib/hive-hwi-1.0.war 48 | 49 | **Pay
attention!** You may prefer not to simply override conf/hive-site.xml: 50 | 51 | $ cp src/main/resources/hive-site.xml.template ${HIVE_HOME}/conf/hive-site.xml 52 | 53 | **Run** 54 | 55 | $ hive --service hwi 56 | 57 | ## Develop 58 | 59 | **Create Database** 60 | 61 | $ create database hive 62 | $ create user 'hive'@'localhost' identified by 'hive' 63 | $ grant all on hive.* to 'hive'@'localhost' 64 | $ flush privileges 65 | 66 | **Config file** 67 | 68 | ``` 69 | cp src/main/resources/hive-site.xml.template src/main/resources/hive-site.xml 70 | vim src/main/resources/hive-site.xml 71 | 72 | cp src/main/resources/hive-site.xml ${HIVE_HOME}/conf/ 73 | cp ~/.m2/repository/mysql/mysql-connector-java/5.1.22/mysql-connector-java-5.1.22.jar ${HIVE_HOME}/lib/ 74 | ``` 75 | 76 | 77 | **Install DataNucleus** 78 | 79 | * install from the Eclipse update site `http://www.datanucleus.org/downloads/eclipse-update/` 80 | 81 | **Create Eclipse Configuration** 82 | 83 | $ mvn eclipse:eclipse 84 | 85 | **Import HWI Maven Project** 86 | 87 | 1. In Eclipse, import the Maven project 88 | 2. Use a Java environment 89 | 3. Right-click the hwi project: DataNucleus => Add DataNucleus Support 90 | 4. Right-click the hwi project: DataNucleus => Enable Auto-Enhancement 91 | 92 | **Run project** 93 | 94 | 1. hadoop fs -mkdir /user/hive/result 95 | 2. Run TestRunHWIServer.java as a Java application 96 | 3. Open http://localhost:9999/hwi/ in your browser 97 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/servlet/RSchema.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License.
6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | package org.apache.hadoop.hive.hwi.servlet; 17 | 18 | import java.util.List; 19 | 20 | import javax.ws.rs.GET; 21 | import javax.ws.rs.Path; 22 | import javax.ws.rs.PathParam; 23 | import javax.ws.rs.Produces; 24 | import javax.ws.rs.WebApplicationException; 25 | 26 | import org.apache.commons.logging.Log; 27 | import org.apache.commons.logging.LogFactory; 28 | import org.apache.hadoop.hive.conf.HiveConf; 29 | import org.apache.hadoop.hive.metastore.HiveMetaStoreClient; 30 | import org.apache.hadoop.hive.metastore.api.Database; 31 | import org.apache.hadoop.hive.metastore.api.MetaException; 32 | import org.apache.hadoop.hive.metastore.api.Table; 33 | import org.apache.hadoop.hive.ql.session.SessionState; 34 | 35 | import com.sun.jersey.api.view.Viewable; 36 | 37 | @Path("/schema") 38 | public class RSchema extends RBase { 39 | protected static final Log l4j = LogFactory.getLog(RSchema.class.getName()); 40 | 41 | @GET 42 | @Produces("text/html") 43 | public Viewable dbs() { 44 | HiveConf hiveConf = new HiveConf(SessionState.class); 45 | try { 46 | HiveMetaStoreClient client = new HiveMetaStoreClient(hiveConf); 47 | List dbs = client.getAllDatabases(); 48 | client.close(); 49 | request.setAttribute("dbs", dbs); 50 | } catch (MetaException e) { 51 | throw new WebApplicationException(e); 52 | } 53 | 54 | return new Viewable("/schema/dbs.vm"); 55 | } 56 | 57 | @GET 58 | @Path("{name}") 59 | @Produces("text/html") 60 | public Viewable db(@PathParam(value = "name") String name) { 61 | HiveConf hiveConf = new 
HiveConf(SessionState.class); 62 | try { 63 | HiveMetaStoreClient client = new HiveMetaStoreClient(hiveConf); 64 | Database db = client.getDatabase(name); 65 | List tables = client.getAllTables(name); 66 | client.close(); 67 | request.setAttribute("db", db); 68 | request.setAttribute("tables", tables); 69 | } catch (Exception e) { 70 | throw new WebApplicationException(e); 71 | } 72 | 73 | return new Viewable("/schema/db.vm"); 74 | } 75 | 76 | @GET 77 | @Path("{dbName}/{tableName}") 78 | @Produces("text/html") 79 | public Viewable table(@PathParam(value = "dbName") String dbName, 80 | @PathParam(value = "tableName") String tableName) { 81 | HiveConf hiveConf = new HiveConf(SessionState.class); 82 | try { 83 | HiveMetaStoreClient client = new HiveMetaStoreClient(hiveConf); 84 | Table t = client.getTable(dbName, tableName); 85 | client.close(); 86 | request.setAttribute("t", t); 87 | } catch (Exception e) { 88 | throw new WebApplicationException(e); 89 | } 90 | 91 | return new Viewable("/schema/table.vm"); 92 | } 93 | 94 | } 95 | -------------------------------------------------------------------------------- /src/main/webapp/layout/Default.vm: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | #alt($title "hwi") 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 |
17 |
18 | 19 | 61 | 62 |
63 | 64 | #if($msg) 65 |
66 | 67 | $msg 68 |
69 | #end 70 | 71 | $screen_content 72 |
73 | 74 |
75 |
76 | 77 | 78 | 79 | 80 | 81 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/util/HadoopUtil.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | package org.apache.hadoop.hive.hwi.util; 17 | 18 | import java.io.BufferedReader; 19 | import java.io.IOException; 20 | import java.io.InputStreamReader; 21 | 22 | import org.apache.commons.logging.Log; 23 | import org.apache.commons.logging.LogFactory; 24 | import org.apache.hadoop.conf.Configuration; 25 | import org.apache.hadoop.hive.conf.HiveConf; 26 | import org.apache.hadoop.hive.ql.session.SessionState; 27 | 28 | public class HadoopUtil { 29 | 30 | protected static final Log l4j = LogFactory.getLog(HadoopUtil.class 31 | .getName()); 32 | 33 | public static String getJobTrackerURL(String jobid) { 34 | HiveConf conf = new HiveConf(SessionState.class); 35 | String jt = conf.get("mapred.job.tracker"); 36 | String jth = conf.get("mapred.job.tracker.http.address"); 37 | String[] jtparts = null; 38 | String[] jthttpParts = null; 39 | if (jt.equalsIgnoreCase("local")) { 40 | jtparts = new String[2]; 41 | jtparts[0] = "local"; 42 | jtparts[1] = ""; 43 | } else { 44 | jtparts = jt.split(":"); 45 | } 46 | if (jth.contains(":")) { 47 | jthttpParts = jth.split(":"); 48 | } 
else { 49 | jthttpParts = new String[2]; 50 | jthttpParts[0] = jth; 51 | jthttpParts[1] = ""; 52 | } 53 | return "http://" + jtparts[0] + ":" + jthttpParts[1] 54 | + "/jobdetails.jsp?jobid=" + jobid + "&refresh=30"; 55 | } 56 | 57 | /* 58 | * incorrect, datanode can't be random 59 | */ 60 | public static String getDataNodeURL(String path) { 61 | Configuration conf = new Configuration(); 62 | conf.addResource("hdfs-default.xml"); 63 | conf.addResource("hdfs-site.xml"); 64 | 65 | String nnHttp = conf.get("dfs.http.address"); 66 | String dnHttp = conf.get("dfs.datanode.http.address"); 67 | 68 | String host = ""; 69 | try { 70 | ClassLoader classLoader = Thread.currentThread() 71 | .getContextClassLoader(); 72 | BufferedReader in = new BufferedReader(new InputStreamReader( 73 | classLoader.getResource("slaves").openStream())); 74 | host = in.readLine(); 75 | } catch (IOException e) { 76 | e.printStackTrace(); 77 | l4j.error(e.getMessage()); 78 | } 79 | 80 | String nnPort = ""; 81 | if (nnHttp.contains(":")) { 82 | nnPort = nnHttp.split(":")[1]; 83 | } 84 | 85 | String dnPort = ""; 86 | if (dnHttp.contains(":")) { 87 | dnPort = dnHttp.split(":")[1]; 88 | } 89 | 90 | return "http://" + host + ":" + dnPort 91 | + "/browseDirectory.jsp?namenodeInfoPort=" + nnPort + "&dir=" 92 | + path; 93 | } 94 | 95 | } 96 | -------------------------------------------------------------------------------- /src/main/webapp/assets/css/custom.less: -------------------------------------------------------------------------------- 1 | .dl-horizontal{ 2 | dt, dd { 3 | padding: 5px 0px; 4 | } 5 | } 6 | 7 | hr.bold { 8 | background: transparent 
url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAYAAAAECAYAAACtBE5DAAAAGXRFWHRTb2Z0d2FyZQBBZG9iZSBJbWFnZVJlYWR5ccllPAAAAyJpVFh0WE1MOmNvbS5hZG9iZS54bXAAAAAAADw/eHBhY2tldCBiZWdpbj0i77u/IiBpZD0iVzVNME1wQ2VoaUh6cmVTek5UY3prYzlkIj8+IDx4OnhtcG1ldGEgeG1sbnM6eD0iYWRvYmU6bnM6bWV0YS8iIHg6eG1wdGs9IkFkb2JlIFhNUCBDb3JlIDUuMC1jMDYwIDYxLjEzNDc3NywgMjAxMC8wMi8xMi0xNzozMjowMCAgICAgICAgIj4gPHJkZjpSREYgeG1sbnM6cmRmPSJodHRwOi8vd3d3LnczLm9yZy8xOTk5LzAyLzIyLXJkZi1zeW50YXgtbnMjIj4gPHJkZjpEZXNjcmlwdGlvbiByZGY6YWJvdXQ9IiIgeG1sbnM6eG1wPSJodHRwOi8vbnMuYWRvYmUuY29tL3hhcC8xLjAvIiB4bWxuczp4bXBNTT0iaHR0cDovL25zLmFkb2JlLmNvbS94YXAvMS4wL21tLyIgeG1sbnM6c3RSZWY9Imh0dHA6Ly9ucy5hZG9iZS5jb20veGFwLzEuMC9zVHlwZS9SZXNvdXJjZVJlZiMiIHhtcDpDcmVhdG9yVG9vbD0iQWRvYmUgUGhvdG9zaG9wIENTNSBNYWNpbnRvc2giIHhtcE1NOkluc3RhbmNlSUQ9InhtcC5paWQ6OENDRjNBN0E2NTZBMTFFMEI3QjRBODM4NzJDMjlGNDgiIHhtcE1NOkRvY3VtZW50SUQ9InhtcC5kaWQ6OENDRjNBN0I2NTZBMTFFMEI3QjRBODM4NzJDMjlGNDgiPiA8eG1wTU06RGVyaXZlZEZyb20gc3RSZWY6aW5zdGFuY2VJRD0ieG1wLmlpZDo4Q0NGM0E3ODY1NkExMUUwQjdCNEE4Mzg3MkMyOUY0OCIgc3RSZWY6ZG9jdW1lbnRJRD0ieG1wLmRpZDo4Q0NGM0E3OTY1NkExMUUwQjdCNEE4Mzg3MkMyOUY0OCIvPiA8L3JkZjpEZXNjcmlwdGlvbj4gPC9yZGY6UkRGPiA8L3g6eG1wbWV0YT4gPD94cGFja2V0IGVuZD0iciI/PqqezsUAAAAfSURBVHjaYmRABcYwBiM2QSA4y4hNEKYDQxAEAAIMAHNGAzhkPOlYAAAAAElFTkSuQmCC); 9 | border: 0 none; 10 | height: 4px; 11 | margin: 15px 0 25px 0px; 12 | padding: 0; 13 | } 14 | 15 | .right-btn { 16 | margin-left: 5px; 17 | } 18 | 19 | .toggle { 20 | cursor: pointer; 21 | } 22 | 23 | .no-border { 24 | border: none; 25 | } 26 | 27 | .top-border { 28 | border: none; 29 | border-top: 1px solid #DDD; 30 | } 31 | 32 | pre.thin { 33 | margin: 2px 0px; 34 | padding: 1px; 35 | } 36 | 37 | form.h2 { 38 | margin: 20px 0px 0px 0px; 39 | } 40 | 41 | html{ 42 | width: 100%; 43 | height: 100%; 44 | background: #FAFAFA; 45 | } 46 | 47 | body { 48 | background: transparent; 49 | } 50 | 51 | #sidebar { 52 | position: relative; 53 | margin-top: 50px; 54 | text-align: center; 55 | 56 | img { 57 | width: 
70px; 58 | height: 70px; 59 | z-index: -1; 60 | } 61 | } 62 | 63 | #body { 64 | margin-top: 50px; 65 | padding: 0px 30px; 66 | } 67 | 68 | .table-pages { 69 | position: relative; 70 | .pages { 71 | position: absolute; 72 | top: 0px; 73 | left: -80px; 74 | 75 | ul { 76 | border-bottom: 1px solid #DDD; 77 | list-style: none; 78 | margin: 0px; 79 | li { 80 | padding: 0px; 81 | border: 1px solid #DDD; 82 | border-bottom: none; 83 | text-align: center; 84 | a { 85 | padding: 5px 20px; 86 | display: block; 87 | } 88 | &.active { 89 | a { 90 | color: #DDD; 91 | background: #FCF8E3; 92 | } 93 | } 94 | } 95 | } 96 | } 97 | } 98 | 99 | #back-top { 100 | position: fixed; 101 | left: 20px; 102 | bottom: 10px; 103 | 104 | display: block; 105 | width: 108px; 106 | height: 108px; 107 | 108 | background: #DDD url(../img/up-arrow.png) no-repeat center; 109 | 110 | &:hover { 111 | background-color: #777; 112 | } 113 | } 114 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/util/HWIHiveHistory.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
15 | */ 16 | package org.apache.hadoop.hive.hwi.util; 17 | 18 | import java.io.BufferedReader; 19 | import java.io.FileInputStream; 20 | import java.io.IOException; 21 | import java.io.InputStreamReader; 22 | import java.util.HashMap; 23 | import java.util.Map; 24 | import java.util.regex.Matcher; 25 | import java.util.regex.Pattern; 26 | 27 | import org.apache.hadoop.hive.ql.history.HiveHistory; 28 | import org.apache.hadoop.hive.ql.session.SessionState; 29 | 30 | /** 31 | * HiveHistory.parseLine has a concurrency issue: it stores its parse result 32 | * in the static member parseBuffer. Worse, parseLine is private, so it 33 | * cannot simply be overridden. This class therefore reimplements the 34 | * parsing with a per-call buffer. 35 | **/ 36 | public class HWIHiveHistory extends HiveHistory { 37 | 38 | public static final String KEY = "(\\w+)"; 39 | public static final String VALUE = "[[^\"]?]+"; // anything but a " in "" 40 | public static final String ROW_COUNT_PATTERN = "TABLE_ID_(\\d+)_ROWCOUNT"; 41 | 42 | public static final Pattern pattern = Pattern.compile(KEY + "=" + "\"" 43 | + VALUE + "\""); 44 | 45 | public HWIHiveHistory(SessionState ss) { 46 | super(ss); 47 | } 48 | 49 | public static void parseHiveHistory(String path, Listener l) 50 | throws IOException { 51 | FileInputStream fi = new FileInputStream(path); 52 | BufferedReader reader = new BufferedReader(new InputStreamReader(fi)); 53 | try { 54 | String line = null; 55 | StringBuilder buf = new StringBuilder(); 56 | while ((line = reader.readLine()) != null) { 57 | buf.append(line); 58 | // if it does not end with " then it is line continuation 59 | if (!line.trim().endsWith("\"")) { 60 | continue; 61 | } 62 | parseLine(buf.toString(), l); 63 | buf = new StringBuilder(); 64 | } 65 | } finally { 66 | try { 67 | reader.close(); 68 | } catch (IOException ex) { 69 | } 70 | } 71 | } 72 | 73 | protected static void parseLine(String line, Listener l)
throws IOException { 74 | // extract the record type 75 | int idx = line.indexOf(' '); 76 | String recType = line.substring(0, idx); 77 | String data = line.substring(idx + 1, line.length()); 78 | 79 | Matcher matcher = pattern.matcher(data); 80 | 81 | Map parseBuffer = new HashMap(); 82 | 83 | while (matcher.find()) { 84 | String tuple = matcher.group(0); 85 | String[] parts = tuple.split("="); 86 | 87 | parseBuffer.put(parts[0], 88 | parts[1].substring(1, parts[1].length() - 1)); 89 | } 90 | 91 | l.handle(RecordTypes.valueOf(recType), parseBuffer); 92 | } 93 | 94 | } 95 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/util/QueryUtil.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
15 | */ 16 | package org.apache.hadoop.hive.hwi.util; 17 | 18 | import java.util.regex.Matcher; 19 | import java.util.regex.Pattern; 20 | 21 | import org.apache.commons.logging.Log; 22 | import org.apache.commons.logging.LogFactory; 23 | import org.apache.hadoop.hive.ql.history.HiveHistory.TaskInfo; 24 | 25 | public class QueryUtil { 26 | 27 | protected static final Log l4j = LogFactory.getLog(QueryUtil.class 28 | .getName()); 29 | 30 | public static HWIHiveHistoryViewer getHiveHistoryViewer(String historyFile) { 31 | if (historyFile == null) { 32 | return null; 33 | } 34 | 35 | try { 36 | HWIHiveHistoryViewer hv = new HWIHiveHistoryViewer(historyFile); 37 | return hv; 38 | } catch (Exception e) { 39 | e.printStackTrace(); 40 | l4j.error(e.getMessage()); 41 | return null; 42 | } 43 | } 44 | 45 | public static String getJobId(HWIHiveHistoryViewer hv) { 46 | if (hv == null) { 47 | return null; 48 | } 49 | 50 | String jobId = ""; 51 | 52 | for (String taskKey : hv.getTaskInfoMap().keySet()) { 53 | TaskInfo ti = hv.getTaskInfoMap().get(taskKey); 54 | for (String tiKey : ti.hm.keySet()) { 55 | l4j.debug(tiKey + ":" + ti.hm.get(tiKey)); 56 | 57 | if (tiKey.equalsIgnoreCase("TASK_HADOOP_ID")) { 58 | String tid = ti.hm.get(tiKey); 59 | if (!jobId.contains(tid)) { 60 | jobId = jobId + tid + ";"; 61 | } 62 | } 63 | } 64 | } 65 | 66 | return jobId; 67 | } 68 | 69 | public static Integer getCpuTime(HWIHiveHistoryViewer hv) { 70 | if (hv == null) { 71 | return null; 72 | } 73 | 74 | int cpuTime = 0; 75 | 76 | Pattern pattern = Pattern 77 | .compile("Map-Reduce Framework.CPU time spent \\(ms\\):(\\d+),"); 78 | 79 | for (String taskKey : hv.getTaskInfoMap().keySet()) { 80 | TaskInfo ti = hv.getTaskInfoMap().get(taskKey); 81 | for (String tiKey : ti.hm.keySet()) { 82 | if (tiKey.equalsIgnoreCase("TASK_COUNTERS")) { 83 | l4j.debug(tiKey + ":" + ti.hm.get(tiKey)); 84 | 85 | Matcher matcher = pattern.matcher(ti.hm.get(tiKey)); 86 | if (matcher.find()) { 87 | try { 88 | cpuTime += 
Integer.parseInt(matcher.group(1)); 89 | } catch (NumberFormatException e) { 90 | l4j.error(matcher.group(1) + " is not int"); 91 | } 92 | } 93 | } 94 | } 95 | } 96 | 97 | return cpuTime; 98 | } 99 | 100 | public static String getSafeQuery(String query) { 101 | query = query.replaceAll("\r|\n", " "); 102 | return query; 103 | } 104 | 105 | } 106 | -------------------------------------------------------------------------------- /src/main/webapp/schema/table.vm: -------------------------------------------------------------------------------- 1 | #set($nav='schema') 2 | 3 |
4 | Schema 5 | / 6 | $t.getDbName() 7 | / 8 |

$t.getTableName()

9 |
10 | 11 |
12 | 13 | #set($sd = $t.getSd()) 14 | 15 |
16 |
ColsSize
17 |
$sd.getColsSize()
18 |
Input Format
19 |
$sd.getInputFormat()
20 |
Output Format
21 |
$sd.getOutputFormat()
22 |
Is Compressed?
23 |
$sd.isCompressed()
24 |
Location
25 |
$sd.getLocation()
26 |
Number Of Buckets
27 |
$sd.getNumBuckets()
28 |
29 | 30 |
31 |

Field Schema

32 |
33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 | 41 | 42 | #foreach ($fs in $sd.getCols()) 43 | 44 | 45 | 46 | 47 | 48 | #end 49 | 50 |
NameTypeComment
$fs.getName()$fs.getType()$!{fs.getComment()}
51 | 52 | 53 | 54 | 55 | 56 | 57 | 58 | 59 | #foreach ($col in $sd.getBucketCols()) 60 | 61 | 62 | 63 | #end 64 | 65 |
Bucket Columns
$col
66 | 67 |
68 |

Sort Columns

69 |
70 | 71 | 72 | 73 | 74 | 75 | 76 | 77 | 78 | #foreach ($o in $sd.getSortCols()) 79 | 80 | 81 | 82 | 83 | #end 84 | 85 |
ColumnOrder
$o.getCol()$o.getOrder()
86 | 87 |
88 |

Parameters

89 |
90 | 91 | 92 | 93 | 94 | 95 | 96 | 97 | 98 | #foreach ($key in $sd.getParameters().keySet()) 99 | 100 | 101 | 102 | 103 | #end 104 | 105 |
NameValue
$key$sd.getParameters().get($key)
106 | 107 |
108 |

SerDe Info

109 |
110 | 111 | #set($si = $sd.getSerdeInfo()) 112 | 113 |
114 |
Name
115 |
$!{si.getName()}
116 |
Lib
117 |
$!{si.getSerializationLib()}
118 |
119 | 120 |
121 | 122 | 123 | 124 | 125 | 126 | 127 | 128 | 129 | 130 | #foreach ($key in $si.getParameters().keySet()) 131 | 132 | 133 | 134 | 135 | #end 136 | 137 |
Namevalue
$key$si.getParameters().get($key)
138 | 139 |
140 |

Partition Information

141 |
142 | 143 | 144 | 145 | 146 | 147 | 148 | 149 | 150 | 151 | #foreach ($fs in $t.getPartitionKeys()) 152 | 153 | 154 | 155 | 156 | 157 | #end 158 | 159 |
Name Type Comment
$fs.getName() $fs.getType() $!{fs.getComment()}
160 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/util/HWIHiveHistoryViewer.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Licensed to the Apache Software Foundation (ASF) under one 3 | * or more contributor license agreements. See the NOTICE file 4 | * distributed with this work for additional information 5 | * regarding copyright ownership. The ASF licenses this file 6 | * to you under the Apache License, Version 2.0 (the 7 | * "License"); you may not use this file except in compliance 8 | * with the License. You may obtain a copy of the License at 9 | * 10 | * http://www.apache.org/licenses/LICENSE-2.0 11 | * 12 | * Unless required by applicable law or agreed to in writing, software 13 | * distributed under the License is distributed on an "AS IS" BASIS, 14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 15 | * See the License for the specific language governing permissions and 16 | * limitations under the License. 17 | */ 18 | 19 | package org.apache.hadoop.hive.hwi.util; 20 | 21 | import java.io.IOException; 22 | import java.util.HashMap; 23 | import java.util.Map; 24 | 25 | import org.apache.hadoop.hive.ql.history.HiveHistory.Keys; 26 | import org.apache.hadoop.hive.ql.history.HiveHistory.Listener; 27 | import org.apache.hadoop.hive.ql.history.HiveHistory.QueryInfo; 28 | import org.apache.hadoop.hive.ql.history.HiveHistory.RecordTypes; 29 | import org.apache.hadoop.hive.ql.history.HiveHistory.TaskInfo; 30 | 31 | /** 32 | * HiveHistoryViewer. 
33 | * 34 | */ 35 | public class HWIHiveHistoryViewer implements Listener { 36 | 37 | String historyFile; 38 | 39 | String sessionId; 40 | 41 | // Job Hash Map 42 | private final HashMap<String, QueryInfo> jobInfoMap = new HashMap<String, QueryInfo>(); 43 | 44 | // Task Hash Map 45 | private final HashMap<String, TaskInfo> taskInfoMap = new HashMap<String, TaskInfo>(); 46 | 47 | public HWIHiveHistoryViewer(String path) { 48 | historyFile = path; 49 | init(); 50 | } 51 | 52 | public String getSessionId() { 53 | return sessionId; 54 | } 55 | 56 | public Map<String, QueryInfo> getJobInfoMap() { 57 | return jobInfoMap; 58 | } 59 | 60 | public Map<String, TaskInfo> getTaskInfoMap() { 61 | return taskInfoMap; 62 | } 63 | 64 | /** 65 | * Parse history files. 66 | */ 67 | protected void init() { 68 | 69 | try { 70 | HWIHiveHistory.parseHiveHistory(historyFile, this); 71 | } catch (IOException e) { 72 | e.printStackTrace(); 73 | } 74 | 75 | } 76 | 77 | /** 78 | * Implements the Listener interface handle() callback. 79 | * 80 | * @see org.apache.hadoop.hive.ql.history.HiveHistory.Listener#handle(org.apache.hadoop.hive.ql.history.HiveHistory.RecordTypes, 81 | * java.util.Map) 82 | */ 83 | public void handle(RecordTypes recType, Map<String, String> values) { 84 | 85 | if (recType == RecordTypes.SessionStart) { 86 | sessionId = values.get(Keys.SESSION_ID.name()); 87 | } else if (recType == RecordTypes.QueryStart 88 | || recType == RecordTypes.QueryEnd) { 89 | String key = values.get(Keys.QUERY_ID.name()); 90 | QueryInfo ji; 91 | if (jobInfoMap.containsKey(key)) { 92 | ji = jobInfoMap.get(key); 93 | 94 | ji.hm.putAll(values); 95 | 96 | } else { 97 | ji = new QueryInfo(); 98 | ji.hm = new HashMap<String, String>(); 99 | ji.hm.putAll(values); 100 | 101 | jobInfoMap.put(key, ji); 102 | 103 | } 104 | } else if (recType == RecordTypes.TaskStart 105 | || recType == RecordTypes.TaskEnd 106 | || recType == RecordTypes.TaskProgress) { 107 | 108 | String jobid = values.get(Keys.QUERY_ID.name()); 109 | String taskid = values.get(Keys.TASK_ID.name()); 110 | String key = jobid + ":" + taskid; 111 | TaskInfo ti; 112 | if
(taskInfoMap.containsKey(key)) { 113 | ti = taskInfoMap.get(key); 114 | ti.hm.putAll(values); 115 | } else { 116 | ti = new TaskInfo(); 117 | ti.hm = new HashMap<String, String>(); 118 | ti.hm.putAll(values); 119 | taskInfoMap.put(key, ti); 120 | 121 | } 122 | 123 | } 124 | 125 | } 126 | 127 | } -------------------------------------------------------------------------------- /src/main/webapp/crontab/edit_form.vm: -------------------------------------------------------------------------------- 1 |
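The merge pattern that HWIHiveHistoryViewer.handle() above applies — records for the same query or task arrive in several events (start, progress, end), and each event's key/value pairs are merged into one map per composite id — can be sketched with plain maps. This is a simplified stand-in, not Hive's `QueryInfo`/`TaskInfo` types.

```java
import java.util.HashMap;
import java.util.Map;

public class HistoryMergeSketch {
    // one merged attribute map per "jobId:taskId" composite key
    private final Map<String, Map<String, String>> taskInfoMap =
            new HashMap<String, Map<String, String>>();

    public void handle(String jobId, String taskId, Map<String, String> values) {
        String key = jobId + ":" + taskId;      // same composite key as above
        Map<String, String> ti = taskInfoMap.get(key);
        if (ti == null) {                       // first event for this task
            ti = new HashMap<String, String>();
            taskInfoMap.put(key, ti);
        }
        ti.putAll(values);                      // later events overwrite earlier keys
    }

    public Map<String, Map<String, String>> getTaskInfoMap() {
        return taskInfoMap;
    }

    public static void main(String[] args) {
        HistoryMergeSketch viewer = new HistoryMergeSketch();

        Map<String, String> start = new HashMap<String, String>();
        start.put("TASK_ID", "task_1");
        start.put("START_TIME", "100");
        viewer.handle("job_1", "task_1", start);

        Map<String, String> end = new HashMap<String, String>();
        end.put("END_TIME", "250");
        viewer.handle("job_1", "task_1", end);

        // merged map now holds TASK_ID, START_TIME and END_TIME
        System.out.println(viewer.getTaskInfoMap().get("job_1:task_1").size()); // prints 3
    }
}
```

Because `putAll` overwrites duplicate keys, the last event to report an attribute (e.g. a task's final progress) wins, which is exactly the behavior a log replay wants.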
2 | 3 |
4 | 5 |
6 | 7 |
8 |
9 | 10 |
11 | 12 |
13 | 14 | 15 | 56 |
57 |
58 | 59 |
60 | 61 |
62 | 0 0 63 | 64 | 65 | 66 | 67 | * 68 | 69 | 83 |
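The form fields above collect a crontab-style schedule (minute, hour, day of month, month, day of week, defaulting to `0 0 * * *`). A hypothetical server-side sanity check — not part of the actual HWI code, and deliberately simpler than full cron parsing — might at least verify the field count before handing the string to the scheduler:

```java
public class CrontabFieldCheck {
    // Returns true only for a classic five-field crontab expression.
    public static boolean hasFiveFields(String crontab) {
        if (crontab == null) {
            return false;
        }
        // split on runs of whitespace; leading/trailing space is ignored
        String[] fields = crontab.trim().split("\\s+");
        return fields.length == 5;
    }

    public static void main(String[] args) {
        System.out.println(hasFiveFields("0 0 * * *")); // prints true
        System.out.println(hasFiveFields("0 0 * *"));   // prints false
    }
}
```

Note that QueryManager later feeds this string to Quartz's `CronScheduleBuilder.cronSchedule(...)`, whose cron syntax differs from classic crontab (it includes a seconds field), so any real validation would have to match whichever format the scheduler expects.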
84 | 85 |
86 | 87 | 88 |
89 | 90 |
91 |
92 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/model/MCrontab.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | package org.apache.hadoop.hive.hwi.model; 17 | 18 | import java.util.Calendar; 19 | import java.util.Date; 20 | import java.util.TimeZone; 21 | 22 | import javax.jdo.annotations.IdGeneratorStrategy; 23 | import javax.jdo.annotations.PersistenceCapable; 24 | import javax.jdo.annotations.Persistent; 25 | import javax.jdo.annotations.PrimaryKey; 26 | 27 | @PersistenceCapable 28 | public class MCrontab { 29 | 30 | public static enum Status { 31 | RUNNING, PAUSED, DELETED 32 | }; 33 | 34 | @PrimaryKey 35 | @Persistent(valueStrategy=IdGeneratorStrategy.IDENTITY) 36 | private Integer id; 37 | 38 | private String name; 39 | 40 | private String query; 41 | 42 | private String callback; 43 | 44 | private String userId; 45 | 46 | private String groupId; 47 | 48 | private String crontab; 49 | 50 | private Date created; 51 | 52 | private Date updated; 53 | 54 | private Status status; 55 | 56 | public MCrontab(String name, String query, String callback, String crontab, 57 | String userId, String groupId) { 58 | this.name = name; 59 | this.query = query; 60 | this.callback = callback; 61 | this.crontab = 
crontab; 62 | this.userId = userId; 63 | this.groupId = groupId; 64 | this.created = Calendar.getInstance(TimeZone.getDefault()).getTime(); 65 | this.updated = this.created; 66 | this.setStatus(Status.PAUSED); 67 | } 68 | 69 | public void copy(MCrontab mcrontab) { 70 | name = mcrontab.name; 71 | query = mcrontab.query; 72 | callback = mcrontab.callback; 73 | crontab = mcrontab.crontab; 74 | created = mcrontab.created; 75 | updated = mcrontab.updated; 76 | status = mcrontab.status; 77 | userId = mcrontab.userId; 78 | groupId = mcrontab.groupId; 79 | } 80 | 81 | public Integer getId() { 82 | return id; 83 | } 84 | 85 | public void setId(Integer id) { 86 | this.id = id; 87 | } 88 | 89 | public String getName() { 90 | return name; 91 | } 92 | 93 | public void setName(String name) { 94 | this.name = name; 95 | } 96 | 97 | public String getQuery() { 98 | return query; 99 | } 100 | 101 | public void setQuery(String query) { 102 | this.query = query; 103 | } 104 | 105 | public String getCallback() { 106 | return callback; 107 | } 108 | 109 | public void setCallback(String callback) { 110 | this.callback = callback; 111 | } 112 | 113 | public String getCrontab() { 114 | return crontab; 115 | } 116 | 117 | public void setCrontab(String crontab) { 118 | this.crontab = crontab; 119 | } 120 | 121 | public String getUserId() { 122 | return userId; 123 | } 124 | 125 | public void setUserId(String userId) { 126 | this.userId = userId; 127 | } 128 | 129 | public String getGroupId() { 130 | return groupId; 131 | } 132 | 133 | public void setGroupId(String groupId) { 134 | this.groupId = groupId; 135 | } 136 | 137 | public Date getCreated() { 138 | return created; 139 | } 140 | 141 | public void setCreated(Date created) { 142 | this.created = created; 143 | } 144 | 145 | public Date getUpdated() { 146 | return updated; 147 | } 148 | 149 | public void setUpdated(Date updated) { 150 | this.updated = updated; 151 | } 152 | 153 | public Status getStatus() { 154 | return status; 155 | } 
156 | 157 | public void setStatus(Status status) { 158 | this.status = status; 159 | } 160 | 161 | } 162 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/HWIServer.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Licensed to the Apache Software Foundation (ASF) under one 3 | * or more contributor license agreements. See the NOTICE file 4 | * distributed with this work for additional information 5 | * regarding copyright ownership. The ASF licenses this file 6 | * to you under the Apache License, Version 2.0 (the 7 | * "License"); you may not use this file except in compliance 8 | * with the License. You may obtain a copy of the License at 9 | * 10 | * http://www.apache.org/licenses/LICENSE-2.0 11 | * 12 | * Unless required by applicable law or agreed to in writing, software 13 | * distributed under the License is distributed on an "AS IS" BASIS, 14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 15 | * See the License for the specific language governing permissions and 16 | * limitations under the License. 17 | */ 18 | 19 | package org.apache.hadoop.hive.hwi; 20 | 21 | import java.io.File; 22 | import java.io.IOException; 23 | 24 | import org.apache.commons.logging.Log; 25 | import org.apache.commons.logging.LogFactory; 26 | import org.apache.hadoop.hive.conf.HiveConf; 27 | import org.apache.hadoop.hive.shims.JettyShims; 28 | import org.apache.hadoop.hive.shims.ShimLoader; 29 | 30 | /** 31 | * This is the entry point for HWI. A web server is invoked in the same manner 32 | * as the hive CLI. Rather than opening a command line session, a web server is 33 | * started that serves a web application for working with Hive.
34 | */ 35 | public class HWIServer { 36 | protected static final Log l4j = LogFactory.getLog(HWIServer.class 37 | .getName()); 38 | 39 | private JettyShims.Server webServer; 40 | private final String[] args; 41 | 42 | /** 43 | * 44 | * @param args 45 | * These are the command line arguments. Usually -hiveconf. 46 | * @throws java.io.IOException 47 | */ 48 | public HWIServer(String[] args) throws IOException { 49 | this.args = args; 50 | } 51 | 52 | /** 53 | * This method initializes the internal Jetty Servlet Engine. It adds the 54 | * hwi context path. 55 | * 56 | * @throws java.io.IOException 57 | * Port already in use, bad bind etc. 58 | */ 59 | public void start() throws IOException { 60 | 61 | HiveConf conf = new HiveConf(this.getClass()); 62 | 63 | String listen = null; 64 | int port = -1; 65 | 66 | listen = conf.getVar(HiveConf.ConfVars.HIVEHWILISTENHOST); 67 | port = conf.getIntVar(HiveConf.ConfVars.HIVEHWILISTENPORT); 68 | 69 | if (listen.equals("")) { 70 | l4j.warn("hive.hwi.listen.host was not specified; defaulting to 0.0.0.0"); 71 | listen = "0.0.0.0"; 72 | } 73 | if (port == -1) { 74 | l4j.warn("hive.hwi.listen.port was not specified; defaulting to 9999"); 75 | port = 9999; 76 | } 77 | 78 | String hwiWAR = conf.getVar(HiveConf.ConfVars.HIVEHWIWARFILE); 79 | String hivehome = System.getenv().get("HIVE_HOME"); 80 | File hwiWARFile = new File(hivehome, hwiWAR); 81 | if (!hwiWARFile.exists()) { 82 | l4j.fatal("HWI WAR file not found at " + hwiWARFile.toString()); 83 | System.exit(1); 84 | } 85 | 86 | webServer = ShimLoader.getJettyShims().startServer(listen, port); 87 | webServer.addWar(hwiWARFile.toString(), "/hwi"); 88 | 89 | /* 90 | * The command line args may be used by multiple components.
By setting 91 | * these as a system property we avoid having to pass them to each 92 | * component explicitly. 93 | */ 94 | StringBuilder sb = new StringBuilder(); 95 | for (String arg : args) { 96 | sb.append(arg + " "); 97 | } 98 | System.setProperty("hwi-args", sb.toString()); 99 | 100 | try { 101 | while (true) { 102 | try { 103 | webServer.start(); 104 | webServer.join(); 105 | l4j.debug(" HWI Web Server is started."); 106 | break; 107 | } catch (org.mortbay.util.MultiException ex) { 108 | throw ex; 109 | } 110 | } 111 | } catch (IOException ie) { 112 | throw ie; 113 | } catch (Exception e) { 114 | IOException ie = new IOException("Problem starting HWI server"); 115 | ie.initCause(e); 116 | l4j.error("Starting the HWI server caused an exception ", e); 117 | throw ie; 118 | } 119 | } 120 | 121 | /** 122 | * 123 | * @param args 124 | * as of now no arguments are supported 125 | * @throws java.lang.Exception 126 | * Could be thrown due to issues with Jetty or bad 127 | * configuration options 128 | * 129 | */ 130 | public static void main(String[] args) throws Exception { 131 | HWIServer hwi = new HWIServer(args); 132 | l4j.info("HWI is starting up"); 133 | hwi.start(); 134 | } 135 | 136 | /** 137 | * Shut down the running HWI Server.
138 | * 139 | * @throws Exception 140 | * Running Thread.stop() can and probably will throw this 141 | */ 142 | public void stop() throws Exception { 143 | l4j.info("HWI is shutting down"); 144 | webServer.stop(); 145 | } 146 | 147 | } 148 | -------------------------------------------------------------------------------- /src/main/resources/package.jdo: -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/servlet/Velocity.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License.
15 | */ 16 | package org.apache.hadoop.hive.hwi.servlet; 17 | 18 | import java.io.IOException; 19 | import java.io.StringWriter; 20 | import java.io.Writer; 21 | 22 | import javax.servlet.ServletConfig; 23 | import javax.servlet.http.HttpServletRequest; 24 | 25 | import org.apache.velocity.Template; 26 | import org.apache.velocity.tools.view.VelocityView; 27 | import org.apache.velocity.context.Context; 28 | import org.apache.velocity.exception.MethodInvocationException; 29 | 30 | public class Velocity extends VelocityView{ 31 | 32 | /** 33 | * The velocity.properties key for specifying the servlet's error template. 34 | */ 35 | public static final String PROPERTY_ERROR_TEMPLATE = "tools.view.servlet.error.template"; 36 | 37 | /** 38 | * The velocity.properties key for specifying the relative directory holding 39 | * layout templates. 40 | */ 41 | public static final String PROPERTY_LAYOUT_DIR = "tools.view.servlet.layout.directory"; 42 | 43 | /** 44 | * The velocity.properties key for specifying the servlet's default layout 45 | * template's filename. 46 | */ 47 | public static final String PROPERTY_DEFAULT_LAYOUT = "tools.view.servlet.layout.default.template"; 48 | 49 | /** 50 | * The default error template's filename. 51 | */ 52 | public static final String DEFAULT_ERROR_TEMPLATE = "Error.vm"; 53 | 54 | /** 55 | * The default layout directory 56 | */ 57 | public static final String DEFAULT_LAYOUT_DIR = "layout/"; 58 | 59 | /** 60 | * The default filename for the servlet's default layout 61 | */ 62 | public static final String DEFAULT_DEFAULT_LAYOUT = "Default.vm"; 63 | 64 | /** 65 | * The context key that will hold the content of the screen. 66 | * 67 | * This key ($screen_content) must be present in the layout template for the 68 | * current screen to be rendered. 
69 | */ 70 | public static final String KEY_SCREEN_CONTENT = "screen_content"; 71 | 72 | /** 73 | * The context/parameter key used to specify an alternate layout to be used 74 | * for a request instead of the default layout. 75 | */ 76 | public static final String KEY_LAYOUT = "layout"; 77 | 78 | /** 79 | * The context key that holds the {@link Throwable} that broke the rendering 80 | * of the requested screen. 81 | */ 82 | public static final String KEY_ERROR_CAUSE = "error_cause"; 83 | 84 | /** 85 | * The context key that holds the stack trace of the error that broke the 86 | * rendering of the requested screen. 87 | */ 88 | public static final String KEY_ERROR_STACKTRACE = "stack_trace"; 89 | 90 | /** 91 | * The context key that holds the {@link MethodInvocationException} that 92 | * broke the rendering of the requested screen. 93 | * 94 | * If this value is placed in the context, then $error_cause will hold the 95 | * error that this invocation exception is wrapping. 96 | */ 97 | public static final String KEY_ERROR_INVOCATION_EXCEPTION = "invocation_exception"; 98 | 99 | protected String errorTemplate; 100 | protected String layoutDir; 101 | protected String defaultLayout; 102 | 103 | public Velocity(ServletConfig config) { 104 | super(config); 105 | 106 | // check for default template path overrides 107 | errorTemplate = getProperty(PROPERTY_ERROR_TEMPLATE, 108 | DEFAULT_ERROR_TEMPLATE); 109 | layoutDir = getProperty(PROPERTY_LAYOUT_DIR, DEFAULT_LAYOUT_DIR); 110 | defaultLayout = getProperty(PROPERTY_DEFAULT_LAYOUT, 111 | DEFAULT_DEFAULT_LAYOUT); 112 | 113 | // preventive error checking! 
directory must end in / 114 | if (!layoutDir.endsWith("/")) { 115 | layoutDir += '/'; 116 | } 117 | 118 | // for efficiency's sake, make defaultLayout a full path now 119 | defaultLayout = layoutDir + defaultLayout; 120 | } 121 | 122 | public void render(String path, HttpServletRequest request, Writer writer) 123 | throws IOException { 124 | 125 | // then get a context 126 | Context context = createContext(request, null); 127 | 128 | // get the template 129 | Template template = getTemplate(path); 130 | 131 | // merge the template and context into the response 132 | mergeTemplate(template, context, writer); 133 | } 134 | 135 | /** 136 | * Overrides VelocityViewServlet.mergeTemplate to do a two-pass render for 137 | * handling layouts 138 | */ 139 | protected void mergeTemplate(Template template, Context context, 140 | Writer writer) throws IOException { 141 | // 142 | // this section is based on Tim Colson's "two pass render" 143 | // 144 | // Render the screen content 145 | StringWriter sw = new StringWriter(); 146 | template.merge(context, sw); 147 | // Add the resulting content to the context 148 | context.put(KEY_SCREEN_CONTENT, sw.toString()); 149 | 150 | //Get layout 151 | Object obj = context.get(KEY_LAYOUT); 152 | String layout = (obj == null) ? 
null : obj.toString(); 153 | if (layout == null) { 154 | layout = defaultLayout; 155 | } else { 156 | layout = layoutDir + layout; 157 | } 158 | 159 | //Render layout 160 | try { 161 | template = getTemplate(layout); 162 | } catch (Exception e) { 163 | if (!layout.equals(defaultLayout)) { 164 | template = getTemplate(defaultLayout); 165 | } 166 | } 167 | merge(template, context, writer); 168 | } 169 | 170 | public boolean templateExists(String name){ 171 | return velocity.resourceExists(name); 172 | } 173 | 174 | } 175 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/model/MQuery.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
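The "two pass render" that Velocity.mergeTemplate() above implements — pass 1 renders the screen template to a string, pass 2 renders the layout template with that string exposed as `$screen_content` — can be sketched without the Velocity engine. Template rendering is faked here with a map-based string substitution so only the two-pass flow is shown; this is not the Velocity API.

```java
import java.util.HashMap;
import java.util.Map;

public class TwoPassRenderSketch {
    // toy "template engine": replace each $key with its context value
    static String render(String template, Map<String, String> context) {
        String out = template;
        for (Map.Entry<String, String> e : context.entrySet()) {
            out = out.replace("$" + e.getKey(), e.getValue());
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> context = new HashMap<String, String>();
        context.put("title", "Queries");

        // pass 1: render the screen into a string
        String screen = render("<h1>$title</h1>", context);

        // pass 2: expose the result as $screen_content and render the layout
        context.put("screen_content", screen);
        String page = render("<html><body>$screen_content</body></html>", context);

        System.out.println(page); // prints <html><body><h1>Queries</h1></body></html>
    }
}
```

The design payoff is that every screen template stays layout-agnostic: the layout decides where `$screen_content` lands, and a per-request `$layout` key (as in `KEY_LAYOUT` above) can swap the outer template without touching any screen.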
15 | */ 16 | package org.apache.hadoop.hive.hwi.model; 17 | 18 | import java.util.Calendar; 19 | import java.util.Date; 20 | import java.util.TimeZone; 21 | 22 | import javax.jdo.annotations.IdGeneratorStrategy; 23 | import javax.jdo.annotations.PersistenceCapable; 24 | import javax.jdo.annotations.Persistent; 25 | import javax.jdo.annotations.PrimaryKey; 26 | 27 | @PersistenceCapable 28 | public class MQuery { 29 | 30 | public static enum Status { 31 | INITED, RUNNING, FINISHED, CANCELLED, FAILED, SYNTAXERROR 32 | }; 33 | 34 | @PrimaryKey 35 | @Persistent(valueStrategy=IdGeneratorStrategy.IDENTITY) 36 | private Integer id; 37 | 38 | private String name; 39 | 40 | private String query; 41 | 42 | private String resultLocation; 43 | 44 | private Status status; 45 | 46 | private String errorMsg; 47 | 48 | private Integer errorCode; 49 | 50 | private String callback; 51 | 52 | private String jobId; 53 | 54 | private String userId; 55 | 56 | private String groupId; 57 | 58 | private Integer crontabId; 59 | 60 | private Date created; 61 | 62 | private Date updated; 63 | 64 | private Integer cpuTime; 65 | 66 | private Integer totalTime; 67 | 68 | public MQuery(String name, String query, String callback, String userId, 69 | String groupId) { 70 | this.name = name; 71 | this.query = query; 72 | this.callback = callback; 73 | this.resultLocation = ""; 74 | this.created = Calendar.getInstance(TimeZone.getDefault()).getTime(); 75 | this.updated = this.created; 76 | this.status = Status.INITED; 77 | this.userId = userId; 78 | this.groupId = groupId; 79 | } 80 | 81 | public void copy(MQuery mquery) { 82 | name = mquery.name; 83 | query = mquery.query; 84 | callback = mquery.callback; 85 | resultLocation = mquery.resultLocation; 86 | errorMsg = mquery.errorMsg; 87 | errorCode = mquery.errorCode; 88 | created = mquery.created; 89 | updated = mquery.updated; 90 | status = mquery.status; 91 | userId = mquery.userId; 92 | groupId = mquery.groupId; 93 | jobId = mquery.jobId; 94 | 
crontabId = mquery.crontabId; 95 | cpuTime = mquery.cpuTime; 96 | totalTime = mquery.totalTime; 97 | } 98 | 99 | public String getName() { 100 | return name; 101 | } 102 | 103 | public void setName(String name) { 104 | this.name = name; 105 | } 106 | 107 | public String getQuery() { 108 | return query; 109 | } 110 | 111 | public void setQuery(String query) { 112 | this.query = query; 113 | } 114 | 115 | public String getResultLocation() { 116 | return resultLocation; 117 | } 118 | 119 | public void setResultLocation(String resultLocation) { 120 | this.resultLocation = resultLocation; 121 | } 122 | 123 | public Status getStatus() { 124 | return status; 125 | } 126 | 127 | public void setStatus(Status status) { 128 | this.status = status; 129 | } 130 | 131 | public String getErrorMsg() { 132 | return errorMsg; 133 | } 134 | 135 | public void setErrorMsg(String errorMsg) { 136 | this.errorMsg = errorMsg; 137 | } 138 | 139 | public Integer getErrorCode() { 140 | return errorCode; 141 | } 142 | 143 | public void setErrorCode(Integer errorCode) { 144 | this.errorCode = errorCode; 145 | } 146 | 147 | public String getCallback() { 148 | return callback; 149 | } 150 | 151 | public void setCallback(String callback) { 152 | this.callback = callback; 153 | } 154 | 155 | public String getUserId() { 156 | return userId; 157 | } 158 | 159 | public void setUserId(String userId) { 160 | this.userId = userId; 161 | } 162 | 163 | public String getGroupId() { 164 | return groupId; 165 | } 166 | 167 | public void setGroupId(String groupId) { 168 | this.groupId = groupId; 169 | } 170 | 171 | public Integer getCrontabId() { 172 | return crontabId; 173 | } 174 | 175 | public void setCrontabId(Integer crontabId) { 176 | this.crontabId = crontabId; 177 | } 178 | 179 | public Date getCreated() { 180 | return created; 181 | } 182 | 183 | public void setCreated(Date created) { 184 | this.created = created; 185 | } 186 | 187 | public Date getUpdated() { 188 | return updated; 189 | } 190 | 191 | 
public void setUpdated(Date updated) { 192 | this.updated = updated; 193 | } 194 | 195 | public Integer getId() { 196 | return id; 197 | } 198 | 199 | public void setId(Integer id) { 200 | this.id = id; 201 | } 202 | 203 | public String getJobId() { 204 | return jobId; 205 | } 206 | 207 | public void setJobId(String jobId) { 208 | this.jobId = jobId; 209 | } 210 | 211 | public Integer getCpuTime() { 212 | return cpuTime; 213 | } 214 | 215 | public void setCpuTime(Integer cpuTime) { 216 | this.cpuTime = cpuTime; 217 | } 218 | 219 | public Integer getTotalTime() { 220 | return totalTime; 221 | } 222 | 223 | public void setTotalTime(Integer totalTime) { 224 | this.totalTime = totalTime; 225 | } 226 | 227 | } 228 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/query/QueryManager.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
15 | */ 16 | package org.apache.hadoop.hive.hwi.query; 17 | 18 | import java.util.List; 19 | import java.util.Properties; 20 | 21 | import org.apache.hadoop.hive.hwi.model.MCrontab; 22 | import org.apache.hadoop.hive.hwi.model.MQuery; 23 | import org.apache.hadoop.hive.hwi.query.RunningRunner.Running; 24 | import org.quartz.CronScheduleBuilder; 25 | import org.quartz.JobBuilder; 26 | import org.quartz.JobDataMap; 27 | import org.quartz.JobDetail; 28 | import org.quartz.JobKey; 29 | import org.quartz.Scheduler; 30 | import org.quartz.SchedulerException; 31 | import org.quartz.Trigger; 32 | import org.quartz.TriggerBuilder; 33 | import org.quartz.impl.StdSchedulerFactory; 34 | 35 | import org.apache.commons.logging.Log; 36 | import org.apache.commons.logging.LogFactory; 37 | 38 | public class QueryManager { 39 | protected static final Log l4j = LogFactory.getLog(QueryManager.class 40 | .getName()); 41 | 42 | private static QueryManager instance; 43 | 44 | private Scheduler scheduler; 45 | private RunningRunner runner; 46 | 47 | private QueryManager() { 48 | } 49 | 50 | public static QueryManager getInstance() { 51 | if (instance == null) { 52 | synchronized (QueryManager.class) { 53 | if (instance == null) { 54 | instance = new QueryManager(); 55 | } 56 | } 57 | } 58 | return instance; 59 | } 60 | 61 | public void start() { 62 | try { 63 | l4j.info("QueryManager starting."); 64 | startScheduler(); 65 | loadCrontabs(); 66 | startRunner(); 67 | l4j.info("QueryManager started."); 68 | } catch (SchedulerException e) { 69 | e.printStackTrace(); 70 | l4j.error("QueryManager failed to start."); 71 | } 72 | } 73 | 74 | protected void startScheduler() throws SchedulerException { 75 | Properties props = new Properties(); 76 | props.setProperty("org.quartz.threadPool.threadCount", "30"); 77 | 78 | StdSchedulerFactory ssf = new StdSchedulerFactory(props); 79 | scheduler = ssf.getScheduler(); 80 | scheduler.start(); 81 | } 82 | 83 | protected void loadCrontabs() { 84 | List 
crontabs = QueryStore.getInstance().runningCrontabs(); 85 | for (MCrontab crontab : crontabs) { 86 | schedule(crontab); 87 | } 88 | l4j.info("Crontabs loaded."); 89 | } 90 | 91 | protected void startRunner() { 92 | runner = new RunningRunner(); 93 | runner.start(); 94 | } 95 | 96 | public boolean submit(MQuery mquery) { 97 | if (scheduler == null) 98 | return false; 99 | 100 | if (mquery.getId() == null) 101 | return false; 102 | 103 | JobDetail job = JobBuilder.newJob(QueryRunner.class) 104 | .withIdentity(mquery.getId().toString(), mquery.getUserId()) 105 | .build(); 106 | 107 | JobDataMap map = job.getJobDataMap(); 108 | map.put("mqueryId", mquery.getId()); 109 | 110 | Trigger trigger = TriggerBuilder.newTrigger() 111 | .withIdentity(mquery.getId().toString(), mquery.getUserId()) 112 | .startNow().build(); 113 | 114 | try { 115 | scheduler.scheduleJob(job, trigger); 116 | } catch (SchedulerException e) { 117 | e.printStackTrace(); 118 | l4j.error("QueryManager failed to schedule."); 119 | return false; 120 | } 121 | 122 | return true; 123 | } 124 | 125 | public boolean schedule(MCrontab crontab) { 126 | if (scheduler == null) 127 | return false; 128 | 129 | if (crontab.getId() == null) 130 | return false; 131 | 132 | JobDetail job = JobBuilder.newJob(QueryCrontab.class) 133 | .withIdentity(crontab.getId().toString(), crontab.getUserId()) 134 | .build(); 135 | 136 | JobDataMap map = job.getJobDataMap(); 137 | map.put("crontabId", crontab.getId()); 138 | 139 | Trigger trigger = TriggerBuilder 140 | .newTrigger() 141 | .withIdentity(crontab.getId().toString(), crontab.getUserId()) 142 | .withSchedule( 143 | CronScheduleBuilder.cronSchedule(crontab.getCrontab())) 144 | .build(); 145 | 146 | try { 147 | scheduler.scheduleJob(job, trigger); 148 | } catch (SchedulerException e) { 149 | e.printStackTrace(); 150 | l4j.error("QueryManager failed to schedule."); 151 | return false; 152 | } 153 | 154 | return true; 155 | } 156 | 157 | public boolean unschedule(MCrontab 
crontab) { 158 | if (scheduler == null) 159 | return false; 160 | 161 | if (crontab.getId() == null) 162 | return false; 163 | 164 | try { 165 | scheduler.deleteJob(new JobKey(crontab.getId().toString(), crontab 166 | .getUserId())); 167 | return true; 168 | } catch (SchedulerException e) { 169 | e.printStackTrace(); 170 | l4j.error("QueryManager unschedule failed."); 171 | return false; 172 | } 173 | } 174 | 175 | public boolean monitor(Running running) { 176 | if (runner == null) 177 | return false; 178 | 179 | runner.add(running); 180 | return true; 181 | } 182 | 183 | public void shutdown() { 184 | l4j.info("QueryManager shutting down."); 185 | 186 | if (scheduler != null) { 187 | try { 188 | scheduler.shutdown(); 189 | } catch (SchedulerException e) { 190 | e.printStackTrace(); 191 | l4j.error("scheduler failed to shutdown."); 192 | } 193 | } 194 | 195 | if (runner != null) 196 | runner.shutdown(); 197 | 198 | l4j.info("QueryManager shutdown complete."); 199 | } 200 | 201 | } 202 | -------------------------------------------------------------------------------- /src/test/java/org/apache/hadoop/hive/hwi/TestEverything.java: -------------------------------------------------------------------------------- 1 | package org.apache.hadoop.hive.hwi; 2 | 3 | import java.io.BufferedReader; 4 | import java.io.FileReader; 5 | import java.io.IOException; 6 | import java.io.InputStreamReader; 7 | import java.io.PrintWriter; 8 | import java.text.SimpleDateFormat; 9 | import java.util.ArrayList; 10 | import java.util.Calendar; 11 | import java.util.Date; 12 | import java.util.HashMap; 13 | import java.util.Locale; 14 | import java.util.Properties; 15 | 16 | import javax.jdo.PersistenceManager; 17 | import javax.jdo.PersistenceManagerFactory; 18 | import javax.jdo.Query; 19 | import javax.jdo.Transaction; 20 | 21 | import org.apache.hadoop.conf.Configuration; 22 | import org.apache.hadoop.hive.conf.HiveConf; 23 | import org.apache.hadoop.hive.hwi.model.MCrontab; 24 | import 
org.apache.hadoop.hive.hwi.model.MQuery; 25 | import org.apache.hadoop.hive.hwi.model.Pagination; 26 | import org.apache.hadoop.hive.hwi.query.QueryManager; 27 | import org.apache.hadoop.hive.hwi.query.QueryStore; 28 | import org.apache.hadoop.hive.hwi.util.HadoopUtil; 29 | import org.apache.hadoop.hive.hwi.util.QueryUtil; 30 | import org.apache.hadoop.hive.ql.session.SessionState; 31 | 32 | import java.net.URI; 33 | 34 | import javax.ws.rs.core.MultivaluedMap; 35 | import javax.ws.rs.core.UriBuilder; 36 | import com.sun.jersey.api.client.Client; 37 | import com.sun.jersey.api.client.ClientResponse; 38 | import com.sun.jersey.api.client.WebResource; 39 | import com.sun.jersey.core.util.MultivaluedMapImpl; 40 | 41 | public class TestEverything { 42 | 43 | public static void testAPI() { 44 | Client client = Client.create(); 45 | 46 | WebResource webResource = client .resource(getBaseURI()); 47 | 48 | MultivaluedMap formData = new MultivaluedMapImpl(); 49 | formData.add("name", "wanghuida"); 50 | formData.add("query", "select count(1) from pokes"); 51 | formData.add("callback", "http://localhost/abc.php"); 52 | ClientResponse response = webResource.path("queries").path("create").path("api").accept("application/json") 53 | .post(ClientResponse.class, formData); 54 | 55 | if (response.getStatus() != 200) { 56 | throw new RuntimeException("Failed : HTTP error code : " + response.getStatus()); 57 | } 58 | 59 | String output = response.getEntity(String.class); 60 | 61 | System.out.println("Output from Server .... 
\n"); 62 | System.out.println(output); 63 | } 64 | 65 | private static URI getBaseURI() { 66 | return UriBuilder.fromUri("http://localhost:9999/hwi").build(); 67 | } 68 | 69 | public static void testHistoryFile() throws IOException { 70 | QueryUtil 71 | .getHiveHistoryViewer("/tmp/hadoop/hive_job_log_hadoop_201212280953_1889961092.txt"); 72 | } 73 | 74 | public static String readFile(String path) throws IOException { 75 | @SuppressWarnings("resource") 76 | BufferedReader br = new BufferedReader(new FileReader(path)); 77 | StringBuffer str = new StringBuffer(); 78 | String line = br.readLine(); 79 | while (line != null) { 80 | str.append(line); 81 | str.append("\n"); 82 | line = br.readLine(); 83 | } 84 | 85 | return str.toString(); 86 | } 87 | 88 | public static void testGetJobTrackerURL() { 89 | System.out.println(HadoopUtil.getJobTrackerURL("aa")); 90 | } 91 | 92 | public static void testGetDataNodeURL() { 93 | System.out.println(HadoopUtil.getDataNodeURL("/tmp")); 94 | } 95 | 96 | public static void testHiveConf() { 97 | HiveConf conf = new HiveConf(SessionState.class); 98 | Properties p = conf.getAllProperties(); 99 | for (Object k : p.keySet()) { 100 | System.out.print((String) k + ","); 101 | System.out.println(p.get(k)); 102 | } 103 | } 104 | 105 | public static void testHadoopConf() throws IOException { 106 | Configuration conf = new Configuration(); 107 | conf.addResource("hdfs-default.xml"); 108 | conf.addResource("hdfs-site.xml"); 109 | Configuration.dumpConfiguration(conf, new PrintWriter(System.out)); 110 | } 111 | 112 | public static void testSlaves() throws IOException { 113 | ClassLoader classLoader = Thread.currentThread() 114 | .getContextClassLoader(); 115 | BufferedReader in = new BufferedReader(new InputStreamReader( 116 | classLoader.getResource("slaves").openStream())); 117 | String line = in.readLine(); 118 | System.out.println(line); 119 | } 120 | 121 | public static void testQueryCron() throws InterruptedException { 122 | 
QueryManager.getInstance(); 123 | MCrontab ct = new MCrontab("test-query", "select * from test", "", 124 | "*/10 * * * * ?", "hadoop", "hadoop"); 125 | QueryStore.getInstance().insertCrontab(ct); 126 | QueryManager.getInstance().schedule(ct); 127 | Thread.sleep(30000); 128 | QueryManager.getInstance().shutdown(); 129 | QueryManager.getInstance().shutdown(); 130 | } 131 | 132 | public static void testInt() { 133 | System.out.println(new Integer(10).toString()); 134 | } 135 | 136 | public static void testQuery() { 137 | QueryStore qs = QueryStore.getInstance(); 138 | Query query = qs.getPM().newQuery(MQuery.class); 139 | query.setOrdering("id DESC"); 140 | HashMap map = new HashMap(); 141 | map.put("crontabId", 3); 142 | query.setResult("COUNT(id)"); 143 | query.executeWithMap(null); 144 | // Pagination pagination = qs.paginate(query, map, 1, 2); 145 | // System.out.println(pagination.getTotal()); 146 | } 147 | 148 | public static void testCalendar() { 149 | Calendar c = Calendar.getInstance(); 150 | System.out.println(c.getDisplayName(Calendar.MONTH, Calendar.SHORT, 151 | Locale.CHINA)); 152 | } 153 | 154 | public static void testDate() { 155 | Date d = new Date(); 156 | SimpleDateFormat ft = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss"); 157 | System.out.println(ft.format(d)); 158 | } 159 | 160 | public static void testConcurrent() { 161 | ArrayList l = new ArrayList(); 162 | l.add("1"); 163 | for (String s : l) { 164 | l.remove(s); 165 | } 166 | } 167 | 168 | public static void testPersistenceManager() { 169 | PersistenceManagerFactory pmf = QueryStore.getInstance().getPMF(); 170 | 171 | PersistenceManager pm1 = pmf.getPersistenceManager(); 172 | System.out.println(pm1); 173 | 174 | MQuery mquery = pm1.getObjectById(MQuery.class, 204); 175 | mquery.setCallback("1"); 176 | 177 | Transaction tx1 = pm1.currentTransaction(); 178 | tx1.begin(); 179 | pm1.makePersistent(mquery); 180 | // tx1.commit(); 181 | 182 | PersistenceManager pm2 = pmf.getPersistenceManager(); 183 | 
System.out.println(pm2); 184 | 185 | mquery.setCallback("2"); 186 | MQuery mquery1 = pm2.getObjectById(MQuery.class, mquery.getId()); 187 | mquery1.copy(mquery); 188 | Transaction tx2 = pm2.currentTransaction(); 189 | tx2.begin(); 190 | pm2.makePersistent(mquery1); 191 | tx2.commit(); 192 | 193 | PersistenceManager pm3 = pmf.getPersistenceManager(); 194 | System.out.println(pm3); 195 | Query query = pm3.newQuery(MQuery.class); 196 | query.setOrdering("id DESC"); 197 | Pagination p = new Pagination(query, null, 1, 1); 198 | System.out.println(p.getItems().get(0).getId()); 199 | } 200 | 201 | public static void testValidateQuery() { 202 | try { 203 | // QueryUtil.validateQuery("add jar 1.jar;select * from test;select * from xx;"); 204 | } catch (Exception e) { 205 | e.printStackTrace(); 206 | } 207 | } 208 | 209 | public static void testStartStop() { 210 | QueryManager.getInstance().start(); 211 | QueryManager.getInstance().shutdown(); 212 | } 213 | 214 | public static void testProp() { 215 | Properties props = new Properties(); 216 | 217 | props.put("org.quartz.threadPool.threadCount", "20"); 218 | System.out.println(props 219 | .getProperty("org.quartz.threadPool.threadCount")); 220 | props.setProperty("org.quartz.threadPool.threadCount", "20"); 221 | System.out.println(props 222 | .getProperty("org.quartz.threadPool.threadCount")); 223 | } 224 | 225 | public static void main(String[] args) throws Exception { 226 | //testProp(); 227 | testAPI(); 228 | } 229 | } 230 | -------------------------------------------------------------------------------- /pom.xml: -------------------------------------------------------------------------------- 1 | 3 | 4.0.0 4 | 5 | org.apache.hive 6 | hive-hwi 7 | 1.0 8 | jar 9 | 10 | hwi 11 | http://maven.apache.org 12 | 13 | 14 | UTF-8 15 | 16 | 17 | 18 | 19 | junit 20 | junit 21 | 3.8.1 22 | test 23 | 24 | 25 | commons-logging 26 | commons-logging 27 | 1.1.1 28 | provided 29 | 30 | 31 | org.apache.hive 32 | hive-builtins 33 | 0.8.1 34 | 
provided 35 | 36 | 37 | org.apache.hive 38 | hive-common 39 | 0.8.1 40 | provided 41 | 42 | 43 | org.apache.hadoop 44 | hadoop-core 45 | 46 | 47 | 48 | 49 | org.apache.hive 50 | hive-exec 51 | 0.8.1 52 | provided 53 | 54 | 55 | javax.jdo 56 | jdo2-api 57 | 58 | 59 | org.apache.hadoop 60 | hadoop-core 61 | 62 | 63 | 64 | 65 | mysql 66 | mysql-connector-java 67 | 5.1.22 68 | provided 69 | 70 | 71 | javax.jdo 72 | jdo2-api 73 | 2.3-eb 74 | provided 75 | 76 | 77 | org.apache.hadoop 78 | hadoop-core 79 | 1.0.1 80 | provided 81 | 82 | 83 | org.apache.velocity 84 | velocity 85 | 1.7 86 | 87 | 88 | org.apache.velocity 89 | velocity-tools 90 | 2.0 91 | 92 | 93 | com.sun.jersey 94 | jersey-server 95 | 1.16 96 | 97 | 98 | com.sun.jersey 99 | jersey-servlet 100 | 1.16 101 | 102 | 103 | org.quartz-scheduler 104 | quartz 105 | 2.1.6 106 | 107 | 108 | com.sun.jersey 109 | jersey-client 110 | 1.16 111 | 112 | 113 | com.google.code.gson 114 | gson 115 | 2.2.2 116 | 117 | 118 | 119 | 120 | 121 | 122 | 123 | org.apache.maven.plugins 124 | maven-compiler-plugin 125 | 126 | 1.6 127 | 1.6 128 | 129 | 130 | 131 | 132 | org.datanucleus 133 | maven-datanucleus-plugin 134 | 2.0.1 135 | 136 | 140 | src/main/resources/ 141 | 142 | 143 | 144 | process-classes 145 | 146 | enhance 147 | 148 | 149 | 150 | 151 | 152 | javax.jdo 153 | jdo2-api 154 | 2.3-eb 155 | 156 | 157 | 158 | 159 | 160 | maven-war-plugin 161 | 2.3 162 | 163 | true 164 | true 165 | WEB-INF/lib/*.jar 166 | 168 | 169 | 170 | 171 | 172 | org.apache.maven.plugins 173 | maven-shade-plugin 174 | 1.2.2 175 | 176 | 177 | package 178 | 179 | shade 180 | 181 | 182 | 183 | 184 | junit:junit 185 | 186 | 187 | 188 | 189 | 190 | 191 | 192 | 193 | com.mycila.maven-license-plugin 194 | maven-license-plugin 195 | 196 |
header.txt
197 | 198 | src/main/java/** 199 | 200 | 201 | src/main/java/org/apache/hadoop/hive/hwi/HWIContextListener.java 202 | src/main/java/org/apache/hadoop/hive/hwi/HWIException.java 203 | src/main/java/org/apache/hadoop/hive/hwi/HWIServer.java 204 | src/main/java/org/apache/hadoop/hive/hwi/util/HWIHiveHistoryViewer.java 205 | 206 |
207 |
208 | 209 | 210 | 230 | 231 |
232 | 233 | 234 | 235 | ${basedir}/src/main/resources 236 | true 237 | 238 | hive-site.xml 239 | 240 | 241 | 242 | 243 |
244 | 245 |
246 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/query/QueryStore.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | package org.apache.hadoop.hive.hwi.query; 17 | 18 | import java.util.HashMap; 19 | import java.util.Iterator; 20 | import java.util.List; 21 | import java.util.Map; 22 | import java.util.Map.Entry; 23 | import java.util.Properties; 24 | 25 | import javax.jdo.JDOHelper; 26 | import javax.jdo.PersistenceManager; 27 | import javax.jdo.PersistenceManagerFactory; 28 | import javax.jdo.Query; 29 | import javax.jdo.Transaction; 30 | 31 | import org.apache.commons.logging.Log; 32 | import org.apache.commons.logging.LogFactory; 33 | import org.apache.hadoop.conf.Configuration; 34 | import org.apache.hadoop.hive.conf.HiveConf; 35 | import org.apache.hadoop.hive.hwi.model.MCrontab; 36 | import org.apache.hadoop.hive.hwi.model.MQuery; 37 | import org.apache.hadoop.hive.hwi.model.Pagination; 38 | import org.apache.hadoop.hive.ql.session.SessionState; 39 | 40 | public class QueryStore { 41 | private static final Log l4j = LogFactory 42 | .getLog(QueryStore.class.getName()); 43 | 44 | private static QueryStore instance; 45 | 46 | private PersistenceManagerFactory pmf; 47 | 48 | private ThreadLocal pm = new 
ThreadLocal(); 49 | 50 | private QueryStore() { 51 | HiveConf hiveConf = new HiveConf(SessionState.class); 52 | Properties props = getDataSourceProps(hiveConf); 53 | pmf = JDOHelper.getPersistenceManagerFactory(props); 54 | } 55 | 56 | public static QueryStore getInstance() { 57 | if (instance == null) { 58 | synchronized (QueryStore.class) { 59 | if (instance == null) { 60 | instance = new QueryStore(); 61 | } 62 | } 63 | } 64 | return instance; 65 | } 66 | 67 | 68 | /** 69 | * Properties specified in hive-default.xml override the properties 70 | * specified in jpox.properties. 71 | */ 72 | private static Properties getDataSourceProps(Configuration conf) { 73 | Properties prop = new Properties(); 74 | 75 | Iterator> iter = conf.iterator(); 76 | while (iter.hasNext()) { 77 | Map.Entry e = iter.next(); 78 | if (e.getKey().contains("datanucleus") 79 | || e.getKey().contains("jdo")) { 80 | Object prevVal = prop.setProperty(e.getKey(), 81 | conf.get(e.getKey())); 82 | if (l4j.isDebugEnabled() 83 | && !e.getKey().equals( 84 | HiveConf.ConfVars.METASTOREPWD.varname)) { 85 | l4j.debug("Overriding " + e.getKey() + " value " + prevVal 86 | + " from jpox.properties with " + e.getValue()); 87 | } 88 | } 89 | } 90 | 91 | if (l4j.isDebugEnabled()) { 92 | for (Entry e : prop.entrySet()) { 93 | if (!e.getKey().equals(HiveConf.ConfVars.METASTOREPWD.varname)) { 94 | l4j.debug(e.getKey() + " = " + e.getValue()); 95 | } 96 | } 97 | } 98 | return prop; 99 | } 100 | 101 | public PersistenceManagerFactory getPMF() { 102 | return pmf; 103 | } 104 | 105 | public PersistenceManager getPM() { 106 | if (pm.get() == null || pm.get().isClosed()) { 107 | pm.set(getPMF().getPersistenceManager()); 108 | } 109 | return pm.get(); 110 | } 111 | 112 | public void shutdown() { 113 | if (pm.get() != null) { 114 | pm.get().close(); 115 | } 116 | } 117 | 118 | /** 119 | * new query 120 | * 121 | * @param mquery 122 | */ 123 | public void insertQuery(MQuery mquery) { 124 | Transaction tx = 
getPM().currentTransaction(); 125 | 126 | try { 127 | tx.begin(); 128 | } catch (Exception e) { 129 | e.printStackTrace(); 130 | return; 131 | } 132 | 133 | try { 134 | getPM().makePersistent(mquery); 135 | } catch (Exception e) { 136 | e.printStackTrace(); 137 | } 138 | 139 | try { 140 | tx.commit(); 141 | } catch (Exception e) { 142 | e.printStackTrace(); 143 | tx.rollback(); 144 | } 145 | } 146 | 147 | /** 148 | * 149 | * @param mquery 150 | */ 151 | public void copyAndUpdateQuery(MQuery mquery) { 152 | 153 | try { 154 | l4j.debug("mquery classloader " + mquery.getClass().getClassLoader()); 155 | 156 | Transaction tx = getPM().currentTransaction(); 157 | MQuery query = getPM().getObjectById(MQuery.class, mquery.getId()); 158 | 159 | l4j.debug("query classloader " + query.getClass().getClassLoader()); 160 | 161 | query.copy(mquery); 162 | 163 | try { 164 | tx.begin(); 165 | } catch (Exception e) { 166 | e.printStackTrace(); 167 | return; 168 | } 169 | 170 | try { 171 | getPM().makePersistent(query); 172 | } catch (Exception e) { 173 | e.printStackTrace(); 174 | } 175 | 176 | try { 177 | tx.commit(); 178 | } catch (Exception e) { 179 | tx.rollback(); 180 | } 181 | } catch (Exception e) { 182 | e.printStackTrace(); 183 | } 184 | } 185 | 186 | public void updateQuery(MQuery mquery) { 187 | Transaction tx = null; 188 | try { 189 | tx = getPM().currentTransaction(); 190 | tx.begin(); 191 | } catch (Exception e) { 192 | e.printStackTrace(); 193 | return ; 194 | } 195 | 196 | try { 197 | getPM().makePersistent(mquery); 198 | } catch (Exception e) { 199 | e.printStackTrace(); 200 | } 201 | 202 | try { 203 | tx.commit(); 204 | } catch (Exception e) { 205 | tx.rollback(); 206 | } 207 | } 208 | 209 | /** 210 | * paginate by page and pageSize 211 | * 212 | * @param page 213 | * @param pageSize 214 | * @return 215 | */ 216 | public Pagination paginate(int page, int pageSize) { 217 | Query query = getPM().newQuery(MQuery.class); 218 | query.setOrdering("id DESC"); 219 | return 
paginate(query, null, page, pageSize); 220 | } 221 | 222 | public Pagination paginate(Query query, Map map, 223 | int page, int pageSize) { 224 | return new Pagination(query, map, page, pageSize); 225 | } 226 | 227 | /** 228 | * 229 | * @param queryId 230 | * @return 231 | */ 232 | public MQuery getById(Integer queryId) { 233 | Query query = getPM().newQuery(MQuery.class, "id == :id "); 234 | query.setUnique(true); 235 | 236 | Object obj = query.execute(queryId); 237 | 238 | l4j.debug("---- getById start ----"); 239 | 240 | l4j.debug("object class loader: " + obj.getClass().getClassLoader()); 241 | l4j.debug("Query class loader:" + Query.class.getClassLoader()); 242 | l4j.debug("query class loader:" + query.getClass().getClassLoader()); 243 | l4j.debug("MQuery class loader: " + MQuery.class.getClassLoader()); 244 | 245 | l4j.debug("---- getById end ----"); 246 | 247 | return (MQuery) obj; 248 | } 249 | 250 | public void insertCrontab(MCrontab crontab) { 251 | 252 | Transaction tx = getPM().currentTransaction(); 253 | 254 | try { 255 | tx.begin(); 256 | } catch (Exception e) { 257 | e.printStackTrace(); 258 | return; 259 | } 260 | 261 | try { 262 | getPM().makePersistent(crontab); 263 | } catch (Exception e) { 264 | e.printStackTrace(); 265 | } 266 | 267 | try { 268 | tx.commit(); 269 | } catch (Exception e) { 270 | tx.rollback(); 271 | } 272 | } 273 | 274 | public MCrontab getCrontabById(Integer crontabId) { 275 | Query query = getPM().newQuery(MCrontab.class, "id == :id "); 276 | query.setUnique(true); 277 | return (MCrontab) query.execute(crontabId); 278 | } 279 | 280 | public Pagination crontabPaginate(int page, int pageSize) { 281 | Query query = getPM().newQuery(MCrontab.class); 282 | query.setOrdering("id DESC"); 283 | return crontabPaginate(query, null, page, pageSize); 284 | } 285 | 286 | public Pagination crontabPaginate(Query query, 287 | Map map, int page, int pageSize) { 288 | return new Pagination(query, map, page, pageSize); 289 | } 290 | 291 | 
@SuppressWarnings("unchecked") 292 | public List runningCrontabs() { 293 | Query query = getPM().newQuery(MCrontab.class); 294 | query.setFilter("status == :status"); 295 | query.setOrdering("id DESC"); 296 | HashMap map = new HashMap(); 297 | map.put("status", MCrontab.Status.RUNNING); 298 | return (List) query.executeWithMap(map); 299 | } 300 | 301 | public void updateCrontab(MCrontab mcrontab) { 302 | Transaction tx = getPM().currentTransaction(); 303 | MCrontab crontab = getPM().getObjectById(MCrontab.class, 304 | mcrontab.getId()); 305 | crontab.copy(mcrontab); 306 | 307 | try { 308 | tx.begin(); 309 | } catch (Exception e) { 310 | e.printStackTrace(); 311 | return; 312 | } 313 | 314 | try { 315 | getPM().makePersistent(crontab); 316 | } catch (Exception e) { 317 | e.printStackTrace(); 318 | } 319 | 320 | try { 321 | tx.commit(); 322 | } catch (Exception e) { 323 | tx.rollback(); 324 | } 325 | } 326 | } 327 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/servlet/RQuery.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
15 | */ 16 | package org.apache.hadoop.hive.hwi.servlet; 17 | 18 | import java.io.BufferedReader; 19 | import java.io.InputStreamReader; 20 | import java.io.PrintWriter; 21 | import java.net.URI; 22 | import java.text.SimpleDateFormat; 23 | import java.util.ArrayList; 24 | import java.util.Calendar; 25 | import java.util.Date; 26 | import java.util.HashMap; 27 | import java.util.List; 28 | import java.util.Map; 29 | import java.util.TimeZone; 30 | 31 | import javax.jdo.Query; 32 | import javax.ws.rs.DefaultValue; 33 | import javax.ws.rs.FormParam; 34 | import javax.ws.rs.GET; 35 | import javax.ws.rs.POST; 36 | import javax.ws.rs.Path; 37 | import javax.ws.rs.PathParam; 38 | import javax.ws.rs.Produces; 39 | import javax.ws.rs.QueryParam; 40 | import javax.ws.rs.WebApplicationException; 41 | import javax.ws.rs.core.MediaType; 42 | import javax.ws.rs.core.Response; 43 | 44 | import org.apache.commons.logging.Log; 45 | import org.apache.commons.logging.LogFactory; 46 | import org.apache.hadoop.fs.FileStatus; 47 | import org.apache.hadoop.fs.FileSystem; 48 | import org.apache.hadoop.hive.conf.HiveConf; 49 | import org.apache.hadoop.hive.hwi.model.MQuery; 50 | import org.apache.hadoop.hive.hwi.model.Pagination; 51 | import org.apache.hadoop.hive.hwi.query.QueryManager; 52 | import org.apache.hadoop.hive.hwi.query.QueryStore; 53 | import org.apache.hadoop.hive.hwi.util.HadoopUtil; 54 | import org.apache.hadoop.hive.ql.exec.Utilities; 55 | import org.apache.hadoop.hive.ql.session.SessionState; 56 | 57 | import com.sun.jersey.api.view.Viewable; 58 | import org.apache.hadoop.hive.hwi.util.APIResult; 59 | 60 | @Path("/queries") 61 | public class RQuery extends RBase { 62 | protected static final Log l4j = LogFactory.getLog(RQuery.class.getName()); 63 | 64 | @GET 65 | @Produces("text/html") 66 | public Viewable list( 67 | @QueryParam(value = "crontabId") Integer crontabId, 68 | @QueryParam(value = "queryName") String queryName, 69 | @QueryParam(value = "extra") String extra, 
70 | @QueryParam(value = "page") @DefaultValue(value = "1") int page, 71 | @QueryParam(value = "pageSize") @DefaultValue(value = "20") int pageSize) { 72 | 73 | QueryStore qs = QueryStore.getInstance(); 74 | Pagination pagination = null; 75 | 76 | Query query = qs.getPM().newQuery(MQuery.class); 77 | query.setOrdering("id DESC"); 78 | 79 | HashMap map = new HashMap(); 80 | 81 | if (crontabId != null) { 82 | query.setFilter("crontabId == :crontabId"); 83 | map.put("crontabId", crontabId); 84 | } 85 | 86 | if (queryName != null) { 87 | queryName = queryName.trim(); 88 | if (!queryName.equals("")) { 89 | query.setFilter("name.matches('(?i).*" + queryName + ".*')"); 90 | } 91 | } 92 | 93 | if ("my".equals(extra)) { 94 | query.setFilter("userId == :userId"); 95 | request.setAttribute("extra", extra); 96 | map.put("userId", getUser()); 97 | } 98 | 99 | l4j.debug("------ list begin --------"); 100 | l4j.debug("query classloader: " + query.getClass().getClassLoader()); 101 | l4j.debug("MQuery classloader: " + MQuery.class.getClassLoader()); 102 | l4j.debug("------ list end --------"); 103 | 104 | pagination = qs.paginate(query, map, page, pageSize); 105 | 106 | request.setAttribute("crontabId", crontabId); 107 | request.setAttribute("queryName", queryName); 108 | request.setAttribute("pagination", pagination); 109 | 110 | return new Viewable("/query/list.vm"); 111 | } 112 | 113 | @GET 114 | @Path("{id}") 115 | @Produces("text/html") 116 | public Viewable info(@PathParam(value = "id") Integer id) { 117 | QueryStore qs = QueryStore.getInstance(); 118 | 119 | MQuery query = qs.getById(id); 120 | 121 | if (query == null) 122 | throw new WebApplicationException(404); 123 | 124 | 125 | l4j.debug("------ view info start ------"); 126 | l4j.debug("query: " + query.getClass().getClassLoader().getClass().getName()); 127 | l4j.debug("------ view info end ------"); 128 | 129 | request.setAttribute("query", query); 130 | 131 | if (query.getJobId() != null) { 132 | List> jobInfos = new 
ArrayList>(); 133 | for (String jobId : query.getJobId().split(";")) { 134 | if (jobId.equals("")) 135 | continue; 136 | HashMap map = new HashMap(); 137 | map.put("id", jobId); 138 | map.put("url", HadoopUtil.getJobTrackerURL(jobId)); 139 | jobInfos.add(map); 140 | } 141 | request.setAttribute("jobInfos", jobInfos); 142 | } 143 | 144 | 145 | SimpleDateFormat sf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss"); 146 | request.setAttribute("createdTime", sf.format(query.getCreated())); 147 | request.setAttribute("updatedTime", sf.format(query.getUpdated())); 148 | 149 | if (query.getCpuTime() != null) { 150 | request.setAttribute("cpuTime", 151 | Utilities.formatMsecToStr(query.getCpuTime())); 152 | } 153 | 154 | if (query.getTotalTime() != null) { 155 | request.setAttribute("totalTime", 156 | Utilities.formatMsecToStr(query.getTotalTime())); 157 | } 158 | 159 | if (query.getCpuTime() != null && query.getTotalTime() != null 160 | && query.getCpuTime() > query.getTotalTime()) { 161 | request.setAttribute( 162 | "savedTime", 163 | Utilities.formatMsecToStr(Math.abs(query.getCpuTime() 164 | - query.getTotalTime()))); 165 | } 166 | 167 | return new Viewable("/query/info.vm"); 168 | } 169 | 170 | @GET 171 | @Path("create") 172 | @Produces("text/html;charset=ISO-8859-1") 173 | public Viewable create(@QueryParam(value = "queryId") Integer queryId) { 174 | if (queryId != null) { 175 | MQuery mquery = QueryStore.getInstance().getById(queryId); 176 | if (mquery != null) { 177 | request.setAttribute("name", mquery.getName()); 178 | request.setAttribute("query", mquery.getQuery()); 179 | request.setAttribute("callback", mquery.getCallback()); 180 | } 181 | } 182 | return new Viewable("/query/create.vm"); 183 | } 184 | 185 | /** 186 | * 187 | * @param query 188 | * @param name 189 | * @param callback 190 | * @param mode "form" mean html, "api" mean interface 191 | * @return 192 | */ 193 | @POST 194 | @Path("create/{mode}") 195 | @Produces({"text/html;charset=ISO-8859-1", 
MediaType.APPLICATION_JSON}) 196 | public Response create(@FormParam(value = "query") String query, 197 | @FormParam(value = "name") String name, 198 | @FormParam(value = "callback") String callback, 199 | @PathParam("mode") String mode) { 200 | 201 | Viewable v = new Viewable("/query/create.vm"); 202 | boolean is_api = mode.equals("api") ? true : false; 203 | 204 | if (query == null || query.equals("")) { 205 | if(is_api) { 206 | APIResult ret = new APIResult(APIResult.ERR, "please input {query} parameter"); 207 | return Response.ok(ret.toJson(), MediaType.APPLICATION_JSON).build(); 208 | }else{ 209 | request.setAttribute("msg", "query can't be empty"); 210 | return Response.status(200).entity(v).build(); 211 | } 212 | } 213 | 214 | if (name == null || "".equals(name)) { 215 | Date created = Calendar.getInstance(TimeZone.getDefault()).getTime(); 216 | SimpleDateFormat sf = new SimpleDateFormat("yyyy-MM-dd_HH:mm:ss"); 217 | name = sf.format(created); 218 | } 219 | 220 | QueryStore qs = QueryStore.getInstance(); 221 | 222 | String user = getUser(); 223 | if(user == null || "".equals(user)) { 224 | user = "hadoop"; 225 | } 226 | 227 | l4j.debug("MQuery user " + user); 228 | 229 | MQuery mquery = new MQuery(name, query, callback, user, "hadoop"); 230 | qs.insertQuery(mquery); 231 | 232 | QueryManager.getInstance().submit(mquery); 233 | 234 | if (mquery == null || mquery.getId() == null) { 235 | request.setAttribute("msg", "save query failed"); 236 | return Response.status(200).entity(v).build(); 237 | } 238 | 239 | if(is_api) { 240 | APIResult ret = new APIResult(APIResult.OK, "command has been submitted", mquery.getId() ); 241 | return Response.ok(ret.toJson(), MediaType.APPLICATION_JSON).build(); 242 | }else { 243 | throw new WebApplicationException(Response.seeOther( 244 | URI.create("queries/" + mquery.getId())).build()); 245 | } 246 | } 247 | 248 | 249 | @GET 250 | @Path("{id}/result") 251 | @Produces("text/html") 252 | public Viewable result( 253 | 
@PathParam(value = "id") Integer id, 254 | @QueryParam(value = "raw") @DefaultValue(value = "false") boolean raw) { 255 | Viewable v = new Viewable("/query/result.vm"); 256 | 257 | QueryStore qs = QueryStore.getInstance(); 258 | 259 | MQuery query = qs.getById(id); 260 | 261 | if (query == null) { 262 | throw new WebApplicationException(404); 263 | } 264 | 265 | request.setAttribute("query", query); 266 | 267 | if (query.getStatus() != MQuery.Status.FINISHED) { 268 | throw new WebApplicationException(404); 269 | } 270 | 271 | ArrayList partialResult = new ArrayList(); 272 | 273 | try { 274 | 275 | org.apache.hadoop.fs.Path rPath = new org.apache.hadoop.fs.Path( 276 | query.getResultLocation()); 277 | HiveConf hiveConf = new HiveConf(SessionState.class); 278 | FileSystem fs = rPath.getFileSystem(hiveConf); 279 | 280 | if (!fs.getFileStatus(rPath).isDir()) { 281 | request.setAttribute("msg", rPath + "is not directory"); 282 | return v; 283 | } 284 | 285 | int readedLines = 0; 286 | String temp = null; 287 | 288 | FileStatus[] fss = fs.listStatus(rPath); 289 | for (FileStatus _fs : fss) { 290 | org.apache.hadoop.fs.Path _fsPath = _fs.getPath(); 291 | if (!fs.getFileStatus(_fsPath).isDir()) { 292 | BufferedReader bf = new BufferedReader( 293 | new InputStreamReader(fs.open(_fsPath), "UTF-8")); 294 | 295 | if (raw) { 296 | response.addHeader("Content-Disposition", 297 | "attachment; filename=hwi_result_" + id 298 | + ".txt"); 299 | response.addHeader("Content-Type", "text/plain"); 300 | response.setCharacterEncoding("utf-8"); 301 | PrintWriter writer = response.getWriter(); 302 | while ((temp = bf.readLine()) != null) { 303 | writer.println(temp.replace('\1', '\t')); 304 | } 305 | } else { 306 | while ((temp = bf.readLine()) != null 307 | && readedLines < 100) { 308 | partialResult.add(temp.replace('\1', '\t')); 309 | readedLines++; 310 | } 311 | 312 | if (readedLines >= 100) { 313 | break; 314 | } 315 | } 316 | 317 | bf.close(); 318 | } 319 | } 320 | 
FileSystem.closeAll(); 321 | } catch (Exception e) { 322 | throw new WebApplicationException(e); 323 | } 324 | 325 | if (raw) { 326 | throw new WebApplicationException(200); 327 | } else { 328 | request.setAttribute("partialResult", partialResult); 329 | return v; 330 | } 331 | } 332 | 333 | 334 | } 335 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/servlet/RCrontab.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 
15 | */ 16 | package org.apache.hadoop.hive.hwi.servlet; 17 | 18 | import java.net.URI; 19 | import java.text.SimpleDateFormat; 20 | import java.util.Calendar; 21 | import java.util.HashMap; 22 | import java.util.TimeZone; 23 | 24 | import javax.jdo.Query; 25 | import javax.ws.rs.DefaultValue; 26 | import javax.ws.rs.FormParam; 27 | import javax.ws.rs.GET; 28 | import javax.ws.rs.POST; 29 | import javax.ws.rs.Path; 30 | import javax.ws.rs.PathParam; 31 | import javax.ws.rs.Produces; 32 | import javax.ws.rs.QueryParam; 33 | import javax.ws.rs.WebApplicationException; 34 | import javax.ws.rs.core.Response; 35 | 36 | import org.apache.commons.logging.Log; 37 | import org.apache.commons.logging.LogFactory; 38 | import org.apache.hadoop.hive.hwi.model.MCrontab; 39 | import org.apache.hadoop.hive.hwi.model.MCrontab.Status; 40 | import org.apache.hadoop.hive.hwi.model.MQuery; 41 | import org.apache.hadoop.hive.hwi.model.Pagination; 42 | import org.apache.hadoop.hive.hwi.query.QueryManager; 43 | import org.apache.hadoop.hive.hwi.query.QueryStore; 44 | import org.quartz.CronExpression; 45 | 46 | import com.sun.jersey.api.view.Viewable; 47 | 48 | @Path("/crontabs") 49 | public class RCrontab extends RBase { 50 | protected static final Log l4j = LogFactory 51 | .getLog(RCrontab.class.getName()); 52 | 53 | @GET 54 | @Produces("text/html") 55 | public Viewable list( 56 | @QueryParam(value = "crontabName") String crontabName, 57 | @QueryParam(value = "page") @DefaultValue(value = "1") int page, 58 | @QueryParam(value = "pageSize") @DefaultValue(value = "20") int pageSize) { 59 | 60 | QueryStore qs = QueryStore.getInstance(); 61 | 62 | Query query = qs.getPM().newQuery(MCrontab.class); 63 | query.setOrdering("id DESC"); 64 | 65 | HashMap map = new HashMap(); 66 | 67 | query.setFilter("status != :status"); 68 | map.put("status", MCrontab.Status.DELETED); 69 | 70 | if (crontabName != null) { 71 | crontabName = crontabName.trim(); 72 | if (!crontabName.equals("")) { 73 | 
query.setFilter("status != :status && name.matches(:namePattern)"); map.put("namePattern", "(?i).*" + crontabName + ".*"); // setFilter replaces any earlier filter, so keep the status clause and bind the user input as a parameter 74 | } 75 | } 76 | 77 | Pagination pagination = qs.crontabPaginate(query, map, page, 78 | pageSize); 79 | 80 | request.setAttribute("crontabName", crontabName); 81 | request.setAttribute("pagination", pagination); 82 | 83 | return new Viewable("/crontab/list.vm"); 84 | } 85 | 86 | @GET 87 | @Path("{id}") 88 | @Produces("text/html") 89 | public Viewable info(@PathParam(value = "id") Integer id) { 90 | QueryStore qs = QueryStore.getInstance(); 91 | 92 | MCrontab crontab = qs.getCrontabById(id); 93 | 94 | if (crontab == null) 95 | throw new WebApplicationException(404); 96 | 97 | request.setAttribute("crontab", crontab); 98 | 99 | SimpleDateFormat sf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss"); 100 | request.setAttribute("createdTime", sf.format(crontab.getCreated())); 101 | request.setAttribute("updatedTime", sf.format(crontab.getUpdated())); 102 | 103 | return new Viewable("/crontab/info.vm"); 104 | } 105 | 106 | @GET 107 | @Path("create") 108 | @Produces("text/html;charset=ISO-8859-1") 109 | public Viewable create(@QueryParam(value = "crontabId") Integer crontabId, 110 | @QueryParam(value = "queryId") Integer queryId) { 111 | 112 | if (crontabId != null) { 113 | MCrontab mcrontab = QueryStore.getInstance().getCrontabById( 114 | crontabId); 115 | if (mcrontab != null) { 116 | request.setAttribute("name", mcrontab.getName()); 117 | request.setAttribute("query", mcrontab.getQuery()); 118 | request.setAttribute("callback", mcrontab.getCallback()); 119 | 120 | String crontab = mcrontab.getCrontab(); 121 | String[] tokens = crontab.split("\\s+"); 122 | 123 | if (tokens.length == 6 || tokens.length == 7) { 124 | request.setAttribute("hour", tokens[2]); 125 | request.setAttribute("day", tokens[3]); 126 | request.setAttribute("month", tokens[4]); 127 | request.setAttribute("week", tokens[5]); 128 | } 129 | 130 | return new Viewable("/crontab/create.vm"); 131 | } 132 | } 133 | 134 | if (queryId != null) { 135
| MQuery mquery = QueryStore.getInstance().getById(queryId); 136 | if (mquery != null) { 137 | request.setAttribute("name", mquery.getName()); 138 | request.setAttribute("query", mquery.getQuery()); 139 | request.setAttribute("callback", mquery.getCallback()); 140 | 141 | return new Viewable("/crontab/create.vm"); 142 | } 143 | } 144 | 145 | return new Viewable("/crontab/create.vm"); 146 | } 147 | 148 | @POST 149 | @Path("create") 150 | @Produces("text/html;charset=ISO-8859-1") 151 | public Viewable create(@FormParam(value = "name") String name, 152 | @FormParam(value = "query") String query, 153 | @FormParam(value = "callback") String callback, 154 | @FormParam(value = "hour") String hour, 155 | @FormParam(value = "day") String day, 156 | @FormParam(value = "month") String month, 157 | @FormParam(value = "week") String week) { 158 | 159 | Viewable v = new Viewable("/crontab/create.vm"); 160 | 161 | request.setAttribute("name", name); 162 | request.setAttribute("query", query); 163 | request.setAttribute("callback", callback); 164 | request.setAttribute("hour", hour); 165 | request.setAttribute("day", day); 166 | request.setAttribute("month", month); 167 | request.setAttribute("week", week); 168 | 169 | if (name == null || name.equals("")) { 170 | request.setAttribute("msg", "name can't be empty"); 171 | return v; 172 | } 173 | 174 | if (query == null || query.equals("")) { 175 | request.setAttribute("msg", "query can't be empty"); 176 | return v; 177 | } 178 | 179 | if (hour == null || hour.equals("")) { 180 | request.setAttribute("msg", "hour can't be empty"); 181 | return v; 182 | } 183 | 184 | if (day == null || day.equals("")) { 185 | request.setAttribute("msg", "day can't be empty"); 186 | return v; 187 | } 188 | 189 | if (month == null || month.equals("")) { 190 | request.setAttribute("msg", "month can't be empty"); 191 | return v; 192 | } 193 | 194 | if (week == null || week.equals("")) { 195 | request.setAttribute("msg", "week can't be empty"); 196 | 
return v; 197 | } 198 | 199 | String crontab = "0 0 " + hour + " " + day + " " + month + " " + week 200 | + " *"; 201 | 202 | try { 203 | CronExpression.validateExpression(crontab); 204 | } catch (Exception e) { 205 | request.setAttribute("msg", "crontab: " + e.getMessage()); 206 | return v; 207 | } 208 | 209 | QueryStore qs = QueryStore.getInstance(); 210 | 211 | MCrontab mcrontab = new MCrontab(name, query, callback, crontab, 212 | "hadoop", "hadoop"); 213 | qs.insertCrontab(mcrontab); 214 | 215 | QueryManager.getInstance().schedule(mcrontab); 216 | 217 | throw new WebApplicationException(Response.seeOther( 218 | URI.create("crontabs/" + mcrontab.getId())).build()); 219 | } 220 | 221 | @GET 222 | @Path("{id}/changeStatus/{status}") 223 | @Produces("text/html") 224 | public Viewable changeStatus(@PathParam(value = "id") int id, 225 | @PathParam(value = "status") String status) { 226 | 227 | MCrontab crontab = QueryStore.getInstance().getCrontabById(id); 228 | if (crontab == null) { 229 | throw new WebApplicationException(404); 230 | } 231 | 232 | if (status == null) { 233 | throw new WebApplicationException(new Exception( 234 | "status can't be empty"), 403); 235 | } 236 | 237 | Status s = null; 238 | 239 | try { 240 | s = Status.valueOf(status.toUpperCase()); 241 | } catch (Exception e) { 242 | throw new WebApplicationException(e, 403); 243 | } 244 | 245 | crontab.setStatus(s); 246 | QueryStore.getInstance().updateCrontab(crontab); 247 | 248 | switch (s) { 249 | case RUNNING: 250 | QueryManager.getInstance().schedule(crontab); 251 | break; 252 | case PAUSED: 253 | QueryManager.getInstance().unschedule(crontab); 254 | break; 255 | case DELETED: 256 | QueryManager.getInstance().unschedule(crontab); 257 | break; 258 | } 259 | 260 | throw new WebApplicationException(Response.seeOther( 261 | URI.create("crontabs/" + crontab.getId())).build()); 262 | } 263 | 264 | @GET 265 | @Path("{id}/update") 266 | @Produces("text/html;charset=ISO-8859-1") 267 | public Viewable 
update(@PathParam(value = "id") Integer id) { 268 | 269 | if (id == null) { 270 | throw new WebApplicationException(404); 271 | } 272 | 273 | MCrontab crontab = QueryStore.getInstance().getCrontabById(id); 274 | if (crontab == null) { 275 | throw new WebApplicationException(404); 276 | } 277 | 278 | if (crontab.getStatus() == Status.DELETED) { 279 | throw new WebApplicationException(new Exception( 280 | "crontab has already been deleted"), 403); 281 | } 282 | 283 | request.setAttribute("crontab", crontab); 284 | 285 | return new Viewable("/crontab/update.vm"); 286 | } 287 | 288 | @POST 289 | @Path("{id}/update") 290 | @Produces("text/html;charset=ISO-8859-1") 291 | public Viewable update(@PathParam(value = "id") Integer id, 292 | @FormParam(value = "name") String name, 293 | @FormParam(value = "query") String query, 294 | @FormParam(value = "callback") String callback, 295 | @FormParam(value = "hour") String hour, 296 | @FormParam(value = "day") String day, 297 | @FormParam(value = "month") String month, 298 | @FormParam(value = "week") String week) { 299 | 300 | Viewable v = new Viewable("/crontab/update.vm"); 301 | 302 | request.setAttribute("name", name); 303 | request.setAttribute("query", query); 304 | request.setAttribute("callback", callback); 305 | request.setAttribute("hour", hour); 306 | request.setAttribute("day", day); 307 | request.setAttribute("month", month); 308 | request.setAttribute("week", week); 309 | 310 | if (name == null || name.equals("")) { 311 | request.setAttribute("msg", "name can't be empty"); 312 | return v; 313 | } 314 | 315 | if (query == null || query.equals("")) { 316 | request.setAttribute("msg", "query can't be empty"); 317 | return v; 318 | } 319 | 320 | if (hour == null || hour.equals("")) { 321 | request.setAttribute("msg", "hour can't be empty"); 322 | return v; 323 | } 324 | 325 | if (day == null || day.equals("")) { 326 | request.setAttribute("msg", "day can't be empty"); 327 | return v; 328 | } 329 | 330 | if (month == null
|| month.equals("")) { 331 | request.setAttribute("msg", "month can't be empty"); 332 | return v; 333 | } 334 | 335 | if (week == null || week.equals("")) { 336 | request.setAttribute("msg", "week can't be empty"); 337 | return v; 338 | } 339 | 340 | String crontab = "0 0 " + hour + " " + day + " " + month + " " + week; 341 | 342 | try { 343 | CronExpression.validateExpression(crontab); 344 | } catch (Exception e) { 345 | request.setAttribute("msg", "crontab: " + e.getMessage()); 346 | return v; 347 | } 348 | 349 | QueryStore qs = QueryStore.getInstance(); 350 | 351 | MCrontab mcrontab = QueryStore.getInstance().getCrontabById(id); 352 | if (mcrontab == null) { 353 | throw new WebApplicationException(404); 354 | } 355 | 356 | mcrontab.setName(name); 357 | mcrontab.setQuery(query); 358 | mcrontab.setCallback(callback); 359 | mcrontab.setCrontab(crontab); 360 | mcrontab.setUpdated(Calendar.getInstance(TimeZone.getDefault()) 361 | .getTime()); 362 | 363 | qs.updateCrontab(mcrontab); 364 | 365 | if (mcrontab.getStatus() == Status.RUNNING) { 366 | QueryManager.getInstance().unschedule(mcrontab); 367 | QueryManager.getInstance().schedule(mcrontab); 368 | } 369 | 370 | throw new WebApplicationException(Response.seeOther( 371 | URI.create("crontabs/" + mcrontab.getId())).build()); 372 | } 373 | 374 | } 375 | -------------------------------------------------------------------------------- /src/main/java/org/apache/hadoop/hive/hwi/query/QueryRunner.java: -------------------------------------------------------------------------------- 1 | /** 2 | * Copyright (C) [2013] [Anjuke Inc] 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 
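The crontab handlers above assemble a Quartz cron expression from the four submitted form fields and validate it with `org.quartz.CronExpression.validateExpression`. As a rough, self-contained illustration of that assembly step (the `CronSketch` class name and the field-count helper are hypothetical, not part of this repository; real validation should keep going through Quartz):

```java
// Sketch of how RCrontab.create builds a Quartz cron expression:
// seconds and minutes are pinned to 0, the year field is a wildcard,
// and the four user-supplied fields fill the middle positions.
public class CronSketch {
    public static String assemble(String hour, String day, String month, String week) {
        return "0 0 " + hour + " " + day + " " + month + " " + week + " *";
    }

    // Cheap structural check only: Quartz accepts 6 or 7
    // whitespace-separated fields. It is NOT a substitute for
    // org.quartz.CronExpression.validateExpression.
    public static boolean hasValidFieldCount(String expr) {
        int n = expr.trim().split("\\s+").length;
        return n == 6 || n == 7;
    }
}
```

Note that the create handler emits the trailing year wildcard (seven fields) while the update handler emits six fields; Quartz accepts both forms.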
6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | package org.apache.hadoop.hive.hwi.query; 17 | 18 | import java.io.OutputStreamWriter; 19 | import java.net.HttpURLConnection; 20 | import java.net.URL; 21 | import java.net.URLEncoder; 22 | import java.text.SimpleDateFormat; 23 | import java.util.ArrayList; 24 | import java.util.Arrays; 25 | import java.util.Calendar; 26 | import java.util.Date; 27 | import java.util.regex.Matcher; 28 | import java.util.regex.Pattern; 29 | 30 | import org.apache.commons.logging.Log; 31 | import org.apache.commons.logging.LogFactory; 32 | import org.apache.hadoop.hive.conf.HiveConf; 33 | import org.apache.hadoop.hive.hwi.model.MQuery; 34 | import org.apache.hadoop.hive.hwi.model.MQuery.Status; 35 | import org.apache.hadoop.hive.hwi.query.RunningRunner.Progress; 36 | import org.apache.hadoop.hive.hwi.query.RunningRunner.Running; 37 | import org.apache.hadoop.hive.hwi.util.HWIHiveHistoryViewer; 38 | import org.apache.hadoop.hive.hwi.util.QueryUtil; 39 | import org.apache.hadoop.hive.ql.CommandNeedRetryException; 40 | import org.apache.hadoop.hive.ql.Driver; 41 | import org.apache.hadoop.hive.ql.processors.CommandProcessor; 42 | import org.apache.hadoop.hive.ql.processors.CommandProcessorFactory; 43 | import org.apache.hadoop.hive.ql.processors.CommandProcessorResponse; 44 | import org.apache.hadoop.hive.ql.session.SessionState; 45 | import org.quartz.Job; 46 | import org.quartz.JobDataMap; 47 | import org.quartz.JobExecutionContext; 48 | import org.quartz.JobExecutionException; 49 | 50 | public class QueryRunner implements Job, 
Running { 51 | protected static final Log l4j = LogFactory.getLog(QueryRunner.class 52 | .getName()); 53 | 54 | private JobExecutionContext context; 55 | 56 | private MQuery mquery; 57 | 58 | private HiveConf hiveConf; 59 | 60 | private QueryStore qs; 61 | 62 | private String historyFile; 63 | 64 | private String resultDir; 65 | 66 | private String[] initHQLs = null; 67 | 68 | @Override 69 | public void execute(JobExecutionContext context) 70 | throws JobExecutionException { 71 | this.context = context; 72 | 73 | if (init()) { 74 | QueryManager.getInstance().monitor(this); 75 | 76 | runQuery(); 77 | finish(); 78 | } 79 | } 80 | 81 | protected boolean init() { 82 | hiveConf = new HiveConf(SessionState.class); 83 | 84 | SessionState.start(hiveConf); 85 | historyFile = SessionState.get().getHiveHistory().getHistFileName(); 86 | 87 | qs = QueryStore.getInstance(); 88 | 89 | JobDataMap map = context.getJobDetail().getJobDataMap(); 90 | 91 | int mqueryId = map.getIntValue("mqueryId"); 92 | 93 | mquery = QueryStore.getInstance().getById(mqueryId); 94 | 95 | if (mquery == null) { 96 | l4j.error("MQuery<" + mqueryId + "> is missing"); 97 | return false; 98 | } 99 | 100 | resultDir = hiveConf.get("hive.hwi.result", "/user/hive/result"); 101 | String initHQL = hiveConf.get("hive.hwi.inithqls", ""); 102 | if (!"".equals(initHQL)) { 103 | initHQLs = initHQL.split(";"); 104 | } 105 | 106 | return true; 107 | } 108 | 109 | /** 110 | * run user input queries 111 | */ 112 | public void runQuery() { 113 | mquery.setResultLocation(resultDir + "/" + mquery.getId() + "/"); 114 | mquery.setStatus(Status.RUNNING); 115 | qs.updateQuery(mquery); 116 | 117 | ArrayList cmds = queryToCmds(mquery); 118 | 119 | long start_time = System.currentTimeMillis(); 120 | for (String cmd : cmds) { 121 | try { 122 | cmd = cmd.trim(); 123 | if (cmd.equals("")) { 124 | continue; 125 | } 126 | 127 | CommandProcessorResponse resp = runCmd(cmd); 128 | mquery.setErrorMsg(resp.getErrorMessage()); 129 | 
mquery.setErrorCode(resp.getResponseCode()); 130 | } catch (Exception e) { 131 | mquery.setErrorMsg(e.getMessage()); 132 | mquery.setErrorCode(500); 133 | break; 134 | } 135 | } 136 | long end_time = System.currentTimeMillis(); 137 | mquery.setTotalTime((int) (end_time - start_time)); 138 | 139 | qs.updateQuery(mquery); 140 | } 141 | 142 | protected ArrayList<String> queryToCmds(MQuery query) { 143 | 144 | ArrayList<String> cmds = new ArrayList<String>(); 145 | String resultLocation = query.getResultLocation(); 146 | 147 | // the raw query may be unsafe; sanitize it first 148 | String safeQuery = QueryUtil.getSafeQuery(query.getQuery()); 149 | 150 | // set the map-reduce job name 151 | cmds.add("set mapred.job.name=HWI Query #" + query.getId() + " (" + query.getName() + ")"); 152 | 153 | if (initHQLs != null && initHQLs.length > 0) { 154 | cmds.addAll(Arrays.asList(initHQLs)); 155 | } 156 | 157 | // detect a user-supplied date-offset directive, e.g. "set date = +1 day" 158 | Pattern setTimePattern = Pattern.compile( 159 | "set(\\s+)date(\\s)*=(\\s)*(\\+|\\-)?\\s*(\\d+)\\s*(day|hour)(s?)", 160 | Pattern.CASE_INSENSITIVE); 161 | String timeDiffUnit = null; 162 | int timeDiffValue = 0; 163 | 164 | for (String cmd : safeQuery.split(";")) { 165 | cmd = cmd.trim(); 166 | if (cmd.equals("")) { 167 | continue; 168 | } 169 | 170 | String prefix = cmd.split("\\s+")[0]; 171 | if ("select".equalsIgnoreCase(prefix)) { 172 | cmd = "INSERT OVERWRITE DIRECTORY '" + resultLocation + "' " 173 | + cmd; 174 | } else if ("set".equalsIgnoreCase(prefix)) { 175 | Matcher m = setTimePattern.matcher(cmd); 176 | if (m.matches()) { 177 | if (m.group(4) != null && !"".equals(m.group(4))) { 178 | timeDiffValue = Integer.parseInt(m.group(4) + m.group(5)); 179 | } else { 180 | timeDiffValue = Integer.parseInt(m.group(5)); 181 | } 182 | 183 | timeDiffUnit = m.group(6); 184 | } 185 | } 186 | 187 | cmds.add(cmd); 188 | } 189 | 190 | if (safeQuery.contains("hiveconf")) { 191 | 192 | Calendar gc = Calendar.getInstance(); 193 | if ("day".equalsIgnoreCase(timeDiffUnit)) { 194 |
gc.add(Calendar.DATE, timeDiffValue); 195 | } else if ("hour".equalsIgnoreCase(timeDiffUnit)) { 196 | gc.add(Calendar.HOUR, timeDiffValue); 197 | } 198 | 199 | SimpleDateFormat ft = new SimpleDateFormat("yyyyMMddHHmmss"); 200 | String dateStr = ft.format(gc.getTime()); 201 | 202 | cmds.add(0, "set year=" + dateStr.substring(0, 4)); 203 | cmds.add(0, "set month=" + dateStr.substring(4, 6)); 204 | cmds.add(0, "set day=" + dateStr.substring(6, 8)); 205 | cmds.add(0, "set hour=" + dateStr.substring(8, 10)); 206 | cmds.add(0, "set minute=" + dateStr.substring(10, 12)); 207 | cmds.add(0, "set second=" + dateStr.substring(12, 14)); 208 | } 209 | 210 | 211 | return cmds; 212 | } 213 | 214 | protected CommandProcessorResponse runCmd(String cmd) 215 | throws RuntimeException, CommandNeedRetryException { 216 | String[] tokens = cmd.split("\\s+"); 217 | 218 | CommandProcessor proc = CommandProcessorFactory 219 | .get(tokens[0], hiveConf); 220 | if (proc == null) 221 | throw new RuntimeException("CommandProcessor for " + tokens[0] 222 | + " was not found"); 223 | 224 | CommandProcessorResponse resp; 225 | 226 | if (proc instanceof Driver) { 227 | Driver qp = (Driver) proc; 228 | qp.setTryCount(Integer.MAX_VALUE); 229 | 230 | try { 231 | resp = qp.run(cmd); 232 | } catch (CommandNeedRetryException e) { 233 | throw e; 234 | } catch (Exception e) { 235 | // rethrow: returning null here would NPE when runQuery reads the response 236 | throw new RuntimeException(e); 237 | } finally { 238 | CommandProcessorFactory.clean((HiveConf) hiveConf); 239 | qp.close(); 240 | } 241 | } else { 242 | try { 243 | resp = proc.run(cmd.substring(tokens[0].length()).trim()); 244 | } catch (CommandNeedRetryException e) { 245 | throw e; 246 | } finally { 247 | CommandProcessorFactory.clean((HiveConf) hiveConf); 248 | } 249 | } 250 | 251 | return resp; 252 | } 253 | 254 | /** 255 | * Invoked after all commands have run: records the job id, CPU time 256 | * and final status. 257 | */ 258 | private void finish() { 259 | 260 | HWIHiveHistoryViewer hv = QueryUtil.getHiveHistoryViewer(historyFile); 261 | 262 | String jobId = QueryUtil.getJobId(hv);
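The date-substitution step in `queryToCmds` above can be sketched in isolation. This is an illustration only (the `DateDirective` class is hypothetical; the real code uses the current wall-clock time and prepends the `set` lines with `cmds.add(0, ...)`, which reverses their order; a `base` parameter is added here purely to make the behavior testable):

```java
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Calendar;
import java.util.Date;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch: a directive such as "set date = +1 day" shifts a base
// timestamp, and the shifted time is exposed to the query as
// year/month/day/hour/minute/second variables.
public class DateDirective {
    private static final Pattern SET_TIME = Pattern.compile(
            "set(\\s+)date(\\s)*=(\\s)*(\\+|\\-)?\\s*(\\d+)\\s*(day|hour)(s?)",
            Pattern.CASE_INSENSITIVE);

    public static List<String> expand(String directive, Date base) {
        List<String> out = new ArrayList<String>();
        Matcher m = SET_TIME.matcher(directive.trim());
        if (!m.matches()) {
            return out; // not a date directive
        }
        // Group 4 is the optional sign, group 5 the magnitude, group 6 the unit.
        int value = Integer.parseInt(
                (m.group(4) == null ? "" : m.group(4)) + m.group(5));
        Calendar gc = Calendar.getInstance();
        gc.setTime(base);
        if ("day".equalsIgnoreCase(m.group(6))) {
            gc.add(Calendar.DATE, value);
        } else {
            gc.add(Calendar.HOUR, value);
        }
        String s = new SimpleDateFormat("yyyyMMddHHmmss").format(gc.getTime());
        out.add("set year=" + s.substring(0, 4));
        out.add("set month=" + s.substring(4, 6));
        out.add("set day=" + s.substring(6, 8));
        out.add("set hour=" + s.substring(8, 10));
        out.add("set minute=" + s.substring(10, 12));
        out.add("set second=" + s.substring(12, 14));
        return out;
    }
}
```

For example, with a base of 2013-01-01 12:00:00, `expand("set date = +1 day", base)` yields `set day=02` (and `set year=2013`, `set hour=12`, and so on).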
263 | 264 | if (jobId != null && !jobId.equals("") 265 | && !jobId.equals(mquery.getJobId())) { 266 | mquery.setJobId(jobId); 267 | } 268 | 269 | Integer cpuTime = QueryUtil.getCpuTime(hv); 270 | 271 | if (cpuTime != null && cpuTime > 0) { 272 | mquery.setCpuTime(cpuTime); 273 | } 274 | 275 | if (mquery.getErrorCode() == null || mquery.getErrorCode() == 0) { 276 | mquery.setStatus(Status.FINISHED); 277 | } else { 278 | mquery.setStatus(Status.FAILED); 279 | } 280 | 281 | callback(); 282 | 283 | qs.updateQuery(mquery); 284 | } 285 | 286 | /** 287 | * when query is finished, callback is invoked. 288 | */ 289 | private void callback() { 290 | String callback = this.mquery.getCallback(); 291 | if (callback != null && !"".equals(callback)) { 292 | try { 293 | 294 | String errorCode = "0"; 295 | if (this.mquery.getErrorCode() != null) { 296 | errorCode = URLEncoder.encode(this.mquery.getErrorCode() 297 | .toString(), "UTF-8"); 298 | } 299 | 300 | String errorMsg = ""; 301 | if (this.mquery.getErrorMsg() != null) { 302 | errorMsg = URLEncoder.encode(this.mquery.getErrorMsg(), 303 | "UTF-8"); 304 | } 305 | 306 | String postData = "id=" 307 | + URLEncoder.encode(this.mquery.getId().toString(), 308 | "UTF-8") 309 | + "&status=" 310 | + URLEncoder.encode(this.mquery.getStatus().toString(), 311 | "UTF-8") 312 | + "&error_code=" 313 | + errorCode 314 | + "&error_msg=" 315 | + errorMsg 316 | + "&result_location=" 317 | + URLEncoder.encode(this.mquery.getResultLocation(), 318 | "UTF-8") 319 | + "&result_location_url=" 320 | + URLEncoder.encode( 321 | "/hwi/query_result.jsp?action=download&id=" 322 | + this.mquery.getId(), "UTF-8"); 323 | 324 | int trycallbacktimes = 0; 325 | do { 326 | URL callbackUrl = new URL(callback); 327 | 328 | HttpURLConnection urlConn = (HttpURLConnection) callbackUrl 329 | .openConnection(); 330 | urlConn.setDoOutput(true); 331 | urlConn.connect(); 332 | 333 | OutputStreamWriter out = new OutputStreamWriter( 334 | urlConn.getOutputStream(), "UTF-8"); 335 
| out.write(postData); 336 | out.close(); 337 | 338 | int responseCode = urlConn.getResponseCode(); 339 | 340 | if (responseCode == 200) { 341 | break; 342 | } 343 | } while (++trycallbacktimes < 3); 344 | 354 | } catch (Exception e) { 355 | e.printStackTrace(); 356 | } 357 | } 358 | 359 | l4j.debug(this.mquery.getName() + " state is now " + this.mquery.getStatus()); 360 | } 361 | 362 | public Progress running() { 363 | switch (mquery.getStatus()) { 364 | case INITED: 365 | return Progress.CONTINUE; 366 | case RUNNING: 367 | HWIHiveHistoryViewer hv = QueryUtil 368 | .getHiveHistoryViewer(historyFile); 369 | 370 | String jobId = QueryUtil.getJobId(hv); 371 | 372 | if (jobId != null && !jobId.equals("") 373 | && !jobId.equals(mquery.getJobId())) { 374 | mquery.setJobId(jobId); 375 | QueryStore.getInstance().copyAndUpdateQuery(mquery); 376 | } 377 | return Progress.CONTINUE; 378 | default: 379 | return Progress.EXIT; 380 | } 381 | } 382 | 383 | } -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | 2 | Apache License 3 | Version 2.0, January 2004 4 | http://www.apache.org/licenses/ 5 | 6 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 7 | 8 | 1. Definitions. 9 | 10 | "License" shall mean the terms and conditions for use, reproduction, 11 | and distribution as defined by Sections 1 through 9 of this document. 12 | 13 | "Licensor" shall mean the copyright owner or entity authorized by 14 | the copyright owner that is granting the License.
15 | 16 | "Legal Entity" shall mean the union of the acting entity and all 17 | other entities that control, are controlled by, or are under common 18 | control with that entity. For the purposes of this definition, 19 | "control" means (i) the power, direct or indirect, to cause the 20 | direction or management of such entity, whether by contract or 21 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 22 | outstanding shares, or (iii) beneficial ownership of such entity. 23 | 24 | "You" (or "Your") shall mean an individual or Legal Entity 25 | exercising permissions granted by this License. 26 | 27 | "Source" form shall mean the preferred form for making modifications, 28 | including but not limited to software source code, documentation 29 | source, and configuration files. 30 | 31 | "Object" form shall mean any form resulting from mechanical 32 | transformation or translation of a Source form, including but 33 | not limited to compiled object code, generated documentation, 34 | and conversions to other media types. 35 | 36 | "Work" shall mean the work of authorship, whether in Source or 37 | Object form, made available under the License, as indicated by a 38 | copyright notice that is included in or attached to the work 39 | (an example is provided in the Appendix below). 40 | 41 | "Derivative Works" shall mean any work, whether in Source or Object 42 | form, that is based on (or derived from) the Work and for which the 43 | editorial revisions, annotations, elaborations, or other modifications 44 | represent, as a whole, an original work of authorship. For the purposes 45 | of this License, Derivative Works shall not include works that remain 46 | separable from, or merely link (or bind by name) to the interfaces of, 47 | the Work and Derivative Works thereof. 
48 | 49 | "Contribution" shall mean any work of authorship, including 50 | the original version of the Work and any modifications or additions 51 | to that Work or Derivative Works thereof, that is intentionally 52 | submitted to Licensor for inclusion in the Work by the copyright owner 53 | or by an individual or Legal Entity authorized to submit on behalf of 54 | the copyright owner. For the purposes of this definition, "submitted" 55 | means any form of electronic, verbal, or written communication sent 56 | to the Licensor or its representatives, including but not limited to 57 | communication on electronic mailing lists, source code control systems, 58 | and issue tracking systems that are managed by, or on behalf of, the 59 | Licensor for the purpose of discussing and improving the Work, but 60 | excluding communication that is conspicuously marked or otherwise 61 | designated in writing by the copyright owner as "Not a Contribution." 62 | 63 | "Contributor" shall mean Licensor and any individual or Legal Entity 64 | on behalf of whom a Contribution has been received by Licensor and 65 | subsequently incorporated within the Work. 66 | 67 | 2. Grant of Copyright License. Subject to the terms and conditions of 68 | this License, each Contributor hereby grants to You a perpetual, 69 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 70 | copyright license to reproduce, prepare Derivative Works of, 71 | publicly display, publicly perform, sublicense, and distribute the 72 | Work and such Derivative Works in Source or Object form. 73 | 74 | 3. Grant of Patent License. 
Subject to the terms and conditions of 75 | this License, each Contributor hereby grants to You a perpetual, 76 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 77 | (except as stated in this section) patent license to make, have made, 78 | use, offer to sell, sell, import, and otherwise transfer the Work, 79 | where such license applies only to those patent claims licensable 80 | by such Contributor that are necessarily infringed by their 81 | Contribution(s) alone or by combination of their Contribution(s) 82 | with the Work to which such Contribution(s) was submitted. If You 83 | institute patent litigation against any entity (including a 84 | cross-claim or counterclaim in a lawsuit) alleging that the Work 85 | or a Contribution incorporated within the Work constitutes direct 86 | or contributory patent infringement, then any patent licenses 87 | granted to You under this License for that Work shall terminate 88 | as of the date such litigation is filed. 89 | 90 | 4. Redistribution. 
You may reproduce and distribute copies of the 91 | Work or Derivative Works thereof in any medium, with or without 92 | modifications, and in Source or Object form, provided that You 93 | meet the following conditions: 94 | 95 | (a) You must give any other recipients of the Work or 96 | Derivative Works a copy of this License; and 97 | 98 | (b) You must cause any modified files to carry prominent notices 99 | stating that You changed the files; and 100 | 101 | (c) You must retain, in the Source form of any Derivative Works 102 | that You distribute, all copyright, patent, trademark, and 103 | attribution notices from the Source form of the Work, 104 | excluding those notices that do not pertain to any part of 105 | the Derivative Works; and 106 | 107 | (d) If the Work includes a "NOTICE" text file as part of its 108 | distribution, then any Derivative Works that You distribute must 109 | include a readable copy of the attribution notices contained 110 | within such NOTICE file, excluding those notices that do not 111 | pertain to any part of the Derivative Works, in at least one 112 | of the following places: within a NOTICE text file distributed 113 | as part of the Derivative Works; within the Source form or 114 | documentation, if provided along with the Derivative Works; or, 115 | within a display generated by the Derivative Works, if and 116 | wherever such third-party notices normally appear. The contents 117 | of the NOTICE file are for informational purposes only and 118 | do not modify the License. You may add Your own attribution 119 | notices within Derivative Works that You distribute, alongside 120 | or as an addendum to the NOTICE text from the Work, provided 121 | that such additional attribution notices cannot be construed 122 | as modifying the License. 
123 | 124 | You may add Your own copyright statement to Your modifications and 125 | may provide additional or different license terms and conditions 126 | for use, reproduction, or distribution of Your modifications, or 127 | for any such Derivative Works as a whole, provided Your use, 128 | reproduction, and distribution of the Work otherwise complies with 129 | the conditions stated in this License. 130 | 131 | 5. Submission of Contributions. Unless You explicitly state otherwise, 132 | any Contribution intentionally submitted for inclusion in the Work 133 | by You to the Licensor shall be under the terms and conditions of 134 | this License, without any additional terms or conditions. 135 | Notwithstanding the above, nothing herein shall supersede or modify 136 | the terms of any separate license agreement you may have executed 137 | with Licensor regarding such Contributions. 138 | 139 | 6. Trademarks. This License does not grant permission to use the trade 140 | names, trademarks, service marks, or product names of the Licensor, 141 | except as required for reasonable and customary use in describing the 142 | origin of the Work and reproducing the content of the NOTICE file. 143 | 144 | 7. Disclaimer of Warranty. Unless required by applicable law or 145 | agreed to in writing, Licensor provides the Work (and each 146 | Contributor provides its Contributions) on an "AS IS" BASIS, 147 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 148 | implied, including, without limitation, any warranties or conditions 149 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 150 | PARTICULAR PURPOSE. You are solely responsible for determining the 151 | appropriateness of using or redistributing the Work and assume any 152 | risks associated with Your exercise of permissions under this License. 153 | 154 | 8. Limitation of Liability. 
In no event and under no legal theory, 155 | whether in tort (including negligence), contract, or otherwise, 156 | unless required by applicable law (such as deliberate and grossly 157 | negligent acts) or agreed to in writing, shall any Contributor be 158 | liable to You for damages, including any direct, indirect, special, 159 | incidental, or consequential damages of any character arising as a 160 | result of this License or out of the use or inability to use the 161 | Work (including but not limited to damages for loss of goodwill, 162 | work stoppage, computer failure or malfunction, or any and all 163 | other commercial damages or losses), even if such Contributor 164 | has been advised of the possibility of such damages. 165 | 166 | 9. Accepting Warranty or Additional Liability. While redistributing 167 | the Work or Derivative Works thereof, You may choose to offer, 168 | and charge a fee for, acceptance of support, warranty, indemnity, 169 | or other liability obligations and/or rights consistent with this 170 | License. However, in accepting such obligations, You may act only 171 | on Your own behalf and on Your sole responsibility, not on behalf 172 | of any other Contributor, and only if You agree to indemnify, 173 | defend, and hold each Contributor harmless for any liability 174 | incurred by, or claims asserted against, such Contributor by reason 175 | of your accepting any such warranty or additional liability. 176 | 177 | END OF TERMS AND CONDITIONS 178 | 179 | APPENDIX: How to apply the Apache License to your work. 180 | 181 | To apply the Apache License to your work, attach the following 182 | boilerplate notice, with the fields enclosed by brackets "[]" 183 | replaced with your own identifying information. (Don't include 184 | the brackets!) The text should be enclosed in the appropriate 185 | comment syntax for the file format. 
We also recommend that a 186 | file or class name and description of purpose be included on the 187 | same "printed page" as the copyright notice for easier 188 | identification within third-party archives. 189 | 190 | Copyright [yyyy] [name of copyright owner] 191 | 192 | Licensed under the Apache License, Version 2.0 (the "License"); 193 | you may not use this file except in compliance with the License. 194 | You may obtain a copy of the License at 195 | 196 | http://www.apache.org/licenses/LICENSE-2.0 197 | 198 | Unless required by applicable law or agreed to in writing, software 199 | distributed under the License is distributed on an "AS IS" BASIS, 200 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 201 | See the License for the specific language governing permissions and 202 | limitations under the License. 203 | 204 | 205 | 206 | HWI SUBCOMPONENTS: 207 | 208 | The project contains subcomponents with separate copyright 209 | notices and license terms. Your use of the source code for the these 210 | subcomponents is subject to the terms and conditions of the following 211 | licenses. 212 | 213 | For the jersey library: 214 | 215 | COMMON DEVELOPMENT AND DISTRIBUTION LICENSE (CDDL)Version 1.1 216 | 217 | 1. Definitions. 218 | 219 | 1.1. “Contributor” means each individual or entity that creates or contributes to the creation of Modifications. 220 | 221 | 1.2. “Contributor Version” means the combination of the Original Software, prior Modifications used by a Contributor (if any), and the Modifications made by that particular Contributor. 222 | 223 | 1.3. “Covered Software” means (a) the Original Software, or (b) Modifications, or (c) the combination of files containing Original Software with files containing Modifications, in each case including portions thereof. 224 | 225 | 1.4. “Executable” means the Covered Software in any form other than Source Code. 226 | 227 | 1.5. 
“Initial Developer” means the individual or entity that first makes Original Software available under this License.

1.6. “Larger Work” means a work which combines Covered Software or portions thereof with code not governed by the terms of this License.

1.7. “License” means this document.

1.8. “Licensable” means having the right to grant, to the maximum extent possible, whether at the time of the initial grant or subsequently acquired, any and all of the rights conveyed herein.

1.9. “Modifications” means the Source Code and Executable form of any of the following:

A. Any file that results from an addition to, deletion from or modification of the contents of a file containing Original Software or previous Modifications;

B. Any new file that contains any part of the Original Software or previous Modification; or

C. Any new file that is contributed or otherwise made available under the terms of this License.

1.10. “Original Software” means the Source Code and Executable form of computer software code that is originally released under this License.

1.11. “Patent Claims” means any patent claim(s), now owned or hereafter acquired, including without limitation, method, process, and apparatus claims, in any patent Licensable by grantor.

1.12. “Source Code” means (a) the common form of computer software code in which modifications are made and (b) associated documentation included in or with such code.

1.13. “You” (or “Your”) means an individual or a legal entity exercising rights under, and complying with all of the terms of, this License. For legal entities, “You” includes any entity which controls, is controlled by, or is under common control with You.
For purposes of this definition, “control” means (a) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (b) ownership of more than fifty percent (50%) of the outstanding shares or beneficial ownership of such entity.

2. License Grants.

2.1. The Initial Developer Grant.

Conditioned upon Your compliance with Section 3.1 below and subject to third party intellectual property claims, the Initial Developer hereby grants You a world-wide, royalty-free, non-exclusive license:

(a) under intellectual property rights (other than patent or trademark) Licensable by Initial Developer, to use, reproduce, modify, display, perform, sublicense and distribute the Original Software (or portions thereof), with or without Modifications, and/or as part of a Larger Work; and

(b) under Patent Claims infringed by the making, using or selling of Original Software, to make, have made, use, practice, sell, and offer for sale, and/or otherwise dispose of the Original Software (or portions thereof).

(c) The licenses granted in Sections 2.1(a) and (b) are effective on the date Initial Developer first distributes or otherwise makes the Original Software available to a third party under the terms of this License.

(d) Notwithstanding Section 2.1(b) above, no patent license is granted: (1) for code that You delete from the Original Software, or (2) for infringements caused by: (i) the modification of the Original Software, or (ii) the combination of the Original Software with other software or devices.

2.2. Contributor Grant.
Conditioned upon Your compliance with Section 3.1 below and subject to third party intellectual property claims, each Contributor hereby grants You a world-wide, royalty-free, non-exclusive license:

(a) under intellectual property rights (other than patent or trademark) Licensable by Contributor to use, reproduce, modify, display, perform, sublicense and distribute the Modifications created by such Contributor (or portions thereof), either on an unmodified basis, with other Modifications, as Covered Software and/or as part of a Larger Work; and

(b) under Patent Claims infringed by the making, using, or selling of Modifications made by that Contributor either alone and/or in combination with its Contributor Version (or portions of such combination), to make, use, sell, offer for sale, have made, and/or otherwise dispose of: (1) Modifications made by that Contributor (or portions thereof); and (2) the combination of Modifications made by that Contributor with its Contributor Version (or portions of such combination).

(c) The licenses granted in Sections 2.2(a) and 2.2(b) are effective on the date Contributor first distributes or otherwise makes the Modifications available to a third party.

(d) Notwithstanding Section 2.2(b) above, no patent license is granted: (1) for any code that Contributor has deleted from the Contributor Version; (2) for infringements caused by: (i) third party modifications of Contributor Version, or (ii) the combination of Modifications made by that Contributor with other software (except as part of the Contributor Version) or other devices; or (3) under Patent Claims infringed by Covered Software in the absence of Modifications made by that Contributor.

3. Distribution Obligations.

3.1. Availability of Source Code.
Any Covered Software that You distribute or otherwise make available in Executable form must also be made available in Source Code form and that Source Code form must be distributed only under the terms of this License. You must include a copy of this License with every copy of the Source Code form of the Covered Software You distribute or otherwise make available. You must inform recipients of any such Covered Software in Executable form as to how they can obtain such Covered Software in Source Code form in a reasonable manner on or through a medium customarily used for software exchange.

3.2. Modifications.

The Modifications that You create or to which You contribute are governed by the terms of this License. You represent that You believe Your Modifications are Your original creation(s) and/or You have sufficient rights to grant the rights conveyed by this License.

3.3. Required Notices.

You must include a notice in each of Your Modifications that identifies You as the Contributor of the Modification. You may not remove or alter any copyright, patent or trademark notices contained within the Covered Software, or any notices of licensing or any descriptive text giving attribution to any Contributor or the Initial Developer.

3.4. Application of Additional Terms.

You may not offer or impose any terms on any Covered Software in Source Code form that alters or restricts the applicable version of this License or the recipients' rights hereunder. You may choose to offer, and to charge a fee for, warranty, support, indemnity or liability obligations to one or more recipients of Covered Software. However, you may do so only on Your own behalf, and not on behalf of the Initial Developer or any Contributor.
You must make it absolutely clear that any such warranty, support, indemnity or liability obligation is offered by You alone, and You hereby agree to indemnify the Initial Developer and every Contributor for any liability incurred by the Initial Developer or such Contributor as a result of warranty, support, indemnity or liability terms You offer.

3.5. Distribution of Executable Versions.

You may distribute the Executable form of the Covered Software under the terms of this License or under the terms of a license of Your choice, which may contain terms different from this License, provided that You are in compliance with the terms of this License and that the license for the Executable form does not attempt to limit or alter the recipient's rights in the Source Code form from the rights set forth in this License. If You distribute the Covered Software in Executable form under a different license, You must make it absolutely clear that any terms which differ from this License are offered by You alone, not by the Initial Developer or Contributor. You hereby agree to indemnify the Initial Developer and every Contributor for any liability incurred by the Initial Developer or such Contributor as a result of any such terms You offer.

3.6. Larger Works.

You may create a Larger Work by combining Covered Software with other code not governed by the terms of this License and distribute the Larger Work as a single product. In such a case, You must make sure the requirements of this License are fulfilled for the Covered Software.

4. Versions of the License.

4.1. New Versions.

Oracle is the initial license steward and may publish revised and/or new versions of this License from time to time. Each version will be given a distinguishing version number. Except as provided in Section 4.3, no one other than the license steward has the right to modify this License.

4.2.
Effect of New Versions.

You may always continue to use, distribute or otherwise make the Covered Software available under the terms of the version of the License under which You originally received the Covered Software. If the Initial Developer includes a notice in the Original Software prohibiting it from being distributed or otherwise made available under any subsequent version of the License, You must distribute and make the Covered Software available under the terms of the version of the License under which You originally received the Covered Software. Otherwise, You may also choose to use, distribute or otherwise make the Covered Software available under the terms of any subsequent version of the License published by the license steward.

4.3. Modified Versions.

When You are an Initial Developer and You want to create a new license for Your Original Software, You may create and use a modified version of this License if You: (a) rename the license and remove any references to the name of the license steward (except to note that the license differs from this License); and (b) otherwise make it clear that the license contains terms which differ from this License.

5. DISCLAIMER OF WARRANTY.

COVERED SOFTWARE IS PROVIDED UNDER THIS LICENSE ON AN “AS IS” BASIS, WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, WITHOUT LIMITATION, WARRANTIES THAT THE COVERED SOFTWARE IS FREE OF DEFECTS, MERCHANTABLE, FIT FOR A PARTICULAR PURPOSE OR NON-INFRINGING. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE COVERED SOFTWARE IS WITH YOU. SHOULD ANY COVERED SOFTWARE PROVE DEFECTIVE IN ANY RESPECT, YOU (NOT THE INITIAL DEVELOPER OR ANY OTHER CONTRIBUTOR) ASSUME THE COST OF ANY NECESSARY SERVICING, REPAIR OR CORRECTION. THIS DISCLAIMER OF WARRANTY CONSTITUTES AN ESSENTIAL PART OF THIS LICENSE. NO USE OF ANY COVERED SOFTWARE IS AUTHORIZED HEREUNDER EXCEPT UNDER THIS DISCLAIMER.

6.
TERMINATION.

6.1. This License and the rights granted hereunder will terminate automatically if You fail to comply with terms herein and fail to cure such breach within 30 days of becoming aware of the breach. Provisions which, by their nature, must remain in effect beyond the termination of this License shall survive.

6.2. If You assert a patent infringement claim (excluding declaratory judgment actions) against Initial Developer or a Contributor (the Initial Developer or Contributor against whom You assert such claim is referred to as “Participant”) alleging that the Participant Software (meaning the Contributor Version where the Participant is a Contributor or the Original Software where the Participant is the Initial Developer) directly or indirectly infringes any patent, then any and all rights granted directly or indirectly to You by such Participant, the Initial Developer (if the Initial Developer is not the Participant) and all Contributors under Sections 2.1 and/or 2.2 of this License shall, upon 60 days notice from Participant terminate prospectively and automatically at the expiration of such 60 day notice period, unless if within such 60 day period You withdraw Your claim with respect to the Participant Software against such Participant either unilaterally or pursuant to a written agreement with Participant.

6.3. If You assert a patent infringement claim against Participant alleging that the Participant Software directly or indirectly infringes any patent where such claim is resolved (such as by license or settlement) prior to the initiation of patent infringement litigation, then the reasonable value of the licenses granted by such Participant under Sections 2.1 or 2.2 shall be taken into account in determining the amount or value of any payment or license.

6.4.
In the event of termination under Sections 6.1 or 6.2 above, all end user licenses that have been validly granted by You or any distributor hereunder prior to termination (excluding licenses granted to You by any distributor) shall survive termination.

7. LIMITATION OF LIABILITY.

UNDER NO CIRCUMSTANCES AND UNDER NO LEGAL THEORY, WHETHER TORT (INCLUDING NEGLIGENCE), CONTRACT, OR OTHERWISE, SHALL YOU, THE INITIAL DEVELOPER, ANY OTHER CONTRIBUTOR, OR ANY DISTRIBUTOR OF COVERED SOFTWARE, OR ANY SUPPLIER OF ANY OF SUCH PARTIES, BE LIABLE TO ANY PERSON FOR ANY INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES OF ANY CHARACTER INCLUDING, WITHOUT LIMITATION, DAMAGES FOR LOSS OF GOODWILL, WORK STOPPAGE, COMPUTER FAILURE OR MALFUNCTION, OR ANY AND ALL OTHER COMMERCIAL DAMAGES OR LOSSES, EVEN IF SUCH PARTY SHALL HAVE BEEN INFORMED OF THE POSSIBILITY OF SUCH DAMAGES. THIS LIMITATION OF LIABILITY SHALL NOT APPLY TO LIABILITY FOR DEATH OR PERSONAL INJURY RESULTING FROM SUCH PARTY'S NEGLIGENCE TO THE EXTENT APPLICABLE LAW PROHIBITS SUCH LIMITATION. SOME JURISDICTIONS DO NOT ALLOW THE EXCLUSION OR LIMITATION OF INCIDENTAL OR CONSEQUENTIAL DAMAGES, SO THIS EXCLUSION AND LIMITATION MAY NOT APPLY TO YOU.

8. U.S. GOVERNMENT END USERS.

The Covered Software is a “commercial item,” as that term is defined in 48 C.F.R. 2.101 (Oct. 1995), consisting of “commercial computer software” (as that term is defined at 48 C.F.R. § 252.227-7014(a)(1)) and “commercial computer software documentation” as such terms are used in 48 C.F.R. 12.212 (Sept. 1995). Consistent with 48 C.F.R. 12.212 and 48 C.F.R. 227.7202-1 through 227.7202-4 (June 1995), all U.S. Government End Users acquire Covered Software with only those rights set forth herein. This U.S. Government Rights clause is in lieu of, and supersedes, any other FAR, DFAR, or other clause or provision that addresses Government rights in computer software under this License.

9.
MISCELLANEOUS.

This License represents the complete agreement concerning subject matter hereof. If any provision of this License is held to be unenforceable, such provision shall be reformed only to the extent necessary to make it enforceable. This License shall be governed by the law of the jurisdiction specified in a notice contained within the Original Software (except to the extent applicable law, if any, provides otherwise), excluding such jurisdiction's conflict-of-law provisions. Any litigation relating to this License shall be subject to the jurisdiction of the courts located in the jurisdiction and venue specified in a notice contained within the Original Software, with the losing party responsible for costs, including, without limitation, court costs and reasonable attorneys' fees and expenses. The application of the United Nations Convention on Contracts for the International Sale of Goods is expressly excluded. Any law or regulation which provides that the language of a contract shall be construed against the drafter shall not apply to this License. You agree that You alone are responsible for compliance with the United States export administration regulations (and the export control laws and regulation of any other countries) when You use, distribute or otherwise make available any Covered Software.

10. RESPONSIBILITY FOR CLAIMS.

As between Initial Developer and the Contributors, each party is responsible for claims and damages arising, directly or indirectly, out of its utilization of rights under this License and You agree to work with Initial Developer and Contributors to distribute such responsibility on an equitable basis. Nothing herein is intended or shall be deemed to constitute any admission of liability.
NOTICE PURSUANT TO SECTION 9 OF THE COMMON DEVELOPMENT AND DISTRIBUTION LICENSE (CDDL)

The code released under the CDDL shall be governed by the laws of the State of California (excluding conflict-of-law provisions). Any litigation relating to this License shall be subject to the jurisdiction of the Federal Courts of the Northern District of California and the state courts of the State of California, with venue lying in Santa Clara County, California.

--------------------------------------------------------------------------------