This article walks through how a third-party application can access Linkis over HTTP: logging in, submitting a job, polling its status, and fetching the result file.
1. Environment
Environment: dss-0.7.0, linkis-0.9.3

2. Log in (login)
For sending requests I tried okhttp, RestTemplate, and HttpClient; for JSON-to-Java-bean mapping I tried fastjson, gson, and Jackson. Because DSS keeps the login state in a cookie, after stepping on quite a few landmines I finally settled on RestTemplate (spring-web:5.0.7) + fastjson (1.2.58), letting the login cookie be carried along on all subsequent requests.
POST /api/rest_j/v1/user/login

Request parameters

{
    "userName": "hadoop",
    "password": "hadoop"
}

Response example (I may have tweaked the returned JSON slightly; refer to the official docs)

{
    "method": null,
    "status": 0,
    "message": "login successful!",
    "data": {
        "isFirstLogin": false,
        "isAdmin": true,
        "userName": "hadoop"
    }
}
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;
import org.springframework.http.client.HttpComponentsClientHttpRequestFactory;
import org.springframework.web.client.RestTemplate;

/**
 * Simple utility class; extend it to fit your own needs.
 */
public class HttpUtil {
    public static RestTemplate getRestClient() {
        // The HttpClientBuilder below manages cookies automatically.
        // Calling HttpClientBuilder.create().disableCookieManagement() would turn cookie management off.
        CloseableHttpClient build = HttpClientBuilder.create().useSystemProperties().build();
        return new RestTemplate(new HttpComponentsClientHttpRequestFactory(build));
    }
}
import org.springframework.http.ResponseEntity;

/**
 * Send a POST request to log in.
 * @param restClient
 * @return ResponseEntity
 */
private ResponseEntity<JSONObject> login(RestTemplate restClient) {
    JSONObject postData = new JSONObject();
    postData.put("userName", "hadoop");
    postData.put("password", "hadoop");
    String loginUrl = "http://ip:port/api/rest_j/v1/user/login";
    return restClient.postForEntity(loginUrl, postData, JSONObject.class);
}
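Before going any further it pays to check that the login actually worked, both at the HTTP level and at the Linkis level. A minimal sketch (isLoginOk is a hypothetical helper name; the status semantics follow the response example above, where 0 means success):

import org.apache.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import com.alibaba.fastjson.JSONObject;

// Hypothetical helper: true only if the HTTP call returned 200 and the
// Linkis response body reports "status": 0 (success).
private boolean isLoginOk(ResponseEntity<JSONObject> loginResp) {
    return loginResp != null
            && loginResp.getStatusCode().value() == HttpStatus.SC_OK
            && loginResp.getBody() != null
            && loginResp.getBody().getIntValue("status") == 0;
}

Because HttpClientBuilder manages cookies for us, a successful login leaves the session cookie inside the underlying client, and every later call made through the same RestTemplate is authenticated automatically.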
3. Execute a job (execute)
Source code: EntranceRestfulApi in the linkis module.
POST /api/rest_j/v1/entrance/execute
Request parameters
{
    "method": "/api/rest_j/v1/entrance/execute",
    "params": {
        "variable": {
            "k1": "v1"
        },
        "configuration": {
            "special": {
                "k2": "v2"
            },
            "runtime": {
                "k3": "v3"
            },
            "startup": {
                "k4": "v4"
            }
        }
    },
    "executeApplicationName": "spark",
    "executionCode": "show tables",
    "runType": "sql",
    "source": {
        "scriptPath": "/home/Linkis/Linkis.sql"
    }
}
Response example
{
    "method": "/api/rest_j/v1/entrance/execute",
    "status": 0,
    "message": "Request executed successfully",
    "data": {
        "execID": "030418IDEhivebdpdwc010004:10087IDE_johnnwang_21", // execution ID; used later to query the job status
        "taskID": "123" // task ID; used later to fetch the result file
    }
}
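The params block is just nested maps. The helper below sends an empty map, but if you need custom variables, here is a sketch of building the structure from the request example (k1/v1 and friends are the placeholder keys from that example):

import java.util.HashMap;
import java.util.Map;

// Assemble the nested "params" structure from the request example above.
Map<String, Object> variable = new HashMap<>();
variable.put("k1", "v1");

Map<String, Object> configuration = new HashMap<>();
configuration.put("special", new HashMap<String, Object>());
configuration.put("runtime", new HashMap<String, Object>());
configuration.put("startup", new HashMap<String, Object>());

Map<String, Object> params = new HashMap<>();
params.put("variable", variable);
params.put("configuration", configuration);
// then pass map.put("params", params) instead of the empty HashMap used below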
/**
 * @param restClient
 * @param sql the SQL code to execute
 * @return
 */
private ResponseEntity<JSONObject> executeSql(RestTemplate restClient, String sql) {
    String url = "/api/rest_j/v1/entrance/execute";
    JSONObject map = new JSONObject();
    map.put("method", url);
    map.put("params", new HashMap<>()); // user-specified runtime parameters; required, but the values may be empty
    map.put("executeApplicationName", "hive"); // execution engine; I use Hive
    map.put("executionCode", sql);
    map.put("runType", "sql"); // when running e.g. the spark engine you can choose python, R, sql, etc.; must not be empty
    // I am not executing a script file, so there is no scriptPath parameter
    String executeSql = "http://ip:port" + url;
    return restClient.postForEntity(executeSql, map, JSONObject.class);
}
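The execute response carries the two IDs that every later step depends on. A sketch of pulling them out with fastjson, following the response example above:

ResponseEntity<JSONObject> execResp = executeSql(restClient, "show tables");
JSONObject data = execResp.getBody().getJSONObject("data");
String execID = data.getString("execID"); // for the /status endpoint below
String taskID = data.getString("taskID"); // for the /jobhistory/.../get endpoint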
4. Check job status (status)
Source code: EntranceRestfulApi in the linkis module.
GET /api/rest_j/v1/entrance/${execID}/status
Response example
{
    "method": "/api/rest_j/v1/entrance/{execID}/status",
    "status": 0,
    "message": "Status fetched successfully",
    "data": {
        "execID": "${execID}",
        "status": "Running"
    }
}
String statusUrl = "http://ip:port/api/rest_j/v1/entrance/" + execID + "/status";
ResponseEntity<JSONObject> statusResp = restTemplate.getForEntity(statusUrl, JSONObject.class);
if (statusResp != null && statusResp.getStatusCode().value() == HttpStatus.SC_OK) {
    String status;
    // Poll the status in a loop and exit once the job has succeeded or failed.
    for (;;) {
        statusResp = restTemplate.getForEntity(statusUrl, JSONObject.class);
        status = statusResp.getBody().getJSONObject("data").getString("status");
        if ("Succeed".equals(status) || "Failed".equals(status)) {
            break;
        }
        try {
            Thread.sleep(1000); // pause between polls so the loop does not hammer the server
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            break;
        }
    }
    if ("Succeed".equals(status)) {
        // do something
    }
}
5. Get the execution result file (get)
Source code: QueryRestfulApi in the linkis module.
GET /api/rest_j/v1/jobhistory/${taskId}/get
Response example
{
    "method": "/api/jobhistory/{id}/get",
    "status": 0,
    "message": "OK",
    "data": {
        "task": {
            "taskID": 3111,
            "instance": "test-dn2:9108",
            "execId": "IDE_hadoop_46",
            "umUser": "hadoop",
            "engineInstance": "test-dn2:37301",
            "executionCode": "show databases", // the executed SQL
            "progress": 1.0,
            "logPath": "file:///linkis/hadoop/log/IDE/2020-09-08/3111.log", // log path
            "resultLocation": "hdfs:///linkis2/hadoop/dwc/20200908/IDE/3111", // path of the file holding the SQL result
            "status": "Succeed",
            "createdTime": 1599551337000,
            "updatedTime": 1599551339000,
            "engineType": null,
            "errCode": null,
            "errDesc": null,
            "executeApplicationName": "hive",
            "requestApplicationName": "IDE",
            "scriptPath": null,
            "runType": "sql",
            "paramsJson": "{}",
            "costTime": 2000,
            "strongerExecId": "030413IDEhivetest-dn2:9108IDE_hadoop_46",
            "sourceJson": "{\"scriptPath\":null}"
        }
    }
}
String historyUrl = "http://ip:port/api/rest_j/v1/jobhistory/" + taskID + "/get";
ResponseEntity<JSONObject> hisResp = restTemplate.getForEntity(historyUrl, JSONObject.class);
if (hisResp != null && hisResp.getStatusCode().value() == HttpStatus.SC_OK) {
    String resultLocation = hisResp.getBody().getJSONObject("data")
            .getJSONObject("task").getString("resultLocation");
}
6. Open the result file (openFile)
Source code: FsRestfulApi in the linkis module.
GET /api/rest_j/v1/filesystem/openFile?path=${resultLocation}/_0.dolphin
Response example
{
    "method": "/api/filesystem/openFile",
    "status": 0,
    "message": "OK",
    "data": {
        // the SQL query result data
    }
}
String resUrl = "http://ip:port/api/rest_j/v1/filesystem/openFile?path=" + resultLocation + "/_0.dolphin";
ResponseEntity<JSONObject> resResp = restTemplate.getForEntity(resUrl, JSONObject.class);
if (resResp != null && resResp.getStatusCode().value() == HttpStatus.SC_OK) {
    // do something
}
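Putting it all together, the full flow is login, submit, poll, look up the result path, open the file. A minimal end-to-end sketch under the same assumptions as above (ip:port and the hadoop credentials are placeholders, and error handling is omitted):

import com.alibaba.fastjson.JSONObject;
import org.springframework.http.ResponseEntity;
import org.springframework.web.client.RestTemplate;

public class LinkisFlowDemo {
    public static void main(String[] args) throws InterruptedException {
        String base = "http://ip:port"; // placeholder host
        RestTemplate client = HttpUtil.getRestClient();

        // 1. Log in; the session cookie stays inside the underlying HttpClient
        JSONObject login = new JSONObject();
        login.put("userName", "hadoop");
        login.put("password", "hadoop");
        client.postForEntity(base + "/api/rest_j/v1/user/login", login, JSONObject.class);

        // 2. Submit the job
        JSONObject req = new JSONObject();
        req.put("method", "/api/rest_j/v1/entrance/execute");
        req.put("params", new JSONObject());
        req.put("executeApplicationName", "hive");
        req.put("executionCode", "show tables");
        req.put("runType", "sql");
        JSONObject data = client.postForEntity(base + "/api/rest_j/v1/entrance/execute",
                req, JSONObject.class).getBody().getJSONObject("data");
        String execID = data.getString("execID");
        String taskID = data.getString("taskID");

        // 3. Poll the status until the job reaches a terminal state
        String status;
        do {
            Thread.sleep(1000);
            status = client.getForEntity(base + "/api/rest_j/v1/entrance/" + execID + "/status",
                    JSONObject.class).getBody().getJSONObject("data").getString("status");
        } while (!"Succeed".equals(status) && !"Failed".equals(status));

        // 4. Look up the result location, then 5. open the result file
        if ("Succeed".equals(status)) {
            String resultLocation = client.getForEntity(
                    base + "/api/rest_j/v1/jobhistory/" + taskID + "/get", JSONObject.class)
                    .getBody().getJSONObject("data").getJSONObject("task").getString("resultLocation");
            ResponseEntity<JSONObject> result = client.getForEntity(
                    base + "/api/rest_j/v1/filesystem/openFile?path=" + resultLocation + "/_0.dolphin",
                    JSONObject.class);
            System.out.println(result.getBody());
        }
    }
}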