This article collects Java code examples for the co.cask.cdap.api.workflow.WorkflowSpecification.getLocalDatasetSpecs() method and shows how it is used in practice. The examples were extracted from selected projects hosted on GitHub, Stack Overflow, Maven, and similar platforms, and should serve as useful references. Details of WorkflowSpecification.getLocalDatasetSpecs():
Package: co.cask.cdap.api.workflow.WorkflowSpecification
Class: WorkflowSpecification
Method: getLocalDatasetSpecs
Description: Returns the map of local dataset names and associated specifications required for dataset instance creation.
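Before the real examples, here is a minimal, self-contained sketch (plain Java, no CDAP dependency) of the naming convention the examples rely on: the keys of the map returned by getLocalDatasetSpecs() are local dataset names, and a per-run instance name is derived by appending "." and the run ID. The dataset names and run ID below are made up for illustration.

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.LinkedHashSet;
import java.util.Map;
import java.util.Set;

public class LocalDatasetNames {
    // Mimics the "<localName>.<runId>" convention used when turning a
    // local dataset name into a run-scoped instance name.
    static Map<String, String> toRunScopedNames(Set<String> localDatasetNames, String runId) {
        Map<String, String> mapping = new LinkedHashMap<>();
        for (String name : localDatasetNames) {
            mapping.put(name, name + "." + runId);
        }
        return mapping;
    }

    public static void main(String[] args) {
        // Hypothetical names; in CDAP these would be the keySet() of getLocalDatasetSpecs().
        Set<String> names = new LinkedHashSet<>(Arrays.asList("wordCount", "lines"));
        System.out.println(toRunScopedNames(names, "run-1234"));
        // prints {wordCount=wordCount.run-1234, lines=lines.run-1234}
    }
}
```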
Code example from: co.cask.cdap/cdap-proto
@Override
public JsonElement serialize(WorkflowSpecification src, Type typeOfSrc, JsonSerializationContext context) {
  JsonObject jsonObj = new JsonObject();
  jsonObj.add("className", new JsonPrimitive(src.getClassName()));
  jsonObj.add("name", new JsonPrimitive(src.getName()));
  jsonObj.add("description", new JsonPrimitive(src.getDescription()));
  jsonObj.add("plugins", serializeMap(src.getPlugins(), context, Plugin.class));
  jsonObj.add("properties", serializeMap(src.getProperties(), context, String.class));
  jsonObj.add("nodes", serializeList(src.getNodes(), context, WorkflowNode.class));
  jsonObj.add("localDatasetSpecs", serializeMap(src.getLocalDatasetSpecs(), context, DatasetCreationSpec.class));
  return jsonObj;
}
Code example from: cdapio/cdap
@DELETE
@Path("/apps/{app-id}/workflows/{workflow-id}/runs/{run-id}/localdatasets")
public void deleteWorkflowLocalDatasets(HttpRequest request, HttpResponder responder,
                                        @PathParam("namespace-id") String namespaceId,
                                        @PathParam("app-id") String applicationId,
                                        @PathParam("workflow-id") String workflowId,
                                        @PathParam("run-id") String runId) throws NotFoundException {
  WorkflowSpecification workflowSpec = getWorkflowSpecForValidRun(namespaceId, applicationId, workflowId, runId);
  Set<String> errorOnDelete = new HashSet<>();
  for (Map.Entry<String, DatasetCreationSpec> localDatasetEntry : workflowSpec.getLocalDatasetSpecs().entrySet()) {
    String mappedDatasetName = localDatasetEntry.getKey() + "." + runId;
    // Best-effort delete of the local datasets.
    try {
      datasetFramework.deleteInstance(new DatasetId(namespaceId, mappedDatasetName));
    } catch (InstanceNotFoundException e) {
      // Dataset instance is already deleted, so this is a no-op.
    } catch (Throwable t) {
      errorOnDelete.add(mappedDatasetName);
      LOG.error("Failed to delete the Workflow local dataset {}. Reason - {}", mappedDatasetName, t.getMessage());
    }
  }
  if (errorOnDelete.isEmpty()) {
    responder.sendStatus(HttpResponseStatus.OK);
    return;
  }
  String errorMessage = "Failed to delete Workflow local datasets - " + Joiner.on(",").join(errorOnDelete);
  throw new RuntimeException(errorMessage);
}
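The handler above deletes each local dataset on a best-effort basis: it keeps going after a failure, collects the names that could not be deleted, and reports them all at once. Stripped of the CDAP-specific types, that pattern can be sketched like this (the delete action here is a stand-in, not a real API):

```java
import java.util.Arrays;
import java.util.Collection;
import java.util.HashSet;
import java.util.Set;
import java.util.function.Consumer;

public class BestEffortDelete {
    // Runs deleteFn for every name, collecting the names whose deletion threw
    // instead of aborting on the first failure.
    static Set<String> deleteAll(Collection<String> names, Consumer<String> deleteFn) {
        Set<String> failed = new HashSet<>();
        for (String name : names) {
            try {
                deleteFn.accept(name);
            } catch (RuntimeException e) {
                failed.add(name);
            }
        }
        return failed;
    }

    public static void main(String[] args) {
        // Simulated delete that fails for one dataset.
        Set<String> failed = deleteAll(Arrays.asList("wordCount.run1", "lines.run1"), name -> {
            if (name.startsWith("lines")) {
                throw new RuntimeException("delete failed");
            }
        });
        System.out.println(failed); // prints [lines.run1]
    }
}
```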
Code example from: co.cask.cdap/cdap-app-fabric
@GET
@Path("/apps/{app-id}/workflows/{workflow-id}/runs/{run-id}/localdatasets")
public void getWorkflowLocalDatasets(HttpRequest request, HttpResponder responder,
                                     @PathParam("namespace-id") String namespaceId,
                                     @PathParam("app-id") String applicationId,
                                     @PathParam("workflow-id") String workflowId,
                                     @PathParam("run-id") String runId)
  throws NotFoundException, DatasetManagementException {
  WorkflowSpecification workflowSpec = getWorkflowSpecForValidRun(namespaceId, applicationId, workflowId, runId);
  Map<String, DatasetSpecificationSummary> localDatasetSummaries = new HashMap<>();
  for (Map.Entry<String, DatasetCreationSpec> localDatasetEntry : workflowSpec.getLocalDatasetSpecs().entrySet()) {
    String mappedDatasetName = localDatasetEntry.getKey() + "." + runId;
    String datasetType = localDatasetEntry.getValue().getTypeName();
    Map<String, String> datasetProperties = localDatasetEntry.getValue().getProperties().getProperties();
    if (datasetFramework.hasInstance(new DatasetId(namespaceId, mappedDatasetName))) {
      localDatasetSummaries.put(localDatasetEntry.getKey(),
                                new DatasetSpecificationSummary(mappedDatasetName, datasetType, datasetProperties));
    }
  }
  responder.sendJson(HttpResponseStatus.OK, GSON.toJson(localDatasetSummaries));
}
Code example from: cdapio/cdap
/**
 * Creates a new instance based on the given {@link WorkflowProgramInfo}.
 */
public static NameMappedDatasetFramework createFromWorkflowProgramInfo(DatasetFramework datasetFramework,
                                                                       WorkflowProgramInfo info,
                                                                       ApplicationSpecification appSpec) {
  Set<String> localDatasets = appSpec.getWorkflows().get(info.getName()).getLocalDatasetSpecs().keySet();
  return new NameMappedDatasetFramework(datasetFramework, localDatasets, info.getRunId().getId());
}
Code example from: cdapio/cdap
private void createLocalDatasets() throws IOException, DatasetManagementException {
  final KerberosPrincipalId principalId = ProgramRunners.getApplicationPrincipal(programOptions);
  for (final Map.Entry<String, String> entry : datasetFramework.getDatasetNameMapping().entrySet()) {
    final String localInstanceName = entry.getValue();
    final DatasetId instanceId = new DatasetId(workflowRunId.getNamespace(), localInstanceName);
    final DatasetCreationSpec instanceSpec = workflowSpec.getLocalDatasetSpecs().get(entry.getKey());
    LOG.debug("Adding Workflow local dataset instance: {}", localInstanceName);
    try {
      Retries.callWithRetries(new Retries.Callable<Void, Exception>() {
        @Override
        public Void call() throws Exception {
          DatasetProperties properties = addLocalDatasetProperty(instanceSpec.getProperties(),
                                                                 keepLocal(entry.getKey()));
          // This check is needed because the addInstance overload that takes a
          // principal can only be used when app impersonation is enabled.
          if (principalId != null) {
            datasetFramework.addInstance(instanceSpec.getTypeName(), instanceId, properties, principalId);
          } else {
            datasetFramework.addInstance(instanceSpec.getTypeName(), instanceId, properties);
          }
          return null;
        }
      }, RetryStrategies.fixDelay(Constants.Retry.LOCAL_DATASET_OPERATION_RETRY_DELAY_SECONDS, TimeUnit.SECONDS));
    } catch (IOException | DatasetManagementException e) {
      throw e;
    } catch (Exception e) {
      // This should never happen.
      throw new IllegalStateException(e);
    }
  }
}
Code example from: cdapio/cdap
// Fragment of a test asserting the number of local dataset specs:
Map<String, DatasetCreationSpec> localDatasetSpecs = spec.getLocalDatasetSpecs();
Assert.assertEquals(5, localDatasetSpecs.size());