Elasticsearch version: v5.0.1
Plugins installed: [repository-hdfs]
JVM version: Java version "1.8.0_92", Java(TM) SE Runtime Environment (build 1.8.0_92-b14), Java HotSpot(TM) 64-Bit Server VM (build 25.92-b14, mixed mode)
OS version: CentOS release 6.7 (Final), Linux version 2.6.32-573.26.1.el6.x86_64 (mockbuild@c6b8.bsys.dev.centos.org) (gcc version 4.4.7 20120313 (Red Hat 4.4.7-16) (GCC)) #1 SMP Wed May 4 00:57:44 2016
Description of the problem, including expected versus actual behavior:
When I create the repository, ES responds with {"acknowledged": true}, but when I then create a snapshot of an index, it throws an exception:
[2016-12-12T11:38:04,417][WARN ][r.suppressed ] path: /_snapshot/my_hdfs_repo/20161209-snapshot, params: {repository=my_hdfs_repo, snapshot=20161209-snapshot}
org.elasticsearch.transport.RemoteTransportException: [node-2][10.90.6.234:9340][cluster:admin/snapshot/create]
Caused by: org.elasticsearch.repositories.RepositoryException: [my_hdfs_repo] could not read repository data from index blob
at org.elasticsearch.repositories.blobstore.BlobStoreRepository.getRepositoryData(BlobStoreRepository.java:751) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.snapshots.SnapshotsService.createSnapshot(SnapshotsService.java:226) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.admin.cluster.snapshots.create.TransportCreateSnapshotAction.masterOperation(TransportCreateSnapshotAction.java:82) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.admin.cluster.snapshots.create.TransportCreateSnapshotAction.masterOperation(TransportCreateSnapshotAction.java:41) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.support.master.TransportMasterNodeAction.masterOperation(TransportMasterNodeAction.java:86) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.support.master.TransportMasterNodeAction$AsyncSingleAction$3.doRun(TransportMasterNodeAction.java:170) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:520) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) ~[elasticsearch-5.0.1.jar:5.0.1]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_92]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_92]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_92]
Caused by: java.io.IOException: com.google.protobuf.ServiceException: java.security.AccessControlException: access denied ("javax.security.auth.PrivateCredentialPermission" "org.apache.hadoop.security.Credentials" "read")
at org.apache.hadoop.ipc.ProtobufHelper.getRemoteException(ProtobufHelper.java:47) ~[?:?]
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:580) ~[?:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_92]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[?:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[?:?]
at com.sun.proxy.$Proxy34.getListing(Unknown Source) ~[?:?]
at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2094) ~[?:?]
at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2077) ~[?:?]
at org.apache.hadoop.fs.Hdfs.listStatus(Hdfs.java:254) ~[?:?]
at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1798) ~[?:?]
at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1794) ~[?:?]
at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90) ~[?:?]
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1800) ~[?:?]
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1759) ~[?:?]
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1718) ~[?:?]
at org.elasticsearch.repositories.hdfs.HdfsBlobContainer$6.run(HdfsBlobContainer.java:145) ~[?:?]
at org.elasticsearch.repositories.hdfs.HdfsBlobContainer$6.run(HdfsBlobContainer.java:142) ~[?:?]
at org.elasticsearch.repositories.hdfs.HdfsBlobStore$4.run(HdfsBlobStore.java:136) ~[?:?]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_92]
at java.security.AccessController.doPrivileged(AccessController.java:713) ~[?:1.8.0_92]
at org.elasticsearch.repositories.hdfs.HdfsBlobStore.execute(HdfsBlobStore.java:133) ~[?:?]
at org.elasticsearch.repositories.hdfs.HdfsBlobContainer.listBlobsByPrefix(HdfsBlobContainer.java:142) ~[?:?]
at org.elasticsearch.repositories.blobstore.BlobStoreRepository.listBlobsToGetLatestIndexId(BlobStoreRepository.java:849) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.repositories.blobstore.BlobStoreRepository.latestIndexBlobId(BlobStoreRepository.java:818) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.repositories.blobstore.BlobStoreRepository.getRepositoryData(BlobStoreRepository.java:721) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.snapshots.SnapshotsService.createSnapshot(SnapshotsService.java:226) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.admin.cluster.snapshots.create.TransportCreateSnapshotAction.masterOperation(TransportCreateSnapshotAction.java:82) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.admin.cluster.snapshots.create.TransportCreateSnapshotAction.masterOperation(TransportCreateSnapshotAction.java:41) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.support.master.TransportMasterNodeAction.masterOperation(TransportMasterNodeAction.java:86) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.support.master.TransportMasterNodeAction$AsyncSingleAction$3.doRun(TransportMasterNodeAction.java:170) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:520) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) ~[elasticsearch-5.0.1.jar:5.0.1]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_92]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_92]
at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_92]
Caused by: org.elasticsearch.common.io.stream.NotSerializableExceptionWrapper: service_exception: java.security.AccessControlException: access denied ("javax.security.auth.PrivateCredentialPermission" "org.apache.hadoop.security.Credentials" "read")
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:243) ~[?:?]
at com.sun.proxy.$Proxy33.getListing(Unknown Source) ~[?:?]
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:573) ~[?:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_92]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[?:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[?:?]
at com.sun.proxy.$Proxy34.getListing(Unknown Source) ~[?:?]
at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2094) ~[?:?]
at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2077) ~[?:?]
at org.apache.hadoop.fs.Hdfs.listStatus(Hdfs.java:254) ~[?:?]
at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1798) ~[?:?]
at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1794) ~[?:?]
at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90) ~[?:?]
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1800) ~[?:?]
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1759) ~[?:?]
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1718) ~[?:?]
at org.elasticsearch.repositories.hdfs.HdfsBlobContainer$6.run(HdfsBlobContainer.java:145) ~[?:?]
at org.elasticsearch.repositories.hdfs.HdfsBlobContainer$6.run(HdfsBlobContainer.java:142) ~[?:?]
at org.elasticsearch.repositories.hdfs.HdfsBlobStore$4.run(HdfsBlobStore.java:136) ~[?:?]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_92]
at java.security.AccessController.doPrivileged(AccessController.java:713) ~[?:1.8.0_92]
at org.elasticsearch.repositories.hdfs.HdfsBlobStore.execute(HdfsBlobStore.java:133) ~[?:?]
at org.elasticsearch.repositories.hdfs.HdfsBlobContainer.listBlobsByPrefix(HdfsBlobContainer.java:142) ~[?:?]
at org.elasticsearch.repositories.blobstore.BlobStoreRepository.listBlobsToGetLatestIndexId(BlobStoreRepository.java:849) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.repositories.blobstore.BlobStoreRepository.latestIndexBlobId(BlobStoreRepository.java:818) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.repositories.blobstore.BlobStoreRepository.getRepositoryData(BlobStoreRepository.java:721) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.snapshots.SnapshotsService.createSnapshot(SnapshotsService.java:226) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.admin.cluster.snapshots.create.TransportCreateSnapshotAction.masterOperation(TransportCreateSnapshotAction.java:82) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.admin.cluster.snapshots.create.TransportCreateSnapshotAction.masterOperation(TransportCreateSnapshotAction.java:41) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.support.master.TransportMasterNodeAction.masterOperation(TransportMasterNodeAction.java:86) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.support.master.TransportMasterNodeAction$AsyncSingleAction$3.doRun(TransportMasterNodeAction.java:170) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:520) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) ~[elasticsearch-5.0.1.jar:5.0.1]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_92]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_92]
at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_92]
Caused by: java.lang.SecurityException: access denied ("javax.security.auth.PrivateCredentialPermission" "org.apache.hadoop.security.Credentials" "read")
at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472) ~[?:1.8.0_92]
at java.security.AccessControlContext.checkPermission2(AccessControlContext.java:538) ~[?:1.8.0_92]
at java.security.AccessControlContext.checkPermission(AccessControlContext.java:481) ~[?:1.8.0_92]
at java.security.AccessController.checkPermission(AccessController.java:884) ~[?:1.8.0_92]
at java.lang.SecurityManager.checkPermission(SecurityManager.java:549) ~[?:1.8.0_92]
at javax.security.auth.Subject$ClassSet.populateSet(Subject.java:1414) ~[?:1.8.0_92]
at javax.security.auth.Subject$ClassSet.<init>(Subject.java:1372) ~[?:1.8.0_92]
at javax.security.auth.Subject.getPrivateCredentials(Subject.java:767) ~[?:1.8.0_92]
at org.apache.hadoop.security.UserGroupInformation.getCredentialsInternal(UserGroupInformation.java:1499) ~[?:?]
at org.apache.hadoop.security.UserGroupInformation.getTokens(UserGroupInformation.java:1464) ~[?:?]
at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:436) ~[?:?]
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1519) ~[?:?]
at org.apache.hadoop.ipc.Client.call(Client.java:1446) ~[?:?]
at org.apache.hadoop.ipc.Client.call(Client.java:1407) ~[?:?]
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) ~[?:?]
at com.sun.proxy.$Proxy33.getListing(Unknown Source) ~[?:?]
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:573) ~[?:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_92]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[?:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[?:?]
at com.sun.proxy.$Proxy34.getListing(Unknown Source) ~[?:?]
at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2094) ~[?:?]
at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2077) ~[?:?]
at org.apache.hadoop.fs.Hdfs.listStatus(Hdfs.java:254) ~[?:?]
at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1798) ~[?:?]
at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1794) ~[?:?]
at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90) ~[?:?]
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1800) ~[?:?]
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1759) ~[?:?]
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1718) ~[?:?]
at org.elasticsearch.repositories.hdfs.HdfsBlobContainer$6.run(HdfsBlobContainer.java:145) ~[?:?]
at org.elasticsearch.repositories.hdfs.HdfsBlobContainer$6.run(HdfsBlobContainer.java:142) ~[?:?]
at org.elasticsearch.repositories.hdfs.HdfsBlobStore$4.run(HdfsBlobStore.java:136) ~[?:?]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_92]
at java.security.AccessController.doPrivileged(AccessController.java:713) ~[?:1.8.0_92]
at org.elasticsearch.repositories.hdfs.HdfsBlobStore.execute(HdfsBlobStore.java:133) ~[?:?]
at org.elasticsearch.repositories.hdfs.HdfsBlobContainer.listBlobsByPrefix(HdfsBlobContainer.java:142) ~[?:?]
at org.elasticsearch.repositories.blobstore.BlobStoreRepository.listBlobsToGetLatestIndexId(BlobStoreRepository.java:849) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.repositories.blobstore.BlobStoreRepository.latestIndexBlobId(BlobStoreRepository.java:818) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.repositories.blobstore.BlobStoreRepository.getRepositoryData(BlobStoreRepository.java:721) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.snapshots.SnapshotsService.createSnapshot(SnapshotsService.java:226) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.admin.cluster.snapshots.create.TransportCreateSnapshotAction.masterOperation(TransportCreateSnapshotAction.java:82) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.admin.cluster.snapshots.create.TransportCreateSnapshotAction.masterOperation(TransportCreateSnapshotAction.java:41) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.support.master.TransportMasterNodeAction.masterOperation(TransportMasterNodeAction.java:86) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.action.support.master.TransportMasterNodeAction$AsyncSingleAction$3.doRun(TransportMasterNodeAction.java:170) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:520) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) ~[elasticsearch-5.0.1.jar:5.0.1]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_92]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_92]
at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_92]
Steps to reproduce:
1. Create the repository
PUT /_snapshot/my_backup
{
  "type": "hdfs",
  "settings": {
    "path": "/path/on/hadoop",
    "uri": "hdfs://hadoop_cluster_domain:[port]",
    "conf_location": "/hadoop/hdfs-site.xml,/hadoop/core-site.xml",
    "user": "hadoop"
  }
}
2. Snapshot my index
PUT /_snapshot/my_backup/snapshot_1?wait_for_completion=true
3. The exception above is thrown
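For reference, once a snapshot request succeeds its state can be inspected with the snapshot info API (repository and snapshot names as in the steps above):

```
GET /_snapshot/my_backup/snapshot_1
```

A completed snapshot reports "state": "SUCCESS" in the response.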
Best Answer
Solution: extend the default Java security policy of the HDFS plugin.
The policy file is located at ES_HOME/plugins/repository-hdfs/plugin-security.policy, where ES_HOME is the Elasticsearch installation directory. Add the following permission:
grant {
  // ... existing policies
  permission javax.security.auth.PrivateCredentialPermission "org.apache.hadoop.security.Credentials * \"*\"", "read";
};
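To see why this grant resolves the access-denied error, the wildcard PrivateCredentialPermission from the policy can be compared against the concrete permission the Hadoop client needs when UserGroupInformation reads the Subject's private credentials. This is a minimal sketch; the principal class and name below are hypothetical, since at runtime they depend on whatever principal is attached to the JAAS Subject.

```java
import javax.security.auth.PrivateCredentialPermission;

public class PolicyCheck {
    public static void main(String[] args) {
        // Wildcard permission as granted in plugin-security.policy:
        // credential class org.apache.hadoop.security.Credentials,
        // any principal class ("*"), any principal name ("*").
        PrivateCredentialPermission granted = new PrivateCredentialPermission(
                "org.apache.hadoop.security.Credentials * \"*\"", "read");

        // A concrete permission of the kind checked when Hadoop's
        // UserGroupInformation reads the Subject's private credentials
        // (the principal here is hypothetical).
        PrivateCredentialPermission needed = new PrivateCredentialPermission(
                "org.apache.hadoop.security.Credentials "
                        + "com.sun.security.auth.UserPrincipal \"hadoop\"",
                "read");

        // The wildcard grant implies the concrete permission, so the
        // SecurityManager check seen in the stack trace now passes.
        System.out.println(granted.implies(needed));
    }
}
```

Without the grant, `AccessController.checkPermission` finds no matching entry for the plugin's protection domain and throws the `AccessControlException` seen above.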
Then, in the Elasticsearch startup script (ES_HOME/bin/elasticsearch), add the following Java option to the command-line arguments:
-Djava.security.policy=ES_HOME/plugins/repository-hdfs/plugin-security.policy
Sync the modified files to every node in the cluster, and finally restart all nodes.
Reference
A similar question about this java / ES-v5.0.1 SecurityException when snapshotting was found on Stack Overflow: https://stackoverflow.com/questions/41136179/