INFO ConnectionHandler: onConnectionLocalClose: hostname[edgeeventhub.servicebus.windows.net:5671], errorCondition[null, null]
19/01/09 02:34:00 INFO ConnectionHandler: onConnectionUnbound: hostname[edgeeventhub.servicebus.windows.net:5671], state[CLOSED], remoteState[ACTIVE]
19/01/09 02:34:00 INFO SessionHandler: entityName[mgmt-session], condition[Error{condition=null, description='null', info=null}]
19/01/09 02:34:00 INFO SessionHandler: entityName[cbs-session], condition[Error{condition=null, description='null', info=null}]
19/01/09 02:34:00 INFO SessionHandler: entityName[mgmt-session]
19/01/09 02:34:00 INFO SessionHandler: entityName[cbs-session]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/17]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/28]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/19]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/29]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/1]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/8]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/16]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/9]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/14]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/24]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/25]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/15]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/26]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/18]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/27]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/4]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/5]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/7]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/3]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/6]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/22]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/0]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/21]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/20]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/23]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/2]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/10]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/13]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/31]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/12]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/11]
19/01/09 02:34:00 INFO SessionHandler: entityName[edgeeventhub/ConsumerGroups/$Default/Partitions/30]
19/01/09 02:34:00 ERROR StreamExecution: Query [id = e5b72043-d85a-4004-9f1c-dc3aaa77a0bc, runId = 75261962-7648-4d7d-90e9-b2ed0906d2b7] terminated with error
java.util.concurrent.ExecutionException: com.microsoft.azure.eventhubs.EventHubException: connection aborted
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
at org.apache.spark.eventhubs.client.EventHubsClient.getRunTimeInfo(EventHubsClient.scala:112)
at org.apache.spark.eventhubs.client.EventHubsClient.boundedSeqNos(EventHubsClient.scala:149)
at org.apache.spark.sql.eventhubs.EventHubsSource$$anonfun$6.apply(EventHubsSource.scala:130)
at org.apache.spark.sql.eventhubs.EventHubsSource$$anonfun$6.apply(EventHubsSource.scala:128)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractTraversable.map(Traversable.scala:104)
at org.apache.spark.sql.eventhubs.EventHubsSource.getOffset(EventHubsSource.scala:128)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anonfun$10$$anonfun$apply$6.apply(StreamExecution.scala:521)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anonfun$10$$anonfun$apply$6.apply(StreamExecution.scala:521)
at org.apache.spark.sql.execution.streaming.ProgressReporter$class.reportTimeTaken(ProgressReporter.scala:279)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anonfun$10.apply(StreamExecution.scala:520)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anonfun$10.apply(StreamExecution.scala:518)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractTraversable.map(Traversable.scala:104)
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$constructNextBatch(StreamExecution.scala:518)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anonfun$org$apache$spark$sql$execution$streaming$StreamExecution$$runBatches$1$$anonfun$apply$mcZ$sp$1.apply$mcV$sp(StreamExecution.scala:301)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anonfun$org$apache$spark$sql$execution$streaming$StreamExecution$$runBatches$1$$anonfun$apply$mcZ$sp$1.apply(StreamExecution.scala:294)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anonfun$org$apache$spark$sql$execution$streaming$StreamExecution$$runBatches$1$$anonfun$apply$mcZ$sp$1.apply(StreamExecution.scala:294)
at org.apache.spark.sql.execution.streaming.ProgressReporter$class.reportTimeTaken(ProgressReporter.scala:279)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anonfun$org$apache$spark$sql$execution$streaming$StreamExecution$$runBatches$1.apply$mcZ$sp(StreamExecution.scala:294)
at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:56)
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runBatches(StreamExecution.scala:290)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:206)
Caused by: com.microsoft.azure.eventhubs.EventHubException: connection aborted
at com.microsoft.azure.eventhubs.impl.ExceptionUtil.toException(ExceptionUtil.java:58)
at com.microsoft.azure.eventhubs.impl.RequestResponseChannel$ResponseHandler.onClose(RequestResponseChannel.java:250)
at com.microsoft.azure.eventhubs.impl.BaseLinkHandler.processOnClose(BaseLinkHandler.java:50)
at com.microsoft.azure.eventhubs.impl.MessagingFactory.onConnectionError(MessagingFactory.java:266)
at com.microsoft.azure.eventhubs.impl.ConnectionHandler.onTransportError(ConnectionHandler.java:105)
at org.apache.qpid.proton.engine.BaseHandler.handle(BaseHandler.java:191)
at org.apache.qpid.proton.engine.impl.EventImpl.dispatch(EventImpl.java:108)
at org.apache.qpid.proton.reactor.impl.ReactorImpl.dispatch(ReactorImpl.java:324)
at org.apache.qpid.proton.reactor.impl.ReactorImpl.process(ReactorImpl.java:291)
at com.microsoft.azure.eventhubs.impl.MessagingFactory$RunReactor.run(MessagingFactory.java:462)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
19/01/09 02:34:00 INFO EventHubsClient: close: Closing EventHubsClient.
19/01/09 02:34:00 INFO ClientConnectionPool: Client returned. EventHub name: edgeeventhub. Total clients: 3. Available clients: 3
Exception in thread "main" org.apache.spark.sql.streaming.StreamingQueryException: com.microsoft.azure.eventhubs.EventHubException: connection aborted
=== Streaming Query ===
Identifier: [id = e5b72043-d85a-4004-9f1c-dc3aaa77a0bc, runId = 75261962-7648-4d7d-90e9-b2ed0906d2b7]
Current Committed Offsets: {org.apache.spark.sql.eventhubs.EventHubsSource@e8217e7: {"edgeeventhub":{"23":16394,"8":16404,"17":16406,"26":16396,"11":16403,"29":16400,"2":16408,"20":16400,"5":16401,"14":16401,"4":16405,"13":16404,"31":16400,"22":16394,"7":16399,"16":16406,"25":16400,"10":16405,"1":16402,"28":16411,"19":16405,"27":16406,"9":16400,"18":16408,"12":16401,"3":16403,"21":16397,"30":16398,"15":16407,"6":16405,"24":16391,"0":16406}}}
Current Available Offsets: {org.apache.spark.sql.eventhubs.EventHubsSource@e8217e7: {"edgeeventhub":{"23":16394,"8":16404,"17":16406,"26":16396,"11":16403,"29":16400,"2":16408,"20":16400,"5":16401,"14":16401,"4":16405,"13":16404,"31":16400,"22":16394,"7":16399,"16":16406,"25":16400,"10":16405,"1":16402,"28":16411,"19":16405,"27":16406,"9":16400,"18":16408,"12":16401,"3":16403,"21":16397,"30":16398,"15":16407,"6":16405,"24":16391,"0":16406}}}
Current State: ACTIVE
Thread State: RUNNABLE
I am hitting the exception above in a long-running Spark job that reads data from an Event Hub and simultaneously saves it to Redis; a separate scheduler then pulls the accumulated data and writes it to a database. The process runs perfectly for 8-10 hours, after which I see the problem above. I am using the following SDK: azure-eventhubs-spark_2.11, version 2.2.1. Please advise, as I am facing this issue in production.
import org.apache.spark.eventhubs.EventHubsConf
import org.apache.spark.sql.types.{LongType, StringType, TimestampType}
import spark.implicits._ // for the $"col" column syntax
val maxEventTrigger: Long = Constants.maxEventTrigger.toLong
val customEventhubParameters = EventHubsConf(connStr).setMaxEventsPerTrigger(maxEventTrigger)
val incomingStream = spark.readStream.format("eventhubs").options(customEventhubParameters.toMap).load()
logger.info("Data has been fetched from event hub successfully")
val messages = incomingStream
  .withColumn("Offset", $"offset".cast(LongType))
  .withColumn("Time (readable)", $"enqueuedTime".cast(TimestampType))
  .withColumn("Timestamp", $"enqueuedTime".cast(LongType))
  .withColumn("Body", $"body".cast(StringType))
  .select("Offset", "Time (readable)", "Timestamp", "Body")
implicit val formats = DefaultFormats
val ob = new EventhubMaster()
ob.execute(messages)
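Because the "connection aborted" error is transient, a common mitigation (a sketch of my own, not from the original post) is to restart the streaming query when it dies instead of letting the driver exit. The restart loop itself can be written as a plain helper; in the post's setting, `runOnce` would start the query and block on `awaitTermination()`:

```scala
// Hypothetical sketch: retry a blocking run of a streaming query a bounded
// number of times when it fails with a transient error such as
// "connection aborted". Returns how many restarts were needed.
def runWithRestarts(maxRestarts: Int)(runOnce: () => Unit): Int = {
  var attempts = 0
  var done = false
  while (!done) {
    try {
      runOnce()    // in the real job: start the query and awaitTermination()
      done = true  // clean stop: do not restart
    } catch {
      case e: Exception if attempts < maxRestarts =>
        attempts += 1 // transient failure: loop around and restart the query
      // once maxRestarts is exhausted the exception propagates to the caller
    }
  }
  attempts
}
```

Whether a restart is safe depends on the sink being idempotent; since the post writes to Redis keyed by offset, replaying a batch after a restart should overwrite rather than duplicate.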
Accepted answer
Creating a separate consumer group on the Event Hub for each Spark job resolved the issue.
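The fix boils down to connection configuration. A minimal sketch, assuming a consumer group named "spark-job-1" has already been created on the hub (the group name, namespace, and key values below are illustrative placeholders):

```scala
import org.apache.spark.eventhubs.{ConnectionStringBuilder, EventHubsConf}

// Give each Spark job its own consumer group so that concurrent jobs do not
// compete for the same partition receivers; receiver (epoch) conflicts on a
// shared group surface as aborted AMQP connections.
val connStr = ConnectionStringBuilder()
  .setNamespaceName("mynamespace")              // placeholder namespace
  .setEventHubName("edgeeventhub")
  .setSasKeyName("RootManageSharedAccessKey")   // placeholder key name
  .setSasKey("<key>")                           // placeholder key
  .build

val conf = EventHubsConf(connStr)
  .setConsumerGroup("spark-job-1")   // one dedicated group per Spark job
  .setMaxEventsPerTrigger(5000L)     // illustrative trigger cap
```

The resulting `conf` is passed to `spark.readStream.format("eventhubs").options(conf.toMap)` exactly as in the question's snippet.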
For this question (azure - Exception when subscribing to an Event Hub with Spark), a similar question was found on Stack Overflow: https://stackoverflow.com/questions/54103321/