I am trying to consume JSON messages in Kafka Streams using the Kafka Connect JSON (de)serializers. I searched Google but could not find anything substantial on how to read JSON messages with the Streams API.
So, with my limited knowledge, I tried the following.
package com.kafka.api.serializers.json;

import java.util.Properties;

import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.ForeachAction;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ConsumerUtilities {

    private static ObjectMapper om = new ObjectMapper();

    public static Properties getProperties() {
        Properties configs = new Properties();
        configs.put(StreamsConfig.APPLICATION_ID_CONFIG, "Kafka test application");
        configs.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        configs.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.ByteArraySerializer");
        configs.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG,
                "org.apache.kafka.connect.json.JsonDeserializer");
        return configs;
    }

    public static KStreamBuilder getStreamingConsumer() {
        KStreamBuilder builder = new KStreamBuilder();
        return builder;
    }

    public static void printStreamData() {
        KStreamBuilder builder = getStreamingConsumer();
        KStream<String, JsonNode> kStream = builder.stream("test");
        kStream.foreach(new ForeachAction<String, JsonNode>() {
            @Override
            public void apply(String key, JsonNode value) {
                try {
                    System.out.println(key + " : " + om.treeToValue(value, Person.class));
                } catch (JsonProcessingException e) {
                    // TODO Auto-generated catch block
                    e.printStackTrace();
                }
            }
        });
        KafkaStreams kafkaStreams = new KafkaStreams(builder, getProperties());
        kafkaStreams.start();
    }
}
package com.kafka.api.serializers.json;

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ProducerUtilities {

    private static ObjectMapper om = new ObjectMapper();

    public static org.apache.kafka.clients.producer.Producer<String, JsonNode> getProducer() {
        Properties configProperties = new Properties();
        configProperties.put(ProducerConfig.CLIENT_ID_CONFIG, "kafka json producer");
        configProperties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        configProperties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.ByteArraySerializer");
        configProperties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.connect.json.JsonSerializer");
        org.apache.kafka.clients.producer.Producer<String, JsonNode> producer =
                new KafkaProducer<String, JsonNode>(configProperties);
        return producer;
    }

    public ProducerRecord<String, JsonNode> createRecord(Person person) {
        JsonNode jsonNode = om.valueToTree(person);
        ProducerRecord<String, JsonNode> record =
                new ProducerRecord<String, JsonNode>("test", jsonNode);
        return record;
    }
}
When I run the code, I get the following exception:
[Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29-StreamThread-1] ERROR org.apache.kafka.clients.consumer.internals.ConsumerCoordinator - User provided listener org.apache.kafka.streams.processor.internals.StreamThread$RebalanceListener for group Kafka test application failed on partition assignment
org.apache.kafka.streams.errors.StreamsException: Failed to configure value serde class org.apache.kafka.connect.json.JsonDeserializer
at org.apache.kafka.streams.StreamsConfig.defaultValueSerde(StreamsConfig.java:770)
at org.apache.kafka.streams.processor.internals.AbstractProcessorContext.<init>(AbstractProcessorContext.java:59)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.<init>(ProcessorContextImpl.java:40)
at org.apache.kafka.streams.processor.internals.StreamTask.<init>(StreamTask.java:138)
at org.apache.kafka.streams.processor.internals.StreamThread.createStreamTask(StreamThread.java:1078)
at org.apache.kafka.streams.processor.internals.StreamThread$TaskCreator.createTask(StreamThread.java:255)
at org.apache.kafka.streams.processor.internals.StreamThread$AbstractTaskCreator.createTasks(StreamThread.java:245)
at org.apache.kafka.streams.processor.internals.StreamThread.addStreamTasks(StreamThread.java:1147)
at org.apache.kafka.streams.processor.internals.StreamThread.access$800(StreamThread.java:68)
at org.apache.kafka.streams.processor.internals.StreamThread$RebalanceListener.onPartitionsAssigned(StreamThread.java:184)
at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.onJoinComplete(ConsumerCoordinator.java:265)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.joinGroupIfNeeded(AbstractCoordinator.java:367)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.ensureActiveGroup(AbstractCoordinator.java:316)
at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.poll(ConsumerCoordinator.java:297)
at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:1078)
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1043)
at org.apache.kafka.streams.processor.internals.StreamThread.pollRequests(StreamThread.java:536)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:490)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:480)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:457)
Caused by: org.apache.kafka.common.KafkaException: org.apache.kafka.connect.json.JsonDeserializer is not an instance of org.apache.kafka.common.serialization.Serde
at org.apache.kafka.common.config.AbstractConfig.getConfiguredInstance(AbstractConfig.java:248)
at org.apache.kafka.streams.StreamsConfig.defaultValueSerde(StreamsConfig.java:764)
... 19 more
[Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29-StreamThread-1] INFO org.apache.kafka.streams.processor.internals.StreamThread - stream-thread [Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29-StreamThread-1] Shutting down
[Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29-StreamThread-1] INFO org.apache.kafka.streams.processor.internals.StreamThread - stream-thread [Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29-StreamThread-1] State transition from PARTITIONS_ASSIGNED to PENDING_SHUTDOWN.
[Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29-StreamThread-1] INFO org.apache.kafka.clients.producer.KafkaProducer - Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
[Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29-StreamThread-1] INFO org.apache.kafka.streams.processor.internals.StreamThread - stream-thread [Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29-StreamThread-1] Stream thread shutdown complete
[Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29-StreamThread-1] INFO org.apache.kafka.streams.processor.internals.StreamThread - stream-thread [Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29-StreamThread-1] State transition from PENDING_SHUTDOWN to DEAD.
[Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29-StreamThread-1] WARN org.apache.kafka.streams.KafkaStreams - stream-client [Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29] All stream threads have died. The Kafka Streams instance will be in an error state and should be closed.
[Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29-StreamThread-1] INFO org.apache.kafka.streams.KafkaStreams - stream-client [Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29] State transition from REBALANCING to ERROR.
Exception in thread "Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29-StreamThread-1" org.apache.kafka.streams.errors.StreamsException: stream-thread [Kafka test application-d3a307c9-d998-421f-829c-85532efc8b29-StreamThread-1] Failed to rebalance.
at org.apache.kafka.streams.processor.internals.StreamThread.pollRequests(StreamThread.java:543)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:490)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:480)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:457)
Caused by: org.apache.kafka.streams.errors.StreamsException: Failed to configure value serde class org.apache.kafka.connect.json.JsonDeserializer
at org.apache.kafka.streams.StreamsConfig.defaultValueSerde(StreamsConfig.java:770)
at org.apache.kafka.streams.processor.internals.AbstractProcessorContext.<init>(AbstractProcessorContext.java:59)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.<init>(ProcessorContextImpl.java:40)
at org.apache.kafka.streams.processor.internals.StreamTask.<init>(StreamTask.java:138)
at org.apache.kafka.streams.processor.internals.StreamThread.createStreamTask(StreamThread.java:1078)
at org.apache.kafka.streams.processor.internals.StreamThread$TaskCreator.createTask(StreamThread.java:255)
at org.apache.kafka.streams.processor.internals.StreamThread$AbstractTaskCreator.createTasks(StreamThread.java:245)
at org.apache.kafka.streams.processor.internals.StreamThread.addStreamTasks(StreamThread.java:1147)
at org.apache.kafka.streams.processor.internals.StreamThread.access$800(StreamThread.java:68)
at org.apache.kafka.streams.processor.internals.StreamThread$RebalanceListener.onPartitionsAssigned(StreamThread.java:184)
at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.onJoinComplete(ConsumerCoordinator.java:265)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.joinGroupIfNeeded(AbstractCoordinator.java:367)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.ensureActiveGroup(AbstractCoordinator.java:316)
at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.poll(ConsumerCoordinator.java:297)
at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:1078)
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1043)
at org.apache.kafka.streams.processor.internals.StreamThread.pollRequests(StreamThread.java:536)
... 3 more
Caused by: org.apache.kafka.common.KafkaException: org.apache.kafka.connect.json.JsonDeserializer is not an instance of org.apache.kafka.common.serialization.Serde
at org.apache.kafka.common.config.AbstractConfig.getConfiguredInstance(AbstractConfig.java:248)
at org.apache.kafka.streams.StreamsConfig.defaultValueSerde(StreamsConfig.java:764)
... 19 more
I am looking for guidance on how to resolve this problem.
Following Matthias's suggestion, I created a custom serializer and deserializer:
package com.kafka.api.utilities;

import java.util.Properties;

import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.ForeachAction;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;

import com.kafka.api.models.Person;
import com.kafka.api.serdes.JsonDeserializer;
import com.kafka.api.serdes.JsonSerializer;

public class ConsumerUtilities {

    public static Properties getProperties() {
        Properties configs = new Properties();
        configs.put(StreamsConfig.APPLICATION_ID_CONFIG, "Kafka test application");
        configs.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // configs.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG,
        //         "org.apache.kafka.common.serialization.ByteArraySerializer");
        // configs.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG,
        //         "org.apache.kafka.connect.json.JsonDeserializer");
        return configs;
    }

    public static KStreamBuilder getStreamingConsumer() {
        KStreamBuilder builder = new KStreamBuilder();
        return builder;
    }

    public static void printStreamData() {
        JsonSerializer<Person> personJsonSerializer = new JsonSerializer<>();
        JsonDeserializer<Person> personJsonDeserializer = new JsonDeserializer<>(Person.class);
        Serde<Person> personSerde = Serdes.serdeFrom(personJsonSerializer, personJsonDeserializer);
        KStreamBuilder builder = getStreamingConsumer();
        KStream<String, Person> kStream = builder.stream(Serdes.String(), personSerde, "test");
        kStream.foreach(new ForeachAction<String, Person>() {
            @Override
            public void apply(String key, Person value) {
                System.out.println(key + " : " + value.toString());
            }
        });
        KafkaStreams kafkaStreams = new KafkaStreams(builder, getProperties());
        kafkaStreams.start();
    }
}
package com.kafka.api.serdes;

import java.util.Map;

import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Deserializer;

import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonDeserializer<T> implements Deserializer<T> {

    private ObjectMapper om = new ObjectMapper();
    private Class<T> type;

    /*
     * Default constructor needed by kafka
     */
    public JsonDeserializer(Class<T> type) {
        this.type = type;
    }

    @Override
    public void close() {
        // TODO Auto-generated method stub
    }

    @SuppressWarnings("unchecked")
    @Override
    public void configure(Map<String, ?> map, boolean arg1) {
        if (type == null) {
            type = (Class<T>) map.get("type");
        }
    }

    @Override
    public T deserialize(String topic, byte[] bytes) {
        if (bytes == null || bytes.length == 0) {
            return null;
        }
        try {
            return om.readValue(bytes, type);
        } catch (Exception e) {
            throw new SerializationException(e);
        }
    }

    protected Class<T> getType() {
        return type;
    }
}
package com.kafka.api.serdes;

import java.util.Map;

import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Serializer;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonSerializer<T> implements Serializer<T> {

    private ObjectMapper om = new ObjectMapper();

    @Override
    public void close() {
        // TODO Auto-generated method stub
    }

    @Override
    public void configure(Map<String, ?> config, boolean isKey) {
        // TODO Auto-generated method stub
    }

    @Override
    public byte[] serialize(String topic, T data) {
        try {
            return om.writeValueAsBytes(data);
        } catch (JsonProcessingException e) {
            throw new SerializationException(e);
        }
    }
}
Exception: after running the streams application, I got the following exception. I am confused by it.
[Kafka test application-cee84455-78ca-4a2f-881a-89b3c3a00e4b-StreamThread-1] INFO org.apache.kafka.streams.KafkaStreams - stream-client [Kafka test application-cee84455-78ca-4a2f-881a-89b3c3a00e4b] State transition from RUNNING to ERROR.
Exception in thread "Kafka test application-cee84455-78ca-4a2f-881a-89b3c3a00e4b-StreamThread-1" org.apache.kafka.streams.errors.StreamsException: Failed to deserialize value for record. topic=test, partition=0, offset=0
at org.apache.kafka.streams.processor.internals.SourceNodeRecordDeserializer.deserialize(SourceNodeRecordDeserializer.java:46)
at org.apache.kafka.streams.processor.internals.RecordQueue.addRawRecords(RecordQueue.java:84)
at org.apache.kafka.streams.processor.internals.PartitionGroup.addRawRecords(PartitionGroup.java:117)
at org.apache.kafka.streams.processor.internals.StreamTask.addRecords(StreamTask.java:483)
at org.apache.kafka.streams.processor.internals.StreamThread.addRecordsToTasks(StreamThread.java:604)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:512)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:480)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:457)
Caused by: org.apache.kafka.common.errors.SerializationException: com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'hello': was expecting ('true', 'false' or 'null')
at [Source: [B@5ee179dc; line: 1, column: 11]
Caused by: com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'hello': was expecting ('true', 'false' or 'null')
at [Source: [B@5ee179dc; line: 1, column: 11]
at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1702)
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:558)
at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._reportInvalidToken(UTF8StreamJsonParser.java:3524)
at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._handleUnexpectedValue(UTF8StreamJsonParser.java:2686)
at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._nextTokenNotInObject(UTF8StreamJsonParser.java:878)
at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.nextToken(UTF8StreamJsonParser.java:772)
at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:3834)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3783)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2929)
at com.kafka.api.serdes.JsonDeserializer.deserialize(JsonDeserializer.java:43)
at org.apache.kafka.common.serialization.ExtendedDeserializer$Wrapper.deserialize(ExtendedDeserializer.java:65)
at org.apache.kafka.common.serialization.ExtendedDeserializer$Wrapper.deserialize(ExtendedDeserializer.java:55)
at org.apache.kafka.streams.processor.internals.SourceNode.deserializeValue(SourceNode.java:56)
at org.apache.kafka.streams.processor.internals.SourceNodeRecordDeserializer.deserialize(SourceNodeRecordDeserializer.java:44)
at org.apache.kafka.streams.processor.internals.RecordQueue.addRawRecords(RecordQueue.java:84)
at org.apache.kafka.streams.processor.internals.PartitionGroup.addRawRecords(PartitionGroup.java:117)
at org.apache.kafka.streams.processor.internals.StreamTask.addRecords(StreamTask.java:483)
at org.apache.kafka.streams.processor.internals.StreamThread.addRecordsToTasks(StreamThread.java:604)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:512)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:480)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:457)
Best answer
The Streams API needs to read and write data, and thus it uses the abstraction of a Serde, which is a wrapper around a serializer and a deserializer at the same time. That is basically what the exception is telling you:

Caused by: org.apache.kafka.common.KafkaException: org.apache.kafka.connect.json.JsonDeserializer is not an instance of org.apache.kafka.common.serialization.Serde

Thus, you need to wrap your JsonSerializer and JsonDeserializer into a JsonSerde and use this JsonSerde in the StreamsConfig. The simplest way is to use the Serdes.serdeFrom(...) method (note: Serdes -- plural). Alternatively, you can implement the Serde interface directly (note: Serde -- singular). You can find examples of how to implement the Serde interface in the Serdes class.
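To illustrate the wrapping the answer describes, here is a minimal sketch of the Serde pattern. Note the three interfaces below are simplified stand-ins declared locally so the sketch compiles without the Kafka jars; the real ones live in org.apache.kafka.common.serialization and have a few more methods (configure, close). The serdeFrom helper mirrors the shape of Serdes.serdeFrom(...): it just bundles a serializer and a deserializer into one object.

```java
import java.nio.charset.StandardCharsets;

// Simplified stand-ins for Kafka's interfaces (assumption: the real ones
// also declare configure/close, omitted here for brevity).
interface Serializer<T> { byte[] serialize(String topic, T data); }
interface Deserializer<T> { T deserialize(String topic, byte[] bytes); }
interface Serde<T> { Serializer<T> serializer(); Deserializer<T> deserializer(); }

public class SerdeSketch {

    // Mirrors Serdes.serdeFrom(serializer, deserializer): a Serde is
    // nothing more than the two halves packaged together.
    static <T> Serde<T> serdeFrom(Serializer<T> ser, Deserializer<T> de) {
        return new Serde<T>() {
            public Serializer<T> serializer() { return ser; }
            public Deserializer<T> deserializer() { return de; }
        };
    }

    public static void main(String[] args) {
        // A trivial String serde; in the question's code the same call would
        // pair the Jackson-based JsonSerializer/JsonDeserializer instead.
        Serde<String> stringSerde = serdeFrom(
                (topic, s) -> s.getBytes(StandardCharsets.UTF_8),
                (topic, b) -> new String(b, StandardCharsets.UTF_8));

        // Round-trip a value through the serde, as Streams would on a topic.
        byte[] wire = stringSerde.serializer().serialize("test", "hello serde");
        String back = stringSerde.deserializer().deserialize("test", wire);
        System.out.println(back);
    }
}
```

This is also why setting DEFAULT_VALUE_SERDE_CLASS_CONFIG to a plain deserializer class fails: Streams instantiates that class and expects the combined Serde shape, not one half of it.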
Regarding "java - Consuming JSON values with the Kafka Connect JSON API in Kafka Streams: JAVA", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/46565829/