java - H264 encoded RTP video stream in Swing with the Libjitsi library - how do I render the stream?

I am using the Java library at https://jitsi.org/Projects/LibJitsi.

I want to stream H264 video over RTP (in this case the video is a desktop/screen capture) and then render it. I can figure out how to stream it, but not how to render the stream. Given the following code (fully compilable and runnable with the Libjitsi jars and native libraries), what do I do next to render the video stream into a Swing JFrame or JPanel? Apparently there is some kind of JMF JAWTRenderer, or I could use the Java Media Framework (JMF), Freedom for Media in Java (FMJ), JavaFX embedded in Swing, or the VLC media player embedded in Swing via the VLCj library. What is the best (simplest, well-performing, bug-free, not deprecated) way to render that RTP video stream in a Java Swing application?

Also, at the very bottom, I have some related questions.

import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.SocketException;
import java.util.List;
import java.util.Map;
import org.jitsi.service.libjitsi.LibJitsi;
import org.jitsi.service.neomedia.DefaultStreamConnector;
import org.jitsi.service.neomedia.MediaDirection;
import org.jitsi.service.neomedia.MediaService;
import org.jitsi.service.neomedia.MediaStream;
import org.jitsi.service.neomedia.MediaStreamTarget;
import org.jitsi.service.neomedia.MediaType;
import org.jitsi.service.neomedia.MediaUseCase;
import org.jitsi.service.neomedia.StreamConnector;
import org.jitsi.service.neomedia.device.MediaDevice;
import org.jitsi.service.neomedia.format.MediaFormat;

/**
 * This class streams screen-recorded video. It can either send an H264 encoded
 * RTP stream or receive one, depending on the value of the variable
 * isReceivingVideo_.
 */
public class VideoStreamer {

    // Set to false if sending video, set to true if receiving video.
    private static final boolean isReceivingVideo_ = true;

    public final MediaService mediaService_;
    private final Map<MediaFormat, Byte> RTP_payload_number_map_;

    public static final int LOCAL_BASE_PORT_NUMBER = 15000;
    public static final String REMOTE_HOST_IP_ADDRESS = "127.0.0.1";
    public static final int REMOTE_BASE_PORT_NUMBER = 10000;

    private MediaStream videoMediaStream_;
    private final int localBasePort_;
    private final InetAddress remoteAddress_;
    private final int remoteBasePort_;

    /**
     * Initializes a new VideoStreamer instance which is to send or receive
     * video from a specific host and a specific port.
     *
     * @param isReceiver - true if this instance of VideoStreamer is receiving a
     * video stream, false if it is sending a video stream.
     */
    public VideoStreamer(boolean isReceiver) throws IOException {
        this.remoteAddress_ = InetAddress.getByName(REMOTE_HOST_IP_ADDRESS);
        mediaService_ = LibJitsi.getMediaService();
        RTP_payload_number_map_ = mediaService_.getDynamicPayloadTypePreferences();
        if (isReceiver) {
            this.localBasePort_ = LOCAL_BASE_PORT_NUMBER;
            this.remoteBasePort_ = REMOTE_BASE_PORT_NUMBER;
            startVideoStream(MediaDirection.RECVONLY);
        } else {
            // Switch the local and remote ports for the transmitter so they hook up with the receiver.
            this.localBasePort_ = REMOTE_BASE_PORT_NUMBER;
            this.remoteBasePort_ = LOCAL_BASE_PORT_NUMBER;
            startVideoStream(MediaDirection.SENDONLY);
        }
    }

    /**
     * Initializes the video stream, starts it, and listens for any incoming
     * packets.
     *
     * @param intended_direction either sending or receiving an RTP video
     * stream.
     */
    public final void startVideoStream(final MediaDirection intended_direction) throws SocketException {
        final MediaType video_media_type = MediaType.VIDEO;
        final int local_video_port = localBasePort_;
        final int remote_video_port = remoteBasePort_;
        MediaDevice video_media_device = mediaService_.getDefaultDevice(video_media_type, MediaUseCase.DESKTOP);
        final MediaStream video_media_stream = mediaService_.createMediaStream(video_media_device);
        video_media_stream.setDirection(intended_direction);
        // Obtain the list of formats that are available for this video_media_device and pick H264 if available.
        MediaFormat video_format = null;
        final List<MediaFormat> supported_video_formats = video_media_device.getSupportedFormats();
        for (final MediaFormat available_video_format : supported_video_formats) {
            final String encoding = available_video_format.getEncoding();
            final double clock_rate = available_video_format.getClockRate();
            if (encoding.equals("H264") && clock_rate == 90000) {
                video_format = available_video_format;
            }
        }
        if (video_format == null) {
            System.out.println("You do not have the H264 video codec");
            System.exit(-1);
        }
        final byte dynamic_RTP_payload_type_for_H264 = getRTPDynamicPayloadType(video_format);
        if (dynamic_RTP_payload_type_for_H264 < 96 || dynamic_RTP_payload_type_for_H264 > 127) {
            System.out.println("Invalid RTP payload type number");
            System.exit(-1);
        }
        video_media_stream.addDynamicRTPPayloadType(dynamic_RTP_payload_type_for_H264, video_format);
        video_media_stream.setFormat(video_format);
        final int local_RTP_video_port = local_video_port + 0;
        final int local_RTCP_video_port = local_video_port + 1;
        final StreamConnector video_connector = new DefaultStreamConnector(
                new DatagramSocket(local_RTP_video_port),
                new DatagramSocket(local_RTCP_video_port)
        );
        video_media_stream.setConnector(video_connector);
        final int remote_RTP_video_port = remote_video_port + 0;
        final int remote_RTCP_video_port = remote_video_port + 1;
        video_media_stream.setTarget(new MediaStreamTarget(
                new InetSocketAddress(remoteAddress_, remote_RTP_video_port),
                new InetSocketAddress(remoteAddress_, remote_RTCP_video_port))
        );
        video_media_stream.setName(video_media_type.toString());
        this.videoMediaStream_ = video_media_stream;
        videoMediaStream_.start();
        listenForVideoPackets(video_connector.getDataSocket());
    }

    public void listenForVideoPackets(final DatagramSocket videoDataSocket) {
        new Thread(new Runnable() {
            @Override
            public void run() {
                boolean socket_is_closed = false;
                while (!socket_is_closed) {
                    final byte[] buffer = new byte[5000];
                    final DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                    try {
                        videoDataSocket.receive(packet);
                        final byte[] packet_data = new byte[packet.getLength()];
                        System.arraycopy(packet.getData(), packet.getOffset(), packet_data, 0, packet.getLength());
                        final StringBuilder string_builder = new StringBuilder();
                        for (int i = 0; i < ((packet_data.length > 30) ? 30 : packet_data.length); ++i) {
                            byte b = packet_data[i];
                            string_builder.append(String.format("%02X ", b));
                        }
                        System.out.println("First thirty (or fewer) bytes of packet in hex: " + string_builder.toString());
                    } catch (SocketException socket_closed) {
                        System.out.println("Socket is closed");
                        socket_is_closed = true;
                    } catch (IOException exception) {
                        exception.printStackTrace();
                    }
                }
            }
        }).start();
    }

    /**
     * Checks if the given format exists in the list of formats with listed
     * dynamic RTP payload numbers and returns that number.
     *
     * @param format - format to look up an RTP payload number for
     * @return - the RTP payload number on success, or -1 if no payload number
     * can be found or the payload number is static.
     */
    public byte getRTPDynamicPayloadType(final MediaFormat format) {
        for (Map.Entry<MediaFormat, Byte> entry : RTP_payload_number_map_.entrySet()) {
            final MediaFormat map_format = entry.getKey();
            final Byte rtp_payload_type = entry.getValue();
            if (map_format.getClockRate() == format.getClockRate() && map_format.getEncoding().equals(format.getEncoding())) {
                return rtp_payload_type;
            }
        }
        return -1;
    }

    /**
     * Close the MediaStream.
     */
    public void close() {
        try {
            this.videoMediaStream_.stop();
        } finally {
            this.videoMediaStream_.close();
            this.videoMediaStream_ = null;
        }
    }

    public static void main(String[] args) throws Exception {
        LibJitsi.start();
        try {
            VideoStreamer rtp_streamer = new VideoStreamer(isReceivingVideo_);
            try {
                /*
                 * Wait for the media to be received and (hopefully) played back.
                 * Transmits for 1 minute and receives for 30 seconds to allow the
                 * transmission to have a delay (if necessary).
                 */
                final long then = System.currentTimeMillis();
                final long waiting_period;
                if (isReceivingVideo_) {
                    waiting_period = 30000;
                } else {
                    waiting_period = 60000;
                }
                try {
                    while (System.currentTimeMillis() - then < waiting_period) {
                        Thread.sleep(1000);
                    }
                } catch (InterruptedException ie) {
                }
            } finally {
                rtp_streamer.close();
            }
            System.err.println("Exiting VideoStreamer");
        } finally {
            LibJitsi.stop();
        }
    }
}

When I run the above code, first linking the Libjitsi jar files (listing them under "Libraries") and specifying the location of the native (.so, .dll) libraries via "-Djava.library.path=/path/to/native/libraries", I first run it with isReceivingVideo_ = true and then run another instance with isReceivingVideo_ = false, and the two instances of the application then stream to each other. I also have a method, public void listenForVideoPackets, which prints the first 30 bytes of each packet in hexadecimal. When I run it, I get the following hex byte values:

[Image: Hex values for RTP byte array stream]

I am only an undergraduate, so my networking knowledge is limited. Could someone explain what all these hex patterns mean? Why does the fourth byte of the RTP packets keep increasing (33, 35, 37, 39, etc.)? Why is the first packet only 16 bytes while the other packets are longer? What does the first packet mean? Why are the first 12 or so bytes of every packet the same, apart from the fourth byte, which keeps increasing? What do these numbers mean, and what should I do with this RTP stream?

Best Answer

I found a folder called "PacketPlayer" in someone's libjitsi examples folder (not the one that ships with the library). Their git may contain some useful hints... https://github.com/Metaswitch/libjitsi/tree/master/src/org/jitsi/examples/PacketPlayer

Note that there is a "VideoContainer" class that may be useful. See https://github.com/jitsi/libjitsi/blob/master/src/org/jitsi/util/swing/VideoContainer.java
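To put that together, here is a minimal, untested sketch of one way to render the received video on the Swing side. It rests on assumptions, not on anything confirmed in the question: that the MediaStream libjitsi creates for a video device can be cast to org.jitsi.service.neomedia.VideoMediaStream, that VideoMediaStream exposes getVisualComponents() and addVideoListener(...) with the VideoListener/VideoEvent types from org.jitsi.util.event as in the libjitsi source linked above, and that VideoContainer takes a "no video yet" placeholder component in its constructor. The method and helper names (showRemoteVideo, addToContainer) are made up for illustration.

import java.awt.BorderLayout;
import java.awt.Component;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.SwingUtilities;
import org.jitsi.service.neomedia.VideoMediaStream;
import org.jitsi.util.event.VideoEvent;
import org.jitsi.util.event.VideoListener;
import org.jitsi.util.swing.VideoContainer;

    // Hypothetical addition to VideoStreamer: call this on the receiving side,
    // e.g. right after videoMediaStream_.start() in startVideoStream(...).
    private void showRemoteVideo() {
        if (!(videoMediaStream_ instanceof VideoMediaStream)) {
            return; // not a video stream, nothing to render
        }
        final VideoMediaStream video_stream = (VideoMediaStream) videoMediaStream_;
        final VideoContainer video_container =
                new VideoContainer(new JLabel("Waiting for video..."), false);

        final JFrame frame = new JFrame("Remote video");
        frame.getContentPane().add(video_container, BorderLayout.CENTER);
        frame.setSize(800, 600);
        frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
        frame.setVisible(true);

        // Pick up any visual components libjitsi has already created for remote video...
        for (final Component component : video_stream.getVisualComponents()) {
            addToContainer(video_container, component);
        }
        // ...and react when a component appears once the first frames are decoded.
        video_stream.addVideoListener(new VideoListener() {
            @Override
            public void videoAdded(final VideoEvent event) {
                addToContainer(video_container, event.getVisualComponent());
            }

            @Override
            public void videoRemoved(final VideoEvent event) {
                video_container.remove(event.getVisualComponent());
            }

            @Override
            public void videoUpdate(final VideoEvent event) {
                // Size/position changes are handled by VideoContainer's own layout.
            }
        });
    }

    // Adds a visual component on the Swing event dispatch thread.
    private static void addToContainer(final VideoContainer container, final Component component) {
        SwingUtilities.invokeLater(new Runnable() {
            @Override
            public void run() {
                container.add(component);
                container.revalidate();
                container.repaint();
            }
        });
    }

If this approach works, libjitsi does the H264 decoding and painting itself, so the same -Djava.library.path used for streaming should also cover rendering.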

Also, the first 12 bytes are the RTP header. Using the header diagram at http://www.siptutorial.net/RTP/header.html, and the fact that the RTP payload type in the code above is 99, the RTP header above breaks down as:

RTP version: 2, Padding: 0, Extension: 0, CSRC count: 0 [first byte]

Marker: 0, Payload type: 99 [second byte]

Sequence number: -11221 [third and fourth bytes]

Timestamp: 1082411848 [bytes 5-8]

SSRC source: -504863636 [bytes 9-12]

Oddly, the sequence number is definitely not increasing by 1 the way it should; it is increasing by 2. This probably means your datagram socket is only seeing every other packet rather than every packet, likely because the listenForVideoPackets thread and libjitsi's own RTP receiver are both reading from the same DatagramSocket, so the incoming packets are split between the two readers.
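For reference, here is a small, self-contained helper (the name printRtpHeader is just illustrative) that decodes the fixed 12-byte RTP header described above from the packet_data array built in listenForVideoPackets; it relies only on the standard RFC 3550 layout, nothing libjitsi-specific. Reading the fields as unsigned values also avoids the negative sequence number and SSRC shown above, which come from interpreting the raw bytes as signed Java values.

    static void printRtpHeader(final byte[] packet_data) {
        if (packet_data.length < 12) {
            System.out.println("Too short to be an RTP packet");
            return;
        }
        final int version = (packet_data[0] & 0xC0) >> 6;   // should always be 2
        final int padding = (packet_data[0] & 0x20) >> 5;
        final int extension = (packet_data[0] & 0x10) >> 4;
        final int csrc_count = packet_data[0] & 0x0F;
        final int marker = (packet_data[1] & 0x80) >> 7;
        final int payload_type = packet_data[1] & 0x7F;     // 99 = the dynamic H264 mapping above
        final int sequence_number = ((packet_data[2] & 0xFF) << 8) | (packet_data[3] & 0xFF);
        final long timestamp = ((packet_data[4] & 0xFFL) << 24) | ((packet_data[5] & 0xFFL) << 16)
                | ((packet_data[6] & 0xFFL) << 8) | (packet_data[7] & 0xFFL);
        final long ssrc = ((packet_data[8] & 0xFFL) << 24) | ((packet_data[9] & 0xFFL) << 16)
                | ((packet_data[10] & 0xFFL) << 8) | (packet_data[11] & 0xFFL);
        System.out.printf("v=%d p=%d x=%d cc=%d m=%d pt=%d seq=%d ts=%d ssrc=%d%n",
                version, padding, extension, csrc_count, marker, payload_type,
                sequence_number, timestamp, ssrc);
    }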

Regarding "java - H264 encoded RTP video stream in Swing with the Libjitsi library - how do I render the stream?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/30283485/
