I am looking into how to create an mp4 video from images in Java. After a few days of research I learned that JCodec can do this (http://jcodec.org/). Here is the demo I found under "Android make animated video from list of images" (I only changed the input and output paths):
private SeekableByteChannel ch;
private Picture toEncode;
private RgbToYuv420 transform;
private H264Encoder encoder;
private ArrayList<ByteBuffer> spsList;
private ArrayList<ByteBuffer> ppsList;
private CompressedTrack outTrack;
private ByteBuffer _out;
private int frameNo;
private MP4Muxer muxer;

public SequenceImagesEncoder(File out) throws IOException {
    this.ch = NIOUtils.writableFileChannel(out);
    // Transform to convert between RGB and YUV
    transform = new RgbToYuv420(0, 0);
    // Muxer that will store the encoded frames
    muxer = new MP4Muxer(ch, Brand.MP4);
    // Add video track to muxer
    outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, 25);
    // Allocate a buffer big enough to hold output frames
    _out = ByteBuffer.allocate(1920 * 1080 * 6);
    // Create an instance of encoder
    encoder = new H264Encoder();
    // Encoder extra data (SPS, PPS) to be stored in a special place of MP4
    spsList = new ArrayList<ByteBuffer>();
    ppsList = new ArrayList<ByteBuffer>();
}

public void encodeImage(BufferedImage bi) throws IOException {
    if (toEncode == null) {
        toEncode = Picture.create(bi.getWidth(), bi.getHeight(), ColorSpace.YUV420);
    }
    // Perform conversion
    for (int i = 0; i < 3; i++)
        Arrays.fill(toEncode.getData()[i], 0);
    transform.transform(AWTUtil.fromBufferedImage(bi), toEncode);
    // Encode image into H.264 frame, the result is stored in '_out' buffer
    _out.clear();
    ByteBuffer result = encoder.encodeFrame(_out, toEncode);
    // Based on the frame above form correct MP4 packet
    spsList.clear();
    ppsList.clear();
    H264Utils.encodeMOVPacket(result, spsList, ppsList);
    // Add packet to video track
    outTrack.addFrame(new MP4Packet(result, frameNo, 25, 1, frameNo, true, null, frameNo, 0));
    frameNo++;
}

public void finish() throws IOException {
    // Push saved SPS/PPS to a special storage in MP4
    outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));
    // Write MP4 header and finalize recording
    muxer.writeHeader();
    NIOUtils.closeQuietly(ch);
}

public static void main(String[] args) throws IOException {
    SequenceImagesEncoder encoder = new SequenceImagesEncoder(new File("D:/workspace/JCodecMakeMP4/out.mp4"));
    for (int i = 1; i < 100; i++) {
        BufferedImage bi = ImageIO.read(new File("D:/workspace/JCodecMakeMP4/bin/frame" + i + ".jpeg"));
        encoder.encodeImage(bi);
    }
    encoder.finish();
}
When I use jcodec-0.1.0.jar, the NIOUtils class has no writableFileChannel(File file) method.
When I use jcodec-0.1.3.jar, everything seems fine, but while debugging, as soon as I step onto the line `muxer = new MP4Muxer(ch, Brand.MP4);` I get "Source not found".
Does anyone know how to fix this?
Thanks in advance!
Best Answer
This is just to share my experience: I used JpegImagesToMovie (from the JMF samples) to solve this kind of problem.
See JpegImagesToMovie for more reference.
Sample program
public static void makeVideo(String fileName) throws MalformedURLException {
    // TODO: fill this vector with the paths of the JPEG frames, in playback order
    Vector<String> imgLst = new Vector<String>();
    JpegImagesToMovie imageToMovie = new JpegImagesToMovie();
    MediaLocator oml;
    if ((oml = imageToMovie.createMediaLocator(fileName)) == null) {
        System.err.println("Cannot build media locator from: " + fileName);
        System.exit(0);
    }
    int interval = 50;
    // screenWidth/screenHeight are the output frame dimensions (must match the images)
    imageToMovie.doIt(screenWidth, screenHeight, (1000 / interval), imgLst, oml);
}
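A minimal, self-contained sketch of wiring this up end to end (assuming frames named frame1.jpeg … frame99.jpeg and 1280x720 images; the dimensions are placeholders and the D:/workspace paths are simply the ones from the question). Note that this JMF route writes a QuickTime .mov container, since doIt() sets the content descriptor to FileTypeDescriptor.QUICKTIME, not an .mp4:

import java.util.Vector;
import javax.media.MediaLocator;

public class MakeMovDemo {
    public static void main(String[] args) throws Exception {
        int width = 1280, height = 720;   // assumed frame dimensions -- must match the images
        int frameRate = 20;               // 1000 / 50, as in the sample above
        Vector<String> imgLst = new Vector<String>();
        for (int i = 1; i < 100; i++) {
            imgLst.add("D:/workspace/JCodecMakeMP4/bin/frame" + i + ".jpeg");
        }
        // JMF's QuickTime muxer expects a .mov target, so the output name ends in .mov
        MediaLocator oml = new MediaLocator("file:D:/workspace/JCodecMakeMP4/out.mov");
        new JpegImagesToMovie().doIt(width, height, frameRate, imgLst, oml);
    }
}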
JpegImagesToMovie.java
/*
* @(#)JpegImagesToMovie.java 1.3 01/03/13
*
* Copyright (c) 1999-2001 Sun Microsystems, Inc. All Rights Reserved.
*
* Sun grants you ("Licensee") a non-exclusive, royalty free, license to use,
* modify and redistribute this software in source and binary code form,
* provided that i) this copyright notice and license appear on all copies of
* the software; and ii) Licensee does not utilize the software in a manner
* which is disparaging to Sun.
*
* This software is provided "AS IS," without a warranty of any kind. ALL
* EXPRESS OR IMPLIED CONDITIONS, REPRESENTATIONS AND WARRANTIES, INCLUDING ANY
* IMPLIED WARRANTY OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE OR
* NON-INFRINGEMENT, ARE HEREBY EXCLUDED. SUN AND ITS LICENSORS SHALL NOT BE
* LIABLE FOR ANY DAMAGES SUFFERED BY LICENSEE AS A RESULT OF USING, MODIFYING
* OR DISTRIBUTING THE SOFTWARE OR ITS DERIVATIVES. IN NO EVENT WILL SUN OR ITS
* LICENSORS BE LIABLE FOR ANY LOST REVENUE, PROFIT OR DATA, OR FOR DIRECT,
* INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL OR PUNITIVE DAMAGES, HOWEVER
* CAUSED AND REGARDLESS OF THE THEORY OF LIABILITY, ARISING OUT OF THE USE OF
* OR INABILITY TO USE SOFTWARE, EVEN IF SUN HAS BEEN ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGES.
*
* This software is not designed or intended for use in on-line control of
* aircraft, air traffic, aircraft navigation or aircraft communications; or in
* the design, construction, operation or maintenance of any nuclear
* facility. Licensee represents and warrants that it will not use or
* redistribute the Software for such purposes.
*/
import java.awt.Dimension;
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.net.MalformedURLException;
import java.util.Vector;
import javax.media.Buffer;
import javax.media.ConfigureCompleteEvent;
import javax.media.ControllerEvent;
import javax.media.ControllerListener;
import javax.media.DataSink;
import javax.media.EndOfMediaEvent;
import javax.media.Format;
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.PrefetchCompleteEvent;
import javax.media.Processor;
import javax.media.RealizeCompleteEvent;
import javax.media.ResourceUnavailableEvent;
import javax.media.Time;
import javax.media.control.TrackControl;
import javax.media.datasink.DataSinkErrorEvent;
import javax.media.datasink.DataSinkEvent;
import javax.media.datasink.DataSinkListener;
import javax.media.datasink.EndOfStreamEvent;
import javax.media.format.VideoFormat;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.DataSource;
import javax.media.protocol.FileTypeDescriptor;
import javax.media.protocol.PullBufferDataSource;
import javax.media.protocol.PullBufferStream;
/**
* This program takes a list of JPEG image files and convert them into a
* QuickTime movie.
*/
public class JpegImagesToMovie implements ControllerListener, DataSinkListener {
public boolean doIt(int width, int height, int frameRate, Vector inFiles,
MediaLocator outML) throws MalformedURLException {
ImageDataSource ids = new ImageDataSource(width, height, frameRate,
inFiles);
Processor p;
try {
//System.err
// .println("- create processor for the image datasource ...");
p = Manager.createProcessor(ids);
} catch (Exception e) {
System.err
.println("Yikes! Cannot create a processor from the data source.");
return false;
}
p.addControllerListener(this);
// Put the Processor into configured state so we can set
// some processing options on the processor.
p.configure();
if (!waitForState(p, p.Configured)) {
System.err.println("Failed to configure the processor.");
return false;
}
// Set the output content descriptor to QuickTime.
p.setContentDescriptor(new ContentDescriptor(
FileTypeDescriptor.QUICKTIME));
// Query for the processor for supported formats.
// Then set it on the processor.
TrackControl tcs[] = p.getTrackControls();
Format f[] = tcs[0].getSupportedFormats();
if (f == null || f.length <= 0) {
System.err.println("The mux does not support the input format: "
+ tcs[0].getFormat());
return false;
}
tcs[0].setFormat(f[0]);
//System.err.println("Setting the track format to: " + f[0]);
// We are done with programming the processor. Let's just
// realize it.
p.realize();
if (!waitForState(p, p.Realized)) {
System.err.println("Failed to realize the processor.");
return false;
}
// Now, we'll need to create a DataSink.
DataSink dsink;
if ((dsink = createDataSink(p, outML)) == null) {
System.err
.println("Failed to create a DataSink for the given output MediaLocator: "
+ outML);
return false;
}
dsink.addDataSinkListener(this);
fileDone = false;
System.out.println("Generating the video : "+outML.getURL().toString());
// OK, we can now start the actual transcoding.
try {
p.start();
dsink.start();
} catch (IOException e) {
System.err.println("IO error during processing");
return false;
}
// Wait for EndOfStream event.
waitForFileDone();
// Cleanup.
try {
dsink.close();
} catch (Exception e) {
}
p.removeControllerListener(this);
System.out.println("Video creation completed!!!!!");
return true;
}
/**
* Create the DataSink.
*/
DataSink createDataSink(Processor p, MediaLocator outML) {
DataSource ds;
if ((ds = p.getDataOutput()) == null) {
System.err
.println("Something is really wrong: the processor does not have an output DataSource");
return null;
}
DataSink dsink;
try {
//System.err.println("- create DataSink for: " + outML);
dsink = Manager.createDataSink(ds, outML);
dsink.open();
} catch (Exception e) {
System.err.println("Cannot create the DataSink: " + e);
return null;
}
return dsink;
}
Object waitSync = new Object();
boolean stateTransitionOK = true;
/**
* Block until the processor has transitioned to the given state. Return
* false if the transition failed.
*/
boolean waitForState(Processor p, int state) {
synchronized (waitSync) {
try {
while (p.getState() < state && stateTransitionOK)
waitSync.wait();
} catch (Exception e) {
}
}
return stateTransitionOK;
}
/**
* Controller Listener.
*/
public void controllerUpdate(ControllerEvent evt) {
if (evt instanceof ConfigureCompleteEvent
|| evt instanceof RealizeCompleteEvent
|| evt instanceof PrefetchCompleteEvent) {
synchronized (waitSync) {
stateTransitionOK = true;
waitSync.notifyAll();
}
} else if (evt instanceof ResourceUnavailableEvent) {
synchronized (waitSync) {
stateTransitionOK = false;
waitSync.notifyAll();
}
} else if (evt instanceof EndOfMediaEvent) {
evt.getSourceController().stop();
evt.getSourceController().close();
}
}
Object waitFileSync = new Object();
boolean fileDone = false;
boolean fileSuccess = true;
/**
* Block until file writing is done.
*/
boolean waitForFileDone() {
synchronized (waitFileSync) {
try {
while (!fileDone)
waitFileSync.wait();
} catch (Exception e) {
}
}
return fileSuccess;
}
/**
* Event handler for the file writer.
*/
public void dataSinkUpdate(DataSinkEvent evt) {
if (evt instanceof EndOfStreamEvent) {
synchronized (waitFileSync) {
fileDone = true;
waitFileSync.notifyAll();
}
} else if (evt instanceof DataSinkErrorEvent) {
synchronized (waitFileSync) {
fileDone = true;
fileSuccess = false;
waitFileSync.notifyAll();
}
}
}
/*public static void main(String args[]) {
if (args.length == 0)
prUsage();
// Parse the arguments.
int i = 0;
int width = -1, height = -1, frameRate = 1;
Vector inputFiles = new Vector();
String outputURL = null;
while (i < args.length) {
if (args[i].equals("-w")) {
i++;
if (i >= args.length)
prUsage();
width = new Integer(args[i]).intValue();
} else if (args[i].equals("-h")) {
i++;
if (i >= args.length)
prUsage();
height = new Integer(args[i]).intValue();
} else if (args[i].equals("-f")) {
i++;
if (i >= args.length)
prUsage();
frameRate = new Integer(args[i]).intValue();
} else if (args[i].equals("-o")) {
i++;
if (i >= args.length)
prUsage();
outputURL = args[i];
} else {
inputFiles.addElement(args[i]);
}
i++;
}
if (outputURL == null || inputFiles.size() == 0)
prUsage();
// Check for output file extension.
if (!outputURL.endsWith(".mov") && !outputURL.endsWith(".MOV")) {
System.err
.println("The output file extension should end with a .mov extension");
prUsage();
}
if (width < 0 || height < 0) {
System.err.println("Please specify the correct image size.");
prUsage();
}
// Check the frame rate.
if (frameRate < 1)
frameRate = 1;
// Generate the output media locators.
MediaLocator oml;
if ((oml = createMediaLocator(outputURL)) == null) {
System.err.println("Cannot build media locator from: " + outputURL);
System.exit(0);
}
JpegImagesToMovie imageToMovie = new JpegImagesToMovie();
imageToMovie.doIt(width, height, frameRate, inputFiles, oml);
System.exit(0);
}*/
static void prUsage() {
System.err
.println("Usage: java JpegImagesToMovie -w <width> -h <height> -f <frame rate> -o <output URL> <input JPEG file 1> <input JPEG file 2> ...");
System.exit(-1);
}
/**
* Create a media locator from the given string.
*/
static MediaLocator createMediaLocator(String url) {
MediaLocator ml;
if (url.indexOf(":") > 0 && (ml = new MediaLocator(url)) != null)
return ml;
if (url.startsWith(File.separator)) {
if ((ml = new MediaLocator("file:" + url)) != null)
return ml;
} else {
String file = "file:" + System.getProperty("user.dir")
+ File.separator + url;
if ((ml = new MediaLocator(file)) != null)
return ml;
}
return null;
}
// /////////////////////////////////////////////
//
// Inner classes.
// /////////////////////////////////////////////
/**
* A DataSource to read from a list of JPEG image files and turn that into a
* stream of JMF buffers. The DataSource is not seekable or positionable.
*/
class ImageDataSource extends PullBufferDataSource {
ImageSourceStream streams[];
ImageDataSource(int width, int height, int frameRate, Vector images) {
streams = new ImageSourceStream[1];
streams[0] = new ImageSourceStream(width, height, frameRate, images);
}
public void setLocator(MediaLocator source) {
}
public MediaLocator getLocator() {
return null;
}
/**
* Content type is of RAW since we are sending buffers of video frames
* without a container format.
*/
public String getContentType() {
return ContentDescriptor.RAW;
}
public void connect() {
}
public void disconnect() {
}
public void start() {
}
public void stop() {
}
/**
* Return the ImageSourceStreams.
*/
public PullBufferStream[] getStreams() {
return streams;
}
/**
* We could have derived the duration from the number of frames and
* frame rate. But for the purpose of this program, it's not necessary.
*/
public Time getDuration() {
return DURATION_UNKNOWN;
}
public Object[] getControls() {
return new Object[0];
}
public Object getControl(String type) {
return null;
}
}
/**
* The source stream to go along with ImageDataSource.
*/
class ImageSourceStream implements PullBufferStream {
Vector images;
int width, height;
VideoFormat format;
int nextImage = 0; // index of the next image to be read.
boolean ended = false;
public ImageSourceStream(int width, int height, int frameRate,
Vector images) {
this.width = width;
this.height = height;
this.images = images;
format = new VideoFormat(VideoFormat.JPEG, new Dimension(width,
height), Format.NOT_SPECIFIED, Format.byteArray,
(float) frameRate);
}
/**
* We should never need to block assuming data are read from files.
*/
public boolean willReadBlock() {
return false;
}
/**
* This is called from the Processor to read a frame worth of video
* data.
*/
public void read(Buffer buf) throws IOException {
// Check if we've finished all the frames.
if (nextImage >= images.size()) {
// We are done. Set EndOfMedia.
//System.err.println("Done reading all images.");
buf.setEOM(true);
buf.setOffset(0);
buf.setLength(0);
ended = true;
return;
}
String imageFile = (String) images.elementAt(nextImage);
nextImage++;
//System.err.println(" - reading image file: " + imageFile);
// Open a random access file for the next image.
RandomAccessFile raFile;
raFile = new RandomAccessFile(imageFile, "r");
byte data[] = null;
// Check the input buffer type & size.
if (buf.getData() instanceof byte[])
data = (byte[]) buf.getData();
// Check to see the given buffer is big enough for the frame.
if (data == null || data.length < raFile.length()) {
data = new byte[(int) raFile.length()];
buf.setData(data);
}
// Read the entire JPEG image from the file.
raFile.readFully(data, 0, (int) raFile.length());
//System.err.println(" read " + raFile.length() + " bytes.");
buf.setOffset(0);
buf.setLength((int) raFile.length());
buf.setFormat(format);
buf.setFlags(buf.getFlags() | buf.FLAG_KEY_FRAME);
// Close the random access file.
raFile.close();
}
/**
* Return the format of each video frame. That will be JPEG.
*/
public Format getFormat() {
return format;
}
public ContentDescriptor getContentDescriptor() {
return new ContentDescriptor(ContentDescriptor.RAW);
}
public long getContentLength() {
return 0;
}
public boolean endOfStream() {
return ended;
}
public Object[] getControls() {
return new Object[0];
}
public Object getControl(String type) {
return null;
}
}
}
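If you would rather stay with JCodec instead of JMF, newer JCodec releases (the 0.2.x jcodec and jcodec-javase artifacts) ship a high-level AWTSequenceEncoder that hides the muxer/SPS/PPS plumbing from the snippet in the question. A minimal sketch, assuming that API is available in the version you pull in (verify the class and method names against your JCodec release):

import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;
import org.jcodec.api.awt.AWTSequenceEncoder; // assumed: shipped with jcodec-javase 0.2.x

public class JCodecImagesToMp4 {
    public static void main(String[] args) throws Exception {
        // Create an MP4 encoder writing 25 frames per second
        AWTSequenceEncoder enc = AWTSequenceEncoder.createSequenceEncoder(
                new File("D:/workspace/JCodecMakeMP4/out.mp4"), 25);
        for (int i = 1; i < 100; i++) {
            BufferedImage bi = ImageIO.read(
                    new File("D:/workspace/JCodecMakeMP4/bin/frame" + i + ".jpeg"));
            enc.encodeImage(bi);
        }
        // Writes the MP4 header and closes the underlying channel
        enc.finish();
    }
}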
Regarding "java - How to create an mp4 video from images in Java with the JCodec library?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/24544361/