I wrote a simple class to play audio files in a simple game. It works fine for small sounds such as gunshots or explosions, but when I try to use it for background music I get this error: "Failed to allocate clip data: Requested buffer too large." I assume that means the file is too big, but how do I get around that? Source:
import java.io.File;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;

public class Sound {

    private Clip clip;

    public Sound(String filepath) {
        System.out.println(filepath);
        File file = new File(filepath);
        try {
            clip = AudioSystem.getClip();
            AudioInputStream inputStream = AudioSystem.getAudioInputStream(file);
            clip.open(inputStream);
        } catch (Exception e) {
            System.err.println(e.getMessage());
        }
    }

    public void play() {
        System.out.println("play");
        if (clip.isActive()) {
            clip.stop();
        }
        clip.setFramePosition(0);
        clip.start();
    }

    public void stop() {
        clip.stop();
    }

    public void loop() {
        if (!clip.isActive()) {
            clip.setFramePosition(0);
            clip.loop(Clip.LOOP_CONTINUOUSLY);
        } else {
            System.out.println("ALREADY PLAYING");
        }
    }

    public boolean getActive() {
        return clip.isActive();
    }
}
Best answer

Use BigClip. This is a class I put together to play 12-18 minute (or more1) MP3s. It needs mp3plugin.jar on the runtime classpath to actually load MP3-format sound, but that is beside the point. The point is: BigClip will load the sound file into as much memory as the JVM allows before hitting an OutOfMemoryError.

import java.awt.Component;
import javax.swing.*;
import javax.sound.sampled.*;
import java.io.*;
import java.util.logging.*;
import java.util.Arrays;
import java.net.URL;
import javax.swing.JOptionPane;

class BigClipExample {

    public static void main(String[] args) throws Exception {
        URL url = new URL("http://pscode.org/media/leftright.wav");
        BigClip clip = new BigClip();
        AudioInputStream ais = AudioSystem.getAudioInputStream(url);
        clip.open(ais);
        clip.start();
        JOptionPane.showMessageDialog(null, "BigClip.start()");
        clip.loop(4);
        JOptionPane.showMessageDialog(null, "BigClip.loop(4)");
        clip.setFastForward(true);
        clip.loop(8);
        // the looping/FF combo. reveals a bug..
        // there is a slight 'click' in the sound that should not be audible
        JOptionPane.showMessageDialog(null, "Are you on speed?");
    }
}
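Since BigClip keeps the decoded audio for an entire track in a single byte array, it can help to compare the decoded PCM size of a file with the JVM's maximum heap before opening it. A rough sketch of such a check (the HeapCheck class and its path argument are placeholders for illustration, not part of the answer's code):

import java.io.File;
import javax.sound.sampled.*;

// Hypothetical helper: estimate the decoded PCM size of a track and compare it
// with this JVM's heap ceiling before handing the file to BigClip.
class HeapCheck {
    public static void main(String[] args) throws Exception {
        File file = new File(args[0]); // path to the audio file (placeholder)
        AudioFileFormat aff = AudioSystem.getAudioFileFormat(file);
        AudioFormat fmt = aff.getFormat();
        long frames = aff.getFrameLength();   // may be AudioSystem.NOT_SPECIFIED (-1)
        long frameSize = fmt.getFrameSize();  // bytes per frame; may also be -1 for compressed data
        long decodedBytes = (frames > 0 && frameSize > 0)
                ? frames * frameSize          // approximate size of the PCM data BigClip will hold
                : -1;                         // unknown for some compressed formats
        long maxHeap = Runtime.getRuntime().maxMemory(); // the -Xmx ceiling of this JVM
        System.out.println("Decoded PCM (approx. bytes): " + decodedBytes);
        System.out.println("Max heap (bytes):            " + maxHeap);
    }
}

If the decoded size approaches the heap limit, the JVM can be started with a larger -Xmx setting. The BigClip source itself follows.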
/** An implementation of the javax.sound.sampled.Clip that is designed
to handle Clips of arbitrary size, limited only by the amount of memory
available to the app. It uses the post 1.4 thread behaviour (daemon thread)
that will stop the sound running after the main has exited.
<ul>
<li>2012-02-29 - Reworked play/loop to fix several bugs.
<li>2009-09-01 - Fixed bug that had clip ..clipped at the end, by calling drain() (before
calling stop()) on the dataline after the play loop was complete. Improvement to frame
and microsecond position determination.
<li>2009-08-17 - added convenience constructor that accepts a Clip. Changed the private
convertFrameToM..seconds methods from 'micro' to 'milli' to reflect that they were dealing
with units of 1000/th of a second.
<li>2009-08-14 - got rid of flush() after the sound loop, as it was cutting off tracks just
before the end, and was found to be not needed for the fast-forward/rewind functionality it
was introduced to support.
<li>2009-08-11 - First binary release.
</ul>
N.B. Remove @Override notation and logging to use in 1.3+
@since 1.5
@version 2012-02-29
@author Andrew Thompson
@author Alejandro Garcia */
class BigClip implements Clip, LineListener {
/** The DataLine used by this Clip. */
private SourceDataLine dataLine;
/** The raw bytes of the audio data. */
private byte[] audioData;
/** The stream wrapper for the audioData. */
private ByteArrayInputStream inputStream;
/** Loop count set by the calling code. */
private int loopCount = 1;
/** Internal count of how many loops to go. */
private int countDown = 1;
/** The start of a loop point. Defaults to 0. */
private int loopPointStart;
/** The end of a loop point. Defaults to the end of the Clip. */
private int loopPointEnd;
/** Stores the current frame position of the clip. */
private int framePosition;
/** Thread used to run() sound. */
private Thread thread;
/** Whether the sound is currently playing or active. */
private boolean active;
/** Stores the last time bytes were dumped to the audio stream. */
private long timelastPositionSet;
private int bufferUpdateFactor = 2;
/** The parent Component for the loading progress dialog. */
Component parent = null;
/** Used for reporting messages. */
private Logger logger = Logger.getAnonymousLogger();
/** Default constructor for a BigClip. Does nothing. Information from the
AudioInputStream passed in open() will be used to get an appropriate SourceDataLine. */
public BigClip() {}
/** There are a number of AudioSystem methods that will return a configured Clip. This
convenience constructor allows us to obtain a SourceDataLine for the BigClip that uses
the same AudioFormat as the original Clip.
@param clip Clip The Clip used to configure the BigClip. */
public BigClip(Clip clip) throws LineUnavailableException {
dataLine = AudioSystem.getSourceDataLine( clip.getFormat() );
}
/** Provides the entire audio buffer of this clip.
@return audioData byte[] The bytes of the audio data that is loaded in this Clip. */
public byte[] getAudioData() {
return audioData;
}
/** Sets a parent component to act as owner of a "Loading track.." progress dialog.
If null, there will be no progress shown. */
public void setParentComponent(Component parent) {
this.parent = parent;
}
/** Converts a frame count to a duration in milliseconds. */
private long convertFramesToMilliseconds(int frames) {
return (long)(frames*1000L/dataLine.getFormat().getSampleRate());
}
/** Converts a duration in milliseconds to a frame count. */
private int convertMillisecondsToFrames(long milliseconds) {
return (int)(milliseconds*dataLine.getFormat().getSampleRate()/1000);
}
@Override
public void update(LineEvent le) {
logger.log(Level.FINEST, "update: " + le );
}
@Override
public void loop(int count) {
logger.log(Level.FINEST, "loop(" + count + ") - framePosition: " + framePosition);
loopCount = count;
countDown = count;
active = true;
inputStream.reset();
start();
}
@Override
public void setLoopPoints(int start, int end) {
if (
start<0 ||
start>audioData.length-1 ||
end<0 ||
end>audioData.length
) {
throw new IllegalArgumentException(
"Loop points '" +
start +
"' and '" +
end +
"' cannot be set for buffer of size " +
audioData.length);
}
if (start>end) {
throw new IllegalArgumentException(
"End position " +
end +
" preceeds start position " + start);
}
loopPointStart = start;
framePosition = loopPointStart;
loopPointEnd = end;
}
@Override
public void setMicrosecondPosition(long milliseconds) {
framePosition = convertMillisecondsToFrames(milliseconds);
}
@Override
public long getMicrosecondPosition() {
return convertFramesToMilliseconds(getFramePosition());
}
@Override
public long getMicrosecondLength() {
return convertFramesToMilliseconds(getFrameLength());
}
@Override
public void setFramePosition(int frames) {
framePosition = frames;
int offset = framePosition*format.getFrameSize();
try {
inputStream.reset();
inputStream.read(new byte[offset]);
} catch(Exception e) {
e.printStackTrace();
}
}
@Override
public int getFramePosition() {
long timeSinceLastPositionSet = System.currentTimeMillis() - timelastPositionSet;
int size = dataLine.getBufferSize()*(format.getChannels()/2)/bufferUpdateFactor;
int framesSinceLast = (int)((timeSinceLastPositionSet/1000f)*
dataLine.getFormat().getFrameRate());
int framesRemainingTillTime = size - framesSinceLast;
return framePosition
- framesRemainingTillTime;
}
@Override
public int getFrameLength() {
return audioData.length/format.getFrameSize();
}
AudioFormat format;
@Override
public void open(AudioInputStream stream) throws
IOException,
LineUnavailableException {
AudioInputStream is1;
format = stream.getFormat();
if (format.getEncoding()!=AudioFormat.Encoding.PCM_SIGNED) {
is1 = AudioSystem.getAudioInputStream(
AudioFormat.Encoding.PCM_SIGNED, stream );
} else {
is1 = stream;
}
format = is1.getFormat();
InputStream is2;
if (parent!=null) {
ProgressMonitorInputStream pmis = new ProgressMonitorInputStream(
parent,
"Loading track..",
is1);
pmis.getProgressMonitor().setMillisToPopup(0);
is2 = pmis;
} else {
is2 = is1;
}
byte[] buf = new byte[1 << 16]; // 64 KB read buffer
int totalRead = 0;
int numRead = 0;
ByteArrayOutputStream baos = new ByteArrayOutputStream();
numRead = is2.read( buf );
while (numRead>-1) {
baos.write( buf, 0, numRead );
numRead = is2.read( buf, 0, buf.length );
totalRead += numRead;
}
is2.close();
audioData = baos.toByteArray();
AudioFormat afTemp;
if (format.getChannels()<2) {
afTemp = new AudioFormat(
format.getEncoding(),
format.getSampleRate(),
format.getSampleSizeInBits(),
2,
format.getSampleSizeInBits()*2/8, // calculate frame size
format.getFrameRate(),
format.isBigEndian()
);
} else {
afTemp = format;
}
setLoopPoints(0,audioData.length);
dataLine = AudioSystem.getSourceDataLine(afTemp);
dataLine.open();
inputStream = new ByteArrayInputStream( audioData );
}
@Override
public void open(AudioFormat format,
byte[] data,
int offset,
int bufferSize)
throws LineUnavailableException {
byte[] input = new byte[bufferSize];
for (int ii=0; ii<input.length; ii++) {
input[ii] = data[offset+ii];
}
ByteArrayInputStream inputStream = new ByteArrayInputStream(input);
try {
AudioInputStream ais1 = AudioSystem.getAudioInputStream(inputStream);
AudioInputStream ais2 = AudioSystem.getAudioInputStream(format, ais1);
open(ais2);
} catch( UnsupportedAudioFileException uafe ) {
throw new IllegalArgumentException(uafe);
} catch( IOException ioe ) {
throw new IllegalArgumentException(ioe);
}
// TODO - throw IAE for invalid frame size, format.
}
@Override
public float getLevel() {
return dataLine.getLevel();
}
@Override
public long getLongFramePosition() {
return dataLine.getLongFramePosition()*2/format.getChannels();
}
@Override
public int available() {
return dataLine.available();
}
@Override
public int getBufferSize() {
return dataLine.getBufferSize();
}
@Override
public AudioFormat getFormat() {
return format;
}
@Override
public boolean isActive() {
return dataLine.isActive();
}
@Override
public boolean isRunning() {
return dataLine.isRunning();
}
@Override
public boolean isOpen() {
return dataLine.isOpen();
}
@Override
public void stop() {
logger.log(Level.FINEST, "BigClip.stop()");
active = false;
// why did I have this commented out?
dataLine.stop();
if (thread!=null) {
try {
active = false;
thread.join();
} catch(InterruptedException wakeAndContinue) {
}
}
}
public byte[] convertMonoToStereo(byte[] data, int bytesRead) {
byte[] tempData = new byte[bytesRead*2];
if (format.getSampleSizeInBits()==8) {
for(int ii=0; ii<bytesRead; ii++) {
byte b = data[ii];
tempData[ii*2] = b;
tempData[ii*2+1] = b;
}
} else {
for(int ii=0; ii<bytesRead-1; ii+=2) {
//byte b2 = is2.read();
byte b1 = data[ii];
byte b2 = data[ii+1];
tempData[ii*2] = b1;
tempData[ii*2+1] = b2;
tempData[ii*2+2] = b1;
tempData[ii*2+3] = b2;
}
}
return tempData;
}
boolean fastForward;
boolean fastRewind;
public void setFastForward(boolean fastForward) {
logger.log(Level.FINEST, "FastForward " + fastForward);
this.fastForward = fastForward;
fastRewind = false;
flush();
}
public boolean getFastForward() {
return fastForward;
}
public void setFastRewind(boolean fastRewind) {
logger.log(Level.FINEST, "FastRewind " + fastRewind);
this.fastRewind = fastRewind;
fastForward = false;
flush();
}
public boolean getFastRewind() {
return fastRewind;
}
/** TODO - fix bug in LOOP_CONTINUOUSLY */
@Override
public void start() {
Runnable r = new Runnable() {
public void run() {
try {
/* Should these open()/close() calls be here, or explicitly
called by user program? The JavaDocs for line suggest that
Clip should throw an IllegalArgumentException, so we'll
stick with that and call it explicitly. */
dataLine.open();
dataLine.start();
active = true;
int bytesRead = 0;
int frameSize = dataLine.getFormat().getFrameSize();
int bufSize = dataLine.getBufferSize();
boolean startOrMove = true;
byte[] data = new byte[bufSize];
int offset = framePosition*frameSize;
int totalBytes = offset;
bytesRead = inputStream.read(new byte[offset], 0, offset);
logger.log(Level.FINE, "bytesRead " + bytesRead );
bytesRead = inputStream.read(data,0,data.length);
logger.log(Level.FINE, "loopCount " + loopCount );
logger.log(Level.FINE, "countDown " + countDown );
logger.log(Level.FINE, "bytesRead " + bytesRead );
while (bytesRead != -1 &&
(loopCount==Clip.LOOP_CONTINUOUSLY ||
countDown>0) &&
active ) {
logger.log(Level.FINEST,
"BigClip.start() loop " + framePosition );
totalBytes += bytesRead;
int framesRead;
byte[] tempData;
if (format.getChannels()<2) {
tempData = convertMonoToStereo(data, bytesRead);
framesRead = bytesRead/
format.getFrameSize();
bytesRead*=2;
} else {
framesRead = bytesRead/
dataLine.getFormat().getFrameSize();
tempData = Arrays.copyOfRange(data, 0, bytesRead);
}
framePosition += framesRead;
if (framePosition>=loopPointEnd) {
framePosition = loopPointStart;
inputStream.reset();
countDown--;
logger.log(Level.FINEST,
"Loop Count: " + countDown );
}
timelastPositionSet = System.currentTimeMillis();
byte[] newData;
if (fastForward) {
newData = getEveryNthFrame(tempData, 2);
} else if (fastRewind) {
byte[] temp = getEveryNthFrame(tempData, 2);
newData = reverseFrames(temp);
inputStream.reset();
totalBytes -= 2*bytesRead;
framePosition -= 2*framesRead;
if (totalBytes<0) {
setFastRewind(false);
totalBytes = 0;
}
inputStream.skip(totalBytes);
logger.log(Level.FINE, "totalBytes " + totalBytes);
} else {
newData = tempData;
}
dataLine.write(newData, 0, newData.length);
if (startOrMove) {
data = new byte[bufSize/
bufferUpdateFactor];
startOrMove = false;
}
bytesRead = inputStream.read(data,0,data.length);
if (bytesRead<0 && countDown-->1) {
inputStream.read(new byte[offset], 0, offset);
logger.log(Level.FINE, "loopCount " + loopCount );
logger.log(Level.FINE, "countDown " + countDown );
inputStream.reset();
bytesRead = inputStream.read(data,0,data.length);
}
}
logger.log(Level.FINEST,
"BigClip.start() loop ENDED" + framePosition );
active = false;
countDown = 1;
framePosition = 0;
inputStream.reset();
dataLine.drain();
dataLine.stop();
/* should these open()/close() be here, or explicitly
called by user program? */
dataLine.close();
} catch (LineUnavailableException lue) {
logger.log( Level.SEVERE,
"No sound line available!", lue );
if (parent!=null) {
JOptionPane.showMessageDialog(
parent,
"Clear the sound lines to proceed",
"No audio lines available!",
JOptionPane.ERROR_MESSAGE);
}
}
}
};
thread= new Thread(r);
// makes thread behaviour compatible with JavaSound post 1.4
thread.setDaemon(true);
thread.start();
}
/** Assume the frame size is 4. */
public byte[] reverseFrames(byte[] data) {
byte[] reversed = new byte[data.length];
byte[] frame = new byte[4];
for (int ii=0; ii<data.length/4; ii++) {
int first = (data.length)-((ii+1)*4)+0;
int last = (data.length)-((ii+1)*4)+3;
frame[0] = data[first];
frame[1] = data[(data.length)-((ii+1)*4)+1];
frame[2] = data[(data.length)-((ii+1)*4)+2];
frame[3] = data[last];
reversed[ii*4+0] = frame[0];
reversed[ii*4+1] = frame[1];
reversed[ii*4+2] = frame[2];
reversed[ii*4+3] = frame[3];
if (ii<5 || ii>(data.length/4)-5) {
logger.log(Level.FINER, "From \t" + first + " \tlast " + last );
logger.log(Level.FINER, "To \t" + ((ii*4)+0) + " \tlast " + ((ii*4)+3) );
}
}
/*
for (int ii=0; ii<data.length; ii++) {
reversed[ii] = data[data.length-1-ii];
}
*/
return reversed;
}
/** Assume the frame size is 4. */
public byte[] getEveryNthFrame(byte[] data, int skip) {
int length = data.length/skip;
length = (length/4)*4;
logger.log(Level.FINEST, "length " + data.length + " \t" + length);
byte[] b = new byte[length];
//byte[] frame = new byte[4];
for (int ii=0; ii<b.length/4; ii++) {
b[ii*4+0] = data[ii*skip*4+0];
b[ii*4+1] = data[ii*skip*4+1];
b[ii*4+2] = data[ii*skip*4+2];
b[ii*4+3] = data[ii*skip*4+3];
}
return b;
}
@Override
public void flush() {
dataLine.flush();
}
@Override
public void drain() {
dataLine.drain();
}
@Override
public void removeLineListener(LineListener listener) {
dataLine.removeLineListener(listener);
}
@Override
public void addLineListener(LineListener listener) {
dataLine.addLineListener(listener);
}
@Override
public Control getControl(Control.Type control) {
return dataLine.getControl(control);
}
@Override
public Control[] getControls() {
if (dataLine==null) {
return new Control[0];
} else {
return dataLine.getControls();
}
}
@Override
public boolean isControlSupported(Control.Type control) {
return dataLine.isControlSupported(control);
}
@Override
public void close() {
dataLine.close();
}
@Override
public void open() throws LineUnavailableException {
throw new IllegalArgumentException("illegal call to open() in interface Clip");
}
@Override
public Line.Info getLineInfo() {
return dataLine.getLineInfo();
}
/** Determines the single largest sample size of all channels of the current clip.
This can be handy for determining a fraction to scale visual representations.
@return Double between 0 & 1 representing the maximum signal level of any channel. */
public double getLargestSampleSize() {
int largest = 0;
int current;
boolean signed = (format.getEncoding()==AudioFormat.Encoding.PCM_SIGNED);
int bitDepth = format.getSampleSizeInBits();
boolean bigEndian = format.isBigEndian();
int samples = audioData.length*8/bitDepth;
if (signed) {
if (bitDepth/8==2) {
if (bigEndian) {
for (int cc = 0; cc < samples; cc++) {
current = (audioData[cc*2]*256 + (audioData[cc*2+1] & 0xFF));
if (Math.abs(current)>largest) {
largest = Math.abs(current);
}
}
} else {
for (int cc = 0; cc < samples; cc++) {
current = (audioData[cc*2+1]*256 + (audioData[cc*2] & 0xFF));
if (Math.abs(current)>largest) {
largest = Math.abs(current);
}
}
}
} else {
for (int cc = 0; cc < samples; cc++) {
current = (audioData[cc] & 0xFF);
if (Math.abs(current)>largest) {
largest = Math.abs(current);
}
}
}
} else {
if (bitDepth/8==2) {
if (bigEndian) {
for (int cc = 0; cc < samples; cc++) {
current = (audioData[cc*2]*256 + (audioData[cc*2+1] - 0x80));
if (Math.abs(current)>largest) {
largest = Math.abs(current);
}
}
} else {
for (int cc = 0; cc < samples; cc++) {
current = (audioData[cc*2+1]*256 + (audioData[cc*2] - 0x80));
if (Math.abs(current)>largest) {
largest = Math.abs(current);
}
}
}
} else {
for (int cc = 0; cc < samples; cc++) {
if ( audioData[cc]>0 ) {
current = (audioData[cc] - 0x80);
if (Math.abs(current)>largest) {
largest = Math.abs(current);
}
} else {
current = (audioData[cc] + 0x80);
if (Math.abs(current)>largest) {
largest = Math.abs(current);
}
}
}
}
}
// audioData
logger.log(Level.FINEST, "Max signal level: " + (double)largest/(Math.pow(2, bitDepth-1)));
return (double)largest/(Math.pow(2, bitDepth-1));
}
}
Regarding "java - How to play a longer AudioClip?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/9470148/
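An alternative not used in the answer above is to stream the track through a SourceDataLine instead of decoding the whole file up front, so memory use stays bounded by the line's buffer. A minimal sketch, assuming an uncompressed PCM WAV file at a placeholder path:

import java.io.File;
import javax.sound.sampled.*;

// Streaming sketch (not part of the accepted answer): plays a long track through a
// SourceDataLine so the whole file never has to fit in memory at once.
public class StreamingPlayer {
    public static void main(String[] args) throws Exception {
        File file = new File("music/background.wav"); // placeholder path; PCM WAV assumed
        try (AudioInputStream in = AudioSystem.getAudioInputStream(file)) {
            AudioFormat format = in.getFormat();
            SourceDataLine line = AudioSystem.getSourceDataLine(format);
            line.open(format);
            line.start();
            byte[] buffer = new byte[line.getBufferSize()];
            int read;
            while ((read = in.read(buffer, 0, buffer.length)) != -1) {
                line.write(buffer, 0, read); // blocks until the line accepts the data
            }
            line.drain(); // let any buffered audio finish playing
            line.close();
        }
    }
}

The trade-off is that seeking, looping, and fast-forward, which BigClip supports, have to be implemented on top of the stream yourself.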