I haven't used Android's audio-recording classes before, so I don't know much about this area.
I've written a small app that records audio in the background and then plays it back, all in PCM format (I'm running some tests to see how much battery the microphone uses in the background). But when I try to run my play() method, I get these logcat errors:
11-03 00:20:05.744 18248-18248/com.bacon.corey.audiotimeshift E/android.media.AudioTrack: Front channels must be present in multichannel configurations
11-03 00:20:05.748 18248-18248/com.bacon.corey.audiotimeshift E/AudioTrack: Playback Failed
I've googled these errors, but I can't seem to find any information about them.
If anyone wouldn't mind giving me some pointers, I'd be very grateful.
Here is the app's code (very sloppy and unfinished, since it's only for testing battery life):
public class MainActivity extends ActionBarActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        if (savedInstanceState == null) {
            getSupportFragmentManager().beginTransaction()
                    .add(R.id.container, new PlaceholderFragment())
                    .commit();
        }
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        // Inflate the menu; this adds items to the action bar if it is present.
        getMenuInflater().inflate(R.menu.menu_main, menu);
        return true;
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        // Handle action bar item clicks here. The action bar will
        // automatically handle clicks on the Home/Up button, so long
        // as you specify a parent activity in AndroidManifest.xml.
        int id = item.getItemId();
        //noinspection SimplifiableIfStatement
        if (id == R.id.action_settings) {
            return true;
        }
        return super.onOptionsItemSelected(item);
    }

    /**
     * A placeholder fragment containing a simple view.
     */
    public static class PlaceholderFragment extends Fragment {

        public PlaceholderFragment() {
        }

        @Override
        public View onCreateView(LayoutInflater inflater, ViewGroup container,
                                 Bundle savedInstanceState) {
            View rootView = inflater.inflate(R.layout.fragment_main, container, false);
            return rootView;
        }
    }

    public void play(View view) {
        Toast.makeText(this, "play", Toast.LENGTH_SHORT).show();

        // Get the file we want to playback.
        File file = new File(Environment.getExternalStorageDirectory() + File.separator + "ACS.pcm");
        // Get the length of the audio stored in the file (16 bit so 2 bytes per short)
        // and create a short array to store the recorded audio.
        int musicLength = (int) (file.length() / 2);
        short[] music = new short[musicLength];

        try {
            // Create a DataInputStream to read the audio data back from the saved file.
            InputStream is = new FileInputStream(file);
            BufferedInputStream bis = new BufferedInputStream(is);
            DataInputStream dis = new DataInputStream(bis);

            // Read the file into the music array.
            int i = 0;
            while (dis.available() > 0) {
                music[musicLength - 1 - i] = dis.readShort();
                i++;
            }

            // Close the input streams.
            dis.close();

            // Create a new AudioTrack object using the same parameters as the AudioRecord
            // object used to create the file.
            AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
                    11025,
                    AudioFormat.CHANNEL_OUT_MONO,
                    AudioFormat.ENCODING_PCM_16BIT,
                    musicLength,
                    AudioTrack.MODE_STREAM);

            // Start playback
            audioTrack.play();

            // Write the music buffer to the AudioTrack object
            audioTrack.write(music, 0, musicLength);
        } catch (Throwable t) {
            Log.e("AudioTrack", "Playback Failed");
        }
    }

    public void record(View view) {
        Toast.makeText(this, "record", Toast.LENGTH_SHORT).show();
        Log.v("ACS", "OnCreate called");
        Intent intent = new Intent(this, ACS.class);
        startService(intent);
    }

    public void stop(View view) {
        Toast.makeText(this, "stop", Toast.LENGTH_SHORT).show();
        Intent intent = new Intent(this, ACS.class);
        stopService(intent);
    }
}
and
public class ACS extends IntentService {

    AudioRecord audioRecord;

    public ACS() {
        super("ACS");
    }

    @Override
    protected void onHandleIntent(Intent intent) {
        Log.v("ACS", "ACS called");
        record();
    }

    public void record() {
        Log.v("ACS", "Record started");

        int frequency = 11025;
        int channelConfiguration = AudioFormat.CHANNEL_IN_MONO;
        int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;

        File file = new File(Environment.getExternalStorageDirectory() + File.separator + "ACS.pcm");

        // Delete any previous recording.
        if (file.exists())
            file.delete();

        // Create the new file.
        try {
            file.createNewFile();
        } catch (IOException e) {
            throw new IllegalStateException("Failed to create " + file.toString());
        }

        try {
            // Create a DataOutputStream to write the audio data into the saved file.
            OutputStream os = new FileOutputStream(file);
            BufferedOutputStream bos = new BufferedOutputStream(os);
            DataOutputStream dos = new DataOutputStream(bos);

            // Create a new AudioRecord object to record the audio.
            int bufferSize = AudioRecord.getMinBufferSize(frequency, channelConfiguration, audioEncoding);
            audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    frequency, channelConfiguration,
                    audioEncoding, bufferSize);

            short[] buffer = new short[bufferSize];
            audioRecord.startRecording();

            while (audioRecord.getRecordingState() == AudioRecord.RECORDSTATE_RECORDING) {
                int bufferReadResult = audioRecord.read(buffer, 0, bufferSize);
                for (int i = 0; i < bufferReadResult; i++)
                    dos.writeShort(buffer[i]);
            }

            audioRecord.stop();
            dos.close();
        } catch (Throwable t) {
            Log.e("AudioRecord", "Recording Failed");
        }
        Log.v("ACS", "Record stopped");
    }

    public void onDestroy() {
        audioRecord.stop();
        Log.v("ACS", "onDestroy called, Record stopped");
    }
}
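One side note on the round trip above (unrelated to the error itself): DataOutputStream.writeShort and DataInputStream.readShort are both big-endian, so the record/play pair is internally consistent, but raw 16-bit PCM on Android is little-endian, so the saved ACS.pcm would sound wrong in an external player. A minimal plain-Java sketch of a little-endian round trip via ByteBuffer (the class name PcmEndian is made up for illustration):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.Arrays;

// Hypothetical helper, not part of the app above: packs 16-bit PCM samples
// in little-endian byte order, the order raw PCM uses on Android.
public class PcmEndian {

    // Encode samples as little-endian bytes (2 bytes per sample).
    static byte[] toLittleEndian(short[] samples) {
        ByteBuffer buf = ByteBuffer.allocate(samples.length * 2).order(ByteOrder.LITTLE_ENDIAN);
        for (short s : samples) buf.putShort(s);
        return buf.array();
    }

    // Decode little-endian bytes back into samples.
    static short[] fromLittleEndian(byte[] bytes) {
        ByteBuffer buf = ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN);
        short[] out = new short[bytes.length / 2];
        for (int i = 0; i < out.length; i++) out[i] = buf.getShort();
        return out;
    }

    public static void main(String[] args) {
        short[] samples = {100, -200, 300};
        byte[] bytes = toLittleEndian(samples);
        // Round-trips unchanged; the low byte of each sample comes first in the file.
        System.out.println(Arrays.toString(fromLittleEndian(bytes))); // [100, -200, 300]
    }
}
```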
Thanks in advance
Corey :)
Best Answer
I got the same error message, "android.media.AudioTrack: Front channels must be present in multichannel configurations".
The error went away when I changed the audio setting from AudioFormat.CHANNEL_OUT_MONO to AudioFormat.CHANNEL_IN_MONO. (You could also try a different configuration, such as AudioFormat.CHANNEL_IN_STEREO.)
AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
        11025,
        AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT,
        musicLength,
        AudioTrack.MODE_STREAM);
I don't know why this works, though. Hope it helps.
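One plausible explanation (an educated guess, not confirmed by the platform docs): in the AOSP AudioFormat sources, CHANNEL_OUT_MONO is 0x4 (the FRONT_LEFT bit) while CHANNEL_IN_MONO is 0x10, which happens to coincide with CHANNEL_OUT_FRONT_CENTER when AudioTrack interprets it as an output mask. So the "fixed" call is really requesting a different channel mask, one that still contains a front-channel bit. A standalone sketch with the constant values copied from the AOSP sources (verify them against your SDK level):

```java
// Plain-Java sketch: the constants below mirror android.media.AudioFormat,
// redeclared locally so this compiles without the Android framework.
public class ChannelMaskSketch {
    // Output masks (subset).
    static final int CHANNEL_OUT_FRONT_LEFT = 0x4;
    static final int CHANNEL_OUT_FRONT_RIGHT = 0x8;
    static final int CHANNEL_OUT_FRONT_CENTER = 0x10;
    static final int CHANNEL_OUT_MONO = CHANNEL_OUT_FRONT_LEFT;
    // Input masks (subset).
    static final int CHANNEL_IN_FRONT = 0x10;
    static final int CHANNEL_IN_MONO = CHANNEL_IN_FRONT;

    // True if the mask, read as an *output* mask, includes any front-speaker bit.
    static boolean hasFrontOutputChannel(int mask) {
        return (mask & (CHANNEL_OUT_FRONT_LEFT | CHANNEL_OUT_FRONT_RIGHT | CHANNEL_OUT_FRONT_CENTER)) != 0;
    }

    public static void main(String[] args) {
        // The two "mono" constants are different bit patterns.
        System.out.println(CHANNEL_OUT_MONO == CHANNEL_IN_MONO);          // false
        // As an output mask, CHANNEL_IN_MONO reads as FRONT_CENTER.
        System.out.println(CHANNEL_IN_MONO == CHANNEL_OUT_FRONT_CENTER);  // true
        System.out.println(hasFrontOutputChannel(CHANNEL_IN_MONO));       // true
    }
}
```

This is only a guess at why swapping the constants sidesteps the check on the affected devices; passing an input mask to AudioTrack is technically incorrect, so treat it as a workaround rather than a proper fix.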
A similar question about "android - Audio playback failed - E/android.media.AudioTrack: Front channels must be present in multichannel configurations" can be found on Stack Overflow: https://stackoverflow.com/questions/26700398/