android - How to record audio on Android Q using the Playback Capture API?

Tags: android audio-recording playback android-10.0

I am trying to record audio on Android 10 (Q) using the Playback Capture API. According to the Playback Capture API, only sounds played with USAGE_GAME, USAGE_MEDIA, or USAGE_UNKNOWN can be recorded, so I downloaded the UAMP sample, which plays songs with USAGE_MEDIA set. I also added android:allowAudioPlaybackCapture="true" to its AndroidManifest.xml. Then I launched UAMP, started playing a song, and left it running in the background.

I built the capture-audio project with targetSdk 29 and installed it on my OnePlus 7 Pro running Android 10. The UI has two buttons to start and stop the capture. When the app starts capturing, the read function fills the buffer with nothing but 0s.
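
For reference, this is only a minimal sketch (not UAMP's actual code, and with a hypothetical file path) of how the playing app tags its stream so that it is eligible for playback capture:

import android.media.AudioAttributes
import android.media.MediaPlayer

// Hypothetical player on the side of the app being captured. Only streams tagged
// USAGE_MEDIA, USAGE_GAME or USAGE_UNKNOWN are eligible for playback capture, and
// that app must also declare android:allowAudioPlaybackCapture="true" in its manifest.
val player = MediaPlayer().apply {
    setAudioAttributes(
        AudioAttributes.Builder()
            .setUsage(AudioAttributes.USAGE_MEDIA)              // capturable usage
            .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
            .build()
    )
    setDataSource("/sdcard/Music/sample.mp3")                   // hypothetical path
    prepare()
    start()
}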

To use Playback Capture in the project, I set it up as follows:

1. Manifest:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    package="com.example.captureaudio">

    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.CAPTURE_AUDIO_OUTPUT" />
    <uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />

    <application
        android:allowBackup="false"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:roundIcon="@mipmap/ic_launcher_round"
        android:supportsRtl="true"
        android:theme="@style/AppTheme"
        tools:ignore="GoogleAppIndexingWarning">
        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>

        <service
            android:name=".services.MediaProjectionService"
            android:enabled="true"
            android:exported="false"
            android:foregroundServiceType="mediaProjection"
            tools:targetApi="q" />
    </application>

</manifest>

2. Main Activity:
class MainActivity : AppCompatActivity() {

    companion object {
        private const val REQUEST_CODE_CAPTURE_INTENT = 1
        private const val TAG = "CaptureAudio"
        private const val RECORDER_SAMPLE_RATE = 48000
        private const val RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_MONO
        //or AudioFormat.CHANNEL_IN_BACK
        private const val RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT
        //  AudioFormat.ENCODING_PCM_16BIT
    }

    private var audioRecord: AudioRecord? = null
    private val mediaProjectionManager by lazy { (this@MainActivity).getSystemService(Context.MEDIA_PROJECTION_SERVICE) as MediaProjectionManager }
    private val rxPermissions by lazy { RxPermissions(this) }
    private val minBufferSize by lazy {
        AudioRecord.getMinBufferSize(
            RECORDER_SAMPLE_RATE,
            RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING
        )
    }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        val intent = Intent(this, MediaProjectionService::class.java)
        startForegroundService(intent)
        getPermissions()
    }

    private fun getPermissions() {
        rxPermissions
            .request(
                Manifest.permission.RECORD_AUDIO,
                Manifest.permission.FOREGROUND_SERVICE,
                Manifest.permission.WRITE_EXTERNAL_STORAGE
            )
            .subscribe {
                log("Permission result: $it")
                if (it) { // Always true pre-M
                    val captureIntent = mediaProjectionManager.createScreenCaptureIntent()
                    startActivityForResult(captureIntent, REQUEST_CODE_CAPTURE_INTENT)
                } else {
                    getPermissions()
                }
            }
    }

    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
        if (requestCode == REQUEST_CODE_CAPTURE_INTENT && data != null) {
            val mediaProjection = mediaProjectionManager.getMediaProjection(resultCode, data)
            val playbackConfig = AudioPlaybackCaptureConfiguration.Builder(mediaProjection)
                .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
                .addMatchingUsage(AudioAttributes.USAGE_UNKNOWN)
                .addMatchingUsage(AudioAttributes.USAGE_GAME)
                .build()
            audioRecord = AudioRecord.Builder()
                .setAudioPlaybackCaptureConfig(playbackConfig)
                .setBufferSizeInBytes(minBufferSize * 2)
                .setAudioFormat(
                    AudioFormat.Builder()
                        .setEncoding(RECORDER_AUDIO_ENCODING)
                        .setSampleRate(RECORDER_SAMPLE_RATE)
                        .setChannelMask(RECORDER_CHANNELS)
                        .build()
                )
                .build()
        }
    }

    fun startCapture(view: View) {
        audioRecord?.apply {
            startRecording()
            log("Is stopped: $state $recordingState")
            startRecordingIntoFile()
        }
        stopRecBtn.visibility = View.VISIBLE
        startRecBtn.visibility = View.INVISIBLE
    }

    private fun AudioRecord.startRecordingIntoFile() {
        val file = File(
            getExternalFilesDir(Environment.DIRECTORY_MUSIC),
            "temp.wav"
            //System.currentTimeMillis().toString() + ".wav"
        )
        if (!file.exists())
            file.createNewFile()

        GlobalScope.launch {
            val out = file.outputStream()
            audioRecord.apply {
                while (recordingState == AudioRecord.RECORDSTATE_RECORDING) {

                    val buffer = ShortArray(minBufferSize)//ByteBuffer.allocate(MIN_BUFFER_SIZE)
                    val result = read(buffer, 0, minBufferSize)

                    // Checking if I am actually getting something in a buffer
                    val b: Short = 0
                    var nonZeroValueCount = 0
                    for (i in 0 until minBufferSize) {
                        if (buffer[i] != b) {
                            nonZeroValueCount += 1
                            log("Value: ${buffer[i]}")
                        }
                    }
                    if (nonZeroValueCount != 0) {

                        // Record the non-zero values in the file..
                        log("Result $nonZeroValueCount")
                        when (result) {
                            AudioRecord.ERROR -> showToast("ERROR")
                            AudioRecord.ERROR_INVALID_OPERATION -> showToast("ERROR_INVALID_OPERATION")
                            AudioRecord.ERROR_DEAD_OBJECT -> showToast("ERROR_DEAD_OBJECT")
                            AudioRecord.ERROR_BAD_VALUE -> showToast("ERROR_BAD_VALUE")
                            else -> {
                                log("Appending $buffer into ${file.absolutePath}")
                                out.write(shortToByte(buffer))
                            }
                        }
                    }
                }
            }
            out.close()
        }
    }

    private fun shortToByte(shortArray: ShortArray): ByteArray {
        val byteOut = ByteArray(shortArray.size * 2)
        ByteBuffer.wrap(byteOut).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().put(shortArray)
        return byteOut
    }

    private fun showToast(msg: String) {
        runOnUiThread {
            log("Toast: $msg")
            Toast.makeText(this@MainActivity, msg, Toast.LENGTH_LONG).show()
        }
    }

    fun stopCapture(view: View) {
        audioRecord?.apply {
            stop()
            log("Is stopped: $state $recordingState")
        }
        stopRecBtn.visibility = View.INVISIBLE
        startRecBtn.visibility = View.VISIBLE
    }

    private fun log(msg: String) {
        Log.d(TAG, msg)
    }

    override fun onDestroy() {
        super.onDestroy()
        audioRecord?.stop()
        audioRecord?.release()
        audioRecord = null
    }
}

3. MediaProjection Service:
class MediaProjectionService : Service() {

    companion object {
        private const val CHANNEL_ID = "ForegroundServiceChannel"
    }

    override fun onBind(intent: Intent?): IBinder? {
        return null
    }

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {

        createNotificationChannel()
        val notificationIntent = Intent(this, MainActivity::class.java)
        val pendingIntent = PendingIntent.getActivity(
            this,
            0, notificationIntent, 0
        )

        val notification = NotificationCompat.Builder(this, CHANNEL_ID)
            .setContentTitle("Foreground Service")
            .setContentText("Call Recording Service")
//            .setSmallIcon(R.drawable.ic_stat_name)
            .setContentIntent(pendingIntent)
            .build()

        startForeground(1, notification)
        return START_NOT_STICKY
    }

    private fun createNotificationChannel() {
        val serviceChannel = NotificationChannel(
            CHANNEL_ID,
            "Foreground Service Channel",
            NotificationManager.IMPORTANCE_DEFAULT
        )

        val manager = getSystemService(NotificationManager::class.java)
        manager!!.createNotificationChannel(serviceChannel)
    }
}

The problems are:

1. The file /storage/emulated/0/Android/data/com.example.captureaudio/files/Music/temp.wav gets created, but it contains only 0s. I also checked it with xxd /storage/emulated/0/Android/data/com.example.captureaudio/files/Music/temp.wav as follows:
OnePlus7Pro:/sdcard # xxd /storage/emulated/0/Android/data/com.example.captureaudio/files/Music/temp.wav | head
00000000: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000010: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000020: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000030: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000040: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000050: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000060: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000070: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000080: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000090: 0000 0000 0000 0000 0000 0000 0000 0000  ................

2. When playing it on the device, it gives the error "Couldn't play the track you requested".

Any help or suggestions on what I am missing?

Best Answer

I think something went wrong when you write the audio data to your .wav file.
Here is my example app that records audio on Android 10 (Q) with the Playback Capture API.
In that app I write the audio data to a .pcm file and then decode it into an .mp3 audio file, which you can listen to and play with any player.
WARNING!
The QRecorder app implements the lame lib with the NDK.
If you don't want to spend time importing lib lame into your project, you can decode the recorded .pcm file with the PCM-Decoder library.
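
If you would rather keep the capture code from the question and just make the output playable without lame or PCM-Decoder, a minimal sketch of an alternative is to prepend a standard 44-byte RIFF/WAVE header to the raw PCM yourself; the pcmToWav helper below is hypothetical and assumes the question's format (48 kHz, mono, 16-bit little-endian PCM):

import java.io.File
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Hypothetical helper: wraps raw 16-bit PCM data in a standard 44-byte WAV header
// so that ordinary players can open it. Sample rate and channel count must match
// what the AudioRecord was configured with.
fun pcmToWav(pcmFile: File, wavFile: File, sampleRate: Int = 48000, channels: Int = 1) {
    val pcmData = pcmFile.readBytes()
    val byteRate = sampleRate * channels * 2                  // 2 bytes per 16-bit sample
    val header = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN).apply {
        put("RIFF".toByteArray())                             // RIFF chunk id
        putInt(36 + pcmData.size)                             // total chunk size
        put("WAVE".toByteArray())                             // RIFF type
        put("fmt ".toByteArray())                             // fmt sub-chunk id
        putInt(16)                                            // fmt sub-chunk size (PCM)
        putShort(1.toShort())                                 // audio format: 1 = PCM
        putShort(channels.toShort())                          // number of channels
        putInt(sampleRate)                                    // sample rate
        putInt(byteRate)                                      // byte rate
        putShort((channels * 2).toShort())                    // block align
        putShort(16.toShort())                                // bits per sample
        put("data".toByteArray())                             // data sub-chunk id
        putInt(pcmData.size)                                  // data size in bytes
    }
    wavFile.outputStream().use { out ->
        out.write(header.array())
        out.write(pcmData)
    }
}

Calling this once after the capture loop finishes should produce a file that regular players accept, which would address problem 2 in the question; problem 1 (all-zero samples) still depends on the played app actually opting in to capture.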

Regarding "android - How to record audio on Android Q using the Playback Capture API?", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/59356345/
