Can't play audio recorded from voice using AVCaptureAudioDataOutputSampleBufferDelegate


Problem description

I have been googling and researching for days, but I can't seem to get this working and I can't find any solution on the internet.

I'm trying to capture my voice with the microphone and then play it back through the speaker.

Here is my code:

class ViewController: UIViewController, AVAudioRecorderDelegate, AVCaptureAudioDataOutputSampleBufferDelegate {

var recordingSession: AVAudioSession!
var audioRecorder: AVAudioRecorder!
var captureSession: AVCaptureSession!
var microphone: AVCaptureDevice!
var inputDevice: AVCaptureDeviceInput!
var outputDevice: AVCaptureAudioDataOutput!

override func viewDidLoad() {
    super.viewDidLoad()

    recordingSession = AVAudioSession.sharedInstance()

    do{
        try recordingSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
        try recordingSession.setMode(AVAudioSessionModeVoiceChat)
        try recordingSession.setPreferredSampleRate(44000.00)
        try recordingSession.setPreferredIOBufferDuration(0.2)
        try recordingSession.setActive(true)

        recordingSession.requestRecordPermission() { [unowned self] (allowed: Bool) -> Void in
            DispatchQueue.main.async {
                if allowed {

                    do{
                        self.microphone = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio)
                        try self.inputDevice = AVCaptureDeviceInput.init(device: self.microphone)

                        self.outputDevice = AVCaptureAudioDataOutput()
                        self.outputDevice.setSampleBufferDelegate(self, queue: DispatchQueue.main)

                        self.captureSession = AVCaptureSession()
                        self.captureSession.addInput(self.inputDevice)
                        self.captureSession.addOutput(self.outputDevice)
                        self.captureSession.startRunning()
                    }
                    catch let error {
                        print(error.localizedDescription)
                    }
                }
            }
        }
    }catch let error{
        print(error.localizedDescription)
    }
}

And the callback function:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {

    var audioBufferList = AudioBufferList(
        mNumberBuffers: 1,
        mBuffers: AudioBuffer(mNumberChannels: 0,
        mDataByteSize: 0,
        mData: nil)
    )

    var blockBuffer: CMBlockBuffer?

    var osStatus = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(

        sampleBuffer,
        nil,
        &audioBufferList,
        MemoryLayout<AudioBufferList>.size,
        nil,
        nil,
        UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment),
        &blockBuffer
    )

    do {
        var data: NSMutableData = NSMutableData.init()
        for i in 0..<audioBufferList.mNumberBuffers {

            var audioBuffer = AudioBuffer(
                 mNumberChannels: audioBufferList.mBuffers.mNumberChannels,
                 mDataByteSize: audioBufferList.mBuffers.mDataByteSize,
                 mData: audioBufferList.mBuffers.mData
            )

            let frame = audioBuffer.mData?.load(as: Float32.self)
            data.append(audioBuffer.mData!, length: Int(audioBuffer.mDataByteSize))

        }

        var dataFromNsData = Data.init(referencing: data)
        var avAudioPlayer: AVAudioPlayer = try AVAudioPlayer.init(data: dataFromNsData)
        avAudioPlayer.prepareToPlay()
        avAudioPlayer.play()
    }
    catch let error {
        print(error.localizedDescription)
        //prints out The operation couldn’t be completed. (OSStatus error 1954115647.)
    }
}

Any help would be amazing, and it would probably help a lot of other people too, since there are so many incomplete Swift versions of this out there.

Thanks.

Answer

You're so close! You are capturing audio in the didOutputSampleBuffer callback, but that is a high-frequency callback, so you are creating a lot of AVAudioPlayers and passing them raw LPCM data, while they only know how to parse Core Audio file types, and then they are going out of scope anyway.
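As an aside, the OSStatus in the question's error comment can be decoded as a four-character code: 1954115647 is 'typ?', i.e. kAudioFileUnsupportedFileTypeError, which is consistent with AVAudioPlayer rejecting headerless LPCM data. A minimal sketch of such a decoder (the helper name is mine, not part of any API):

```swift
import Foundation

// Hypothetical helper: reinterpret an OSStatus as the four-character
// code that Core Audio uses for many of its error constants.
func fourCharCode(from status: OSStatus) -> String {
    let n = UInt32(bitPattern: status)
    let bytes = [UInt8((n >> 24) & 0xFF), UInt8((n >> 16) & 0xFF),
                 UInt8((n >> 8) & 0xFF), UInt8(n & 0xFF)]
    return String(bytes: bytes, encoding: .ascii) ?? "\(status)"
}

// 1954115647 is the OSStatus from the question's error message.
print(fourCharCode(from: 1954115647)) // "typ?"
```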

You could very easily play the buffers captured with AVCaptureSession using AVAudioEngine's AVAudioPlayerNode, but at that point you might as well use AVAudioEngine to record from the microphone too:

import UIKit
import AVFoundation

class ViewController: UIViewController {
    var engine = AVAudioEngine()

    override func viewDidLoad() {
        super.viewDidLoad()

        let input = engine.inputNode!
        let player = AVAudioPlayerNode()
        engine.attach(player)

        let bus = 0
        let inputFormat = input.inputFormat(forBus: bus)
        engine.connect(player, to: engine.mainMixerNode, format: inputFormat)

        input.installTap(onBus: bus, bufferSize: 512, format: inputFormat) { (buffer, time) -> Void in
            player.scheduleBuffer(buffer)
        }

        try! engine.start()
        player.play()
    }
}
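As in the question's code, you will likely still want to configure the shared AVAudioSession and obtain microphone permission before starting the engine. A minimal sketch using the same Swift 3-era constants as the question (newer SDKs spell these .playAndRecord and .voiceChat instead):

```swift
import AVFoundation

// Sketch: activate a play-and-record session, then ask for permission
// before calling engine.start() / player.play().
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
    try session.setMode(AVAudioSessionModeVoiceChat)
    try session.setActive(true)
} catch {
    print(error.localizedDescription)
}

session.requestRecordPermission { allowed in
    guard allowed else { return }
    // Safe to start the engine and the player node here.
}
```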

Permalink: https://www.itbaoku.cn/post/924354.html