ios - How to pass a parameter from Objective-C to Swift in init

Tags: ios objective-c swift parameter-passing init

In Apple's sample code HelloMetronome, https://developer.apple.com/library/content/samplecode/HelloMetronome/Introduction/Intro.html#//apple_ref/doc/uid/TP40017587

Apple hard-codes the tempo with self.setTempo(120) at the end of the following code:

override init() {
    super.init()
    // Use two triangle waves which are generated for the metronome bips.

    // Create a standard audio format, deinterleaved float.
    let format = AVAudioFormat(standardFormatWithSampleRate: 44100.0, channels: 2)

    // How many audio frames?
    let bipFrames: UInt32 = UInt32(GlobalConstants.kBipDurationSeconds * Float(format.sampleRate))

    // Create the PCM buffers.
    soundBuffer.append(AVAudioPCMBuffer(pcmFormat: format, frameCapacity: bipFrames))
    soundBuffer.append(AVAudioPCMBuffer(pcmFormat: format, frameCapacity: bipFrames))

    // Fill in the number of valid sample frames in the buffers (required).
    soundBuffer[0]?.frameLength = bipFrames
    soundBuffer[1]?.frameLength = bipFrames

    // Generate the metronome bips: the first buffer will be A440 and the second buffer Middle C.
    let wg1 = TriangleWaveGenerator(sampleRate: Float(format.sampleRate))                     // A 440
    let wg2 = TriangleWaveGenerator(sampleRate: Float(format.sampleRate), frequency: 261.6)   // Middle C
    wg1.render(soundBuffer[0]!)
    wg2.render(soundBuffer[1]!)

    // Connect player -> output, with the format of the buffers we're playing.
    let output: AVAudioOutputNode = engine.outputNode

    engine.attach(player)
    engine.connect(player, to: output, fromBus: 0, toBus: 0, format: format)

    bufferSampleRate = format.sampleRate

    // Create a serial dispatch queue for synchronizing callbacks.
    syncQueue = DispatchQueue(label: "Metronome")

    self.setTempo(120)
}

How can I pass a parameter into the Swift init above from the following Objective-C code, instead of hard-coding 120?

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    NSLog(@"Hello, Metronome!\n");

    NSError *error = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];

    [audioSession setCategory:AVAudioSessionCategoryAmbient error:&error];
    if (error) {
        NSLog(@"AVAudioSession error %ld, %@", (long)error.code, error.localizedDescription);
    }

    [audioSession setActive:YES error:&error];
    if (error) {
        NSLog(@"AVAudioSession error %ld, %@", (long)error.code, error.localizedDescription);
    }

    // If media services are reset, we need to rebuild our audio chain.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(handleMediaServicesWereReset:)
                                                 name:AVAudioSessionMediaServicesWereResetNotification
                                               object:audioSession];

    metronome = [[Metronome alloc] init];
    metronome.delegate = self;
}

Thank you very much!

Best answer

To add a parameter to the Swift initializer, change

override init() {
...
self.setTempo(120)

to something like

init(frequency: Int) {
...
self.setTempo(frequency)

This will allow you to call the init from Objective-C as

[[Metronome alloc] initWithFrequency: (your frequency)];
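Putting the fragments together, here is a minimal, self-contained sketch of the pattern, with the AVFoundation setup omitted; the `tempo` property is illustrative and not part of the original sample:

```swift
// Sketch of a parameterized Swift initializer. In the real sample the
// class is an NSObject subclass and the initializer would be marked
// @objc, which Objective-C then sees as `initWithFrequency:`.
class Metronome {
    private(set) var tempo: Int = 0

    init(frequency: Int) {
        // ... the engine/buffer setup from the sample would go here ...
        setTempo(frequency)
    }

    func setTempo(_ bpm: Int) {
        tempo = bpm
    }
}

let metronome = Metronome(frequency: 96)
// metronome.tempo is now 96 instead of a hard-coded 120
```

On the Objective-C side the call then becomes `[[Metronome alloc] initWithFrequency:96]`, provided the class inherits from NSObject and the initializer is exposed to the Objective-C runtime.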

As for your sound problem, without more context about what you're trying to do it's unclear what is going on, but I would try moving your initialization code from viewDidLoad to viewDidAppear.

A similar question to "ios - How to pass a parameter from Objective-C to Swift in init" can be found on Stack Overflow: https://stackoverflow.com/questions/46467643/
