Background: I found an Apple WWDC session called "AVAudioEngine in Practice" and am trying to build something similar to the last demo shown at 43:35 (https://youtu.be/FlMaxen2eyw?t=2614). I'm using SpriteKit instead of SceneKit, but the principle is the same: I want to generate spheres, throw them around, and when they collide have them trigger a sound, unique to each sphere. My questions are about the details of using AVAudioEngine for this.
Problems:
Problem 1: I want a unique AVAudioPlayerNode tied to each SKSpriteNode so that I can play a different sound for each sphere. Right now, if I create two spheres and set a different pitch for each one's AVAudioPlayerNode, only the most recently created AVAudioPlayerNode seems to play, even when the original sphere is the one that collides. During the demo he mentions "I'm tying a player, a dedicated player, to each ball". How would I go about doing that?
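To make problem 1 concrete, this is essentially what I expect to work, using the createPlayer method from my Audio class shown further down (the pitch values here are arbitrary):

let audio = Audio()
let playerA = audio.createPlayer(-300)   // meant to be sphere A's dedicated player
let playerB = audio.createPlayer(600)    // meant to be sphere B's dedicated player
// Expected: sphere A's collisions play through playerA, sphere B's through playerB.
// Observed: only the most recently created player seems to produce any sound.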
Problem 2: There is an audio click/artifact every time a new collision happens. I assume this has to do with the AVAudioPlayerNodeBufferOptions and/or the fact that I'm creating, scheduling and consuming a buffer every single time contact occurs, which is not the most efficient approach. What would be a good workaround for this?
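For reference, here is a minimal sketch of what I mean by loading the buffer once instead of on every contact. The CollisionSound name and its structure are just my own illustration (not something from the session), and I don't know whether this would actually remove the click:

import AVFoundation

class CollisionSound {

    let player = AVAudioPlayerNode()
    let buffer: AVAudioPCMBuffer

    init(name: String, type: String) {
        // Load the audio file into a buffer once, up front...
        let url = NSBundle.mainBundle().URLForResource(name, withExtension: type)!
        let file = try! AVAudioFile(forReading: url)
        buffer = AVAudioPCMBuffer(PCMFormat: file.processingFormat, frameCapacity: UInt32(file.length))
        try! file.readIntoBuffer(buffer)
    }

    // ...and only schedule the already-loaded buffer on each contact.
    // (Attaching and connecting `player` to the engine is omitted here; that part
    // would stay the same as in my Audio.createPlayer below.)
    func play() {
        player.scheduleBuffer(buffer, atTime: nil, options: .Interrupts, completionHandler: nil)
        player.play()
    }
}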
Code: Since, as mentioned in the video, "...and for every ball that is born into this world, a new player node is also created", I have a separate class for the ball, with a method that returns an SKSpriteNode and creates an AVAudioPlayerNode every time it is called:
import SpriteKit
import AVFoundation

class Sphere {

    var sphere: SKSpriteNode = SKSpriteNode(color: UIColor(), size: CGSize())
    var sphereScale: CGFloat = CGFloat(0.01)
    var spherePlayer = AVAudioPlayerNode()
    let audio = Audio()
    let sphereCollision: UInt32 = 0x1 << 0

    func createSphere(position: CGPoint, pitch: Float) -> SKSpriteNode {

        let texture = SKTexture(imageNamed: "Slice")
        let collisionTexture = SKTexture(imageNamed: "Collision")

        // Define the node
        sphere = SKSpriteNode(texture: texture, size: texture.size())
        sphere.position = position
        sphere.name = "sphere"
        sphere.physicsBody = SKPhysicsBody(texture: collisionTexture, size: sphere.size)
        sphere.physicsBody?.dynamic = true
        sphere.physicsBody?.mass = 0
        sphere.physicsBody?.restitution = 0.5
        sphere.physicsBody?.usesPreciseCollisionDetection = true
        sphere.physicsBody?.categoryBitMask = sphereCollision
        sphere.physicsBody?.contactTestBitMask = sphereCollision
        sphere.zPosition = 1

        // Create a dedicated AVAudioPlayerNode for this sphere
        spherePlayer = audio.createPlayer(pitch)

        return sphere
    }
}
And here is my Audio class, which creates the AVAudioPCMBuffers and AVAudioPlayerNodes:
import AVFoundation

class Audio {

    let engine: AVAudioEngine = AVAudioEngine()

    // Read an audio file from the bundle into a PCM buffer
    func createBuffer(name: String, type: String) -> AVAudioPCMBuffer {
        let audioFilePath = NSBundle.mainBundle().URLForResource(name, withExtension: type)!
        let audioFile = try! AVAudioFile(forReading: audioFilePath)
        let buffer = AVAudioPCMBuffer(PCMFormat: audioFile.processingFormat, frameCapacity: UInt32(audioFile.length))
        try! audioFile.readIntoBuffer(buffer)
        return buffer
    }

    // Create a player -> pitch -> delay chain and connect it to the shared engine
    func createPlayer(pitch: Float) -> AVAudioPlayerNode {
        let player = AVAudioPlayerNode()
        let buffer = self.createBuffer("PianoC1", type: "wav")
        let pitcher = AVAudioUnitTimePitch()
        let delay = AVAudioUnitDelay()

        pitcher.pitch = pitch

        delay.delayTime = 0.2
        delay.feedback = 90
        delay.wetDryMix = 0

        engine.attachNode(pitcher)
        engine.attachNode(player)
        engine.attachNode(delay)

        engine.connect(player, to: pitcher, format: buffer.format)
        engine.connect(pitcher, to: delay, format: buffer.format)
        engine.connect(delay, to: engine.mainMixerNode, format: buffer.format)

        engine.prepare()
        try! engine.start()

        return player
    }
}
In my GameScene class I then test for collisions, and whenever contact occurs I schedule a buffer and play the AVAudioPlayerNode:
func didBeginContact(contact: SKPhysicsContact) {

    let firstBody: SKPhysicsBody = contact.bodyA

    if (firstBody.categoryBitMask & sphere.sphereCollision != 0) {
        let buffer1 = audio.createBuffer("PianoC1", type: "wav")
        sphere.spherePlayer.scheduleBuffer(buffer1, atTime: nil, options: AVAudioPlayerNodeBufferOptions.Interrupts, completionHandler: nil)
        sphere.spherePlayer.play()
    }
}
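Finally, to show what I have in mind for problem 1: is something like the lookup below what "tying a dedicated player to each ball" means? This is only my guess (it would live inside my GameScene), and the spheres array does not exist in my project yet:

// My guess: keep a reference to every Sphere that gets created, find the one whose
// node was involved in the contact, and schedule on *that* sphere's player.
var spheres = [Sphere]()

func didBeginContact(contact: SKPhysicsContact) {

    guard let nodeA = contact.bodyA.node else { return }

    if let hit = spheres.filter({ $0.sphere === nodeA }).first {
        let buffer = audio.createBuffer("PianoC1", type: "wav")
        hit.spherePlayer.scheduleBuffer(buffer, atTime: nil, options: .Interrupts, completionHandler: nil)
        hit.spherePlayer.play()
    }
}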
I'm new to Swift and only have basic programming knowledge, so any advice/criticism is welcome.