
I'd like to know how to update UI elements (UIImageViews) based on the contents of a user-event callback that gets invoked as user events are reached during MIDI playback. More specifically, these user events contain note data that is passed into the callback function (for example, the note being played is 60, a.k.a. middle C). How do I communicate between the user callback and the view?

I want to update my UIImageViews based on the note being played. I tried accessing the UIImageViews from the callback, but because the callback has no direct access to the ViewController, and because it runs on a thread other than the main thread, I was advised to take a different approach.

So what I'd like to do is create a separate controller that can pass the information from the callback to the UI, but I don't know how to go about it. I've posted the code for my ViewController below. It contains the callback function along with all the relevant code for setting up the user events, plus the other view-related pieces.

I'm working in Xcode 4.3.1 against iOS 5, and I'm using ARC.

PracticeViewController.h

#import <UIKit/UIKit.h> 
#import "Lesson.h" 
#import "Note.h" 
#import <AudioToolbox/AudioToolbox.h> 

@interface PracticeViewController : UIViewController 

@property (strong, nonatomic) Lesson *selectedLesson; 
@property (strong, nonatomic) IBOutlet UINavigationItem *practiceWindowTitle; 
@property MusicPlayer player; 

//Outlets for White Keys 
@property (strong, nonatomic) IBOutlet UIImageView *whiteKey21; 
// […] 
@property (strong, nonatomic) IBOutlet UIImageView *whiteKey108; 

//Outlets for Black Keys 
@property (strong, nonatomic) IBOutlet UIImageView *blackKey22; 
// […] 
@property (strong, nonatomic) IBOutlet UIImageView *blackKey106; 

// Key Highlight Images 
@property (strong, nonatomic) UIImage *highlightA; 
@property (strong, nonatomic) UIImage *highlightB; 
@property (strong, nonatomic) UIImage *highlightC; 
@property (strong, nonatomic) UIImage *highlightD; 
@property (strong, nonatomic) UIImage *highlightE; 
@property (strong, nonatomic) UIImage *highlightF; 
@property (strong, nonatomic) UIImage *highlightG; 
@property (strong, nonatomic) UIImage *highlightH; 

- (IBAction)practiceLesson:(id)sender; 

@end 

PracticeViewController.m

#import "PracticeViewController.h" 

@interface PracticeViewController() 

@end 

@implementation PracticeViewController 
@synthesize blackKey22; 
// […] 
@synthesize blackKey106; 
@synthesize whiteKey21; 
// […] 
@synthesize whiteKey108; 

@synthesize selectedLesson, practiceWindowTitle, player, highlightA, highlightB, highlightC, highlightD, highlightE, highlightF, highlightG, highlightH; 

// Implement the UserEvent structure. 

typedef struct UserEvent { 
    UInt32 length; 
    UInt32 typeID; 
    UInt32 trackID; 
    MusicTimeStamp tStamp; 
    MusicTimeStamp dur; 
    int playedNote; 
} UserEvent; 

// Implement the UserCallback function. 

void noteUserCallback (void *inClientData, MusicSequence inSequence, MusicTrack inTrack, MusicTimeStamp inEventTime, const MusicEventUserData *inEventData, MusicTimeStamp inStartSliceBeat, MusicTimeStamp inEndSliceBeat) 
{  
    UserEvent *event = (UserEvent *)inEventData; 
    UInt32 size = event->length; 
    int note = event->playedNote; 
    MusicTimeStamp timestamp = event->tStamp; // MusicTimeStamp is a Float64, so keep it as one 
    NSLog(@"Size: %lu Note: %d, Timestamp: %f", (unsigned long)size, note, timestamp); 
} 

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil 
{ 
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil]; 
    if (self) { 
     // Custom initialization 
    } 

    return self; 
} 

- (void)viewDidLoad 
{ 
    [super viewDidLoad]; 

    self.practiceWindowTitle.title = selectedLesson.titleAndSubtitle; 

    // Load in the images for the glow. 
    highlightA = [UIImage imageNamed:@"glow_whiteKeysA.png"]; 
    highlightB = [UIImage imageNamed:@"glow_whiteKeysB.png"]; 
    highlightC = [UIImage imageNamed:@"glow_whiteKeysC.png"]; 
    highlightD = [UIImage imageNamed:@"glow_whiteKeysD.png"]; 
    highlightE = [UIImage imageNamed:@"glow_whiteKeysE.png"]; 
    highlightF = [UIImage imageNamed:@"glow_whiteKeysF.png"]; 
    highlightG = [UIImage imageNamed:@"glow_blackKey.png"]; 
    highlightH = [UIImage imageNamed:@"glow_whiteKeysH.png"]; 

    // Create player, sequence, left/right hand tracks, and iterator. 

    NewMusicPlayer(&player); 
    MusicSequence sequence; 
    NewMusicSequence(&sequence); 
    MusicTrack rightHand; 
    MusicTrack leftHand; 
    MusicEventIterator iterator; 

    // Load in MIDI file. 

    NSString *path = [[NSBundle mainBundle] pathForResource:selectedLesson.midiFilename ofType:@"mid"]; 
    NSURL *url = [NSURL fileURLWithPath:path]; 
    MusicSequenceFileLoad(sequence, (__bridge CFURLRef)url, 0, kMusicSequenceLoadSMF_ChannelsToTracks); 

    // Get the right and left hand tracks from the sequence. 

    int rightHandIndex = 0; 
    //int leftHandIndex = 1; 

    MusicSequenceGetIndTrack(sequence, rightHandIndex, &rightHand); //Get right hand. 
    //MusicSequenceGetIndTrack(sequence, leftHandIndex, leftHand); //Get left hand. 

    //Iterate through the right hand track and add user events. 

    Boolean hasNextEvent = false; 
    Boolean hasEvent = false; 

    NewMusicEventIterator(rightHand, &iterator); 
    MusicEventIteratorHasCurrentEvent(iterator, &hasEvent); 
    MusicEventIteratorHasNextEvent(iterator, &hasNextEvent); 

    while (hasNextEvent == true) { 
     MusicTimeStamp timestamp = 0; 
     MusicEventType eventType = 0; 
     const void *eventData = NULL; 
     int note; 
     MusicTimeStamp duration; 

     MusicEventIteratorGetEventInfo(iterator, &timestamp, &eventType, &eventData, NULL); 

     if (eventType == kMusicEventType_MIDINoteMessage) { 
      MIDINoteMessage *noteMessage = (MIDINoteMessage *)eventData; 
      note = noteMessage->note; 
      duration = noteMessage->duration; 
      UserEvent event; 

      event.length = sizeof(UserEvent); 
      event.playedNote = note; 
      event.tStamp = timestamp; 

      MusicEventUserData* data = (MusicEventUserData *)&event; 
      MusicTrackNewUserEvent(rightHand, timestamp, data); 
     } 

     MusicEventIteratorHasNextEvent(iterator, &hasNextEvent); 
     MusicEventIteratorNextEvent(iterator); 
    } 

    MusicSequenceSetUserCallback(sequence, noteUserCallback, NULL); 

    MusicPlayerSetSequence(player, sequence); 
    MusicPlayerPreroll(player); 
} 

- (void)viewDidUnload 
{ 
    [self setPracticeWindowTitle:nil]; 
    [self setWhiteKey21:nil]; 
    // […] 
    [self setWhiteKey108:nil]; 
    [self setBlackKey22:nil]; 
    // […] 
    [self setBlackKey106:nil]; 
    [super viewDidUnload]; 
    // Release any retained subviews of the main view. 
} 

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation 
{ 
    return YES; 
} 

- (IBAction)practiceLesson:(id)sender { 
    MusicPlayerStart(player); 
} 
@end 

Wow, that's a lot of IBOutlets – meggar 2012-03-19 19:22:09


5 is a lot of 'IBOutlets'? – msgambel 2012-03-19 19:30:35


It's just truncated. There are 88 UIImageView IBOutlets, one for each key of a full keyboard. Thanks by the way, Emil :) – Barks 2012-03-19 19:33:57

Answers


You were on the right track in your other question. I'm not sure why you think you need a completely different approach.

In your callback, be sure to use -performSelectorOnMainThread: or dispatch_async with dispatch_get_main_queue() so that any work touching the UI happens on the main thread. If possible, any code that uses the PracticeViewController should run on the main thread.

For example:

void noteUserCallback (void *inClientData, MusicSequence inSequence, MusicTrack inTrack, MusicTimeStamp inEventTime, const MusicEventUserData *inEventData, MusicTimeStamp inStartSliceBeat, MusicTimeStamp inEndSliceBeat) 
{ 
    PracticeViewController* pvc = (__bridge PracticeViewController *)inClientData; 

    dispatch_async(dispatch_get_main_queue(), ^{ 
     [pvc.whiteKey21 setImage:pvc.highlightA]; 
    }); 
    ... 
} 
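
For the __bridge cast above to hand back the view controller, the controller has to be supplied as inClientData when the callback is registered. A minimal sketch of that change, assuming the callback is still registered from PracticeViewController's viewDidLoad (the question's code currently passes NULL there):

// Register the callback with the view controller as its client data, 
// replacing the NULL client data in the question's viewDidLoad. 
MusicSequenceSetUserCallback(sequence, noteUserCallback, (__bridge void *)self); 

From there the callback can read playedNote out of the UserEvent, as the question's NSLog already does, and pick which key outlet and highlight image to set before dispatching to the main queue.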

Well, I'll be darned. I guess I should have stuck with it a while longer; a friend of mine thought that approach was too awkward. Switching threads did the trick! :) Thank you so much, Kurt, I really appreciate it. I've been stuck on this for days. Out of curiosity, why can't the UI be updated from another thread? – Barks 2012-03-19 19:50:46


The short answer is "that's just how AppKit and UIKit work." It's an engineering trade-off. A user interface needs to operate in a sequential fashion: it repeatedly takes input, sends it to handlers, and redraws the screen, so the most natural way to do that is to run in order on a single thread. If another thread changes UI data in the middle of that process, the UI thread won't be ready to handle the change. You could argue that UIKit should handle this automatically, but that's easier said than done, especially while keeping performance high on very constrained devices. – 2012-03-19 20:11:48
