Enable Audio in your Application (JavaScript)

In this guide we'll cover adding audio events to the Conversation we have created in the simple conversation with events guide. We'll deal with sending and receiving media events to and from the conversation.

Concepts

This guide will introduce you to the following concepts:

  • Audio Stream - The stream that the SDK gives you in your browser to listen to audio and send audio
  • Audio Leg - A server-side API term. Legs are part of a conversation. When audio is enabled on a conversation, a leg is created
  • Media Event - a member:media event that fires on a Conversation when the media state changes for a member

Before you begin

Make sure you have completed the simple conversation with events guide; this guide builds directly on that application.

1 - Update the JavaScript App

We will use the application we created for the third getting started guide. All the basic setup has been done in the previous guides and should be in place. We can now focus on updating the client-side application.

1.1 - Add audio UI

First, we'll add the UI for the user to enable and disable audio, as well as an <audio> element that we'll use to play the Audio stream from the conversation. Let's add the UI at the top of the messages area.

<section id="messages">
  <div>
    <audio id="audio">
      <source>
    </audio>
    <button id="enable">Enable Audio</button>
    <button id="disable">Disable Audio</button>
  </div>
  ...
</section>

Then add the buttons and <audio> element in the class constructor:

constructor() {
...
  this.audio = document.getElementById('audio')
  this.enableButton = document.getElementById('enable')
  this.disableButton = document.getElementById('disable')
}

1.2 - Add enable audio handler

We'll then update the setupUserEvents method to trigger conversation.media.enable() when the user clicks the Enable Audio button. conversation.media.enable() returns a promise that resolves with a stream object, which we'll use as the source for our <audio> element. We'll then add a listener on the <audio> element to start playing as soon as the metadata has been loaded.

setupUserEvents() {
...
  this.enableButton.addEventListener('click', () => {
    this.conversation.media
      .enable()
      .then(stream => {
        // Older browsers may not have srcObject
        if ("srcObject" in this.audio) {
          this.audio.srcObject = stream;
        } else {
          // Avoid using this in new browsers, as it is going away.
          this.audio.src = window.URL.createObjectURL(stream);
        }

        this.audio.onloadedmetadata = () => {
          this.audio.play();
        }

        this.eventLogger('member:media')()
      })
      .catch(this.errorLogger)
  })
}

Note that enabling audio in a conversation establishes an audio leg for a member of the conversation. The audio is only streamed to other members of the conversation who have also enabled audio.

1.3 - Add disable audio handler

Next, we'll add the ability for a user to disable the audio stream as well. In order to do this, we'll update the setupUserEvents method to trigger conversation.media.disable() when the user clicks the Disable Audio button.

setupUserEvents() {
...
  this.disableButton.addEventListener('click', () => {
    this.conversation.media
      .disable()
      .then(this.eventLogger('member:media'))
      .catch(this.errorLogger)
  })
}
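Optionally, you may also want to pause the <audio> element once media has been disabled, so it doesn't keep playing an ended stream. This is a small addition on top of the tutorial code, not something the SDK requires:

this.disableButton.addEventListener('click', () => {
  this.conversation.media
    .disable()
    .then(() => {
      // Optional: stop local playback now that the audio leg is closed
      this.audio.pause()
      this.eventLogger('member:media')()
    })
    .catch(this.errorLogger)
})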

1.4 - Add member:media listener

With these first parts in place, we're sending member:media events into the conversation. Now we'll register a listener for them as well, one that updates the messageFeed. To do that, add a listener for member:media events at the end of the setupConversationEvents method:

setupConversationEvents(conversation) {
  ...

  conversation.on("member:media", (member, event) => {
    console.log(`*** Member changed media state`, member, event)
    const text = `${member.user.name} <b>${event.body.audio ? 'enabled' : 'disabled'} audio in the conversation</b><br>`
    this.messageFeed.innerHTML = text + this.messageFeed.innerHTML
  })

}

If we want the conversation history to be updated, we need to add a case for member:media in the showConversationHistory switch:

showConversationHistory(conversation) {
  ...
  switch (value.type) {
    ...
    case 'member:media':
      eventsHistory = `${conversation.members.get(value.from).user.name} @ ${date}: <b>${value.body.audio ? "enabled" : "disabled"} audio</b><br>` + eventsHistory
      break;
    ...
  }
}

1.5 - Open the conversation in two browser windows

Now run index.html in two side-by-side browser windows, making sure to log in with the user name jamie in one and with alice in the other. Enable audio on both and start talking. You'll also see events being logged in the browser console.

That's it! You can now enable and disable audio in the conversation.

Enable Audio in your Application (Android, Kotlin)

In this guide we'll cover adding audio events to the Conversation we have created in the creating a chat app tutorial guide. We'll deal with sending and receiving media events to and from the conversation.

Concepts

This guide will introduce you to the following concepts:

  • Audio Leg - A server-side API term. Legs are part of a conversation. When audio is enabled on a conversation, a leg is created
  • Media Event - a NexmoMediaEvent event that fires on a Conversation when the media state changes for a member

Before you begin

Run through the creating a chat app tutorial. You will be building on top of this project.

Add audio permissions

Since enabling audio uses the device microphone, you will need to ask the user for permission.

Add a new entry in the app/src/AndroidManifest.xml file (below the last <uses-permission> tag):

<uses-permission android:name="android.permission.RECORD_AUDIO" />

Request permission on application start

Add a requestCallPermissions method inside the LoginFragment class:

private fun requestCallPermissions() {
    val callsPermissions = arrayOf(Manifest.permission.RECORD_AUDIO)
    val CALL_PERMISSIONS_REQUEST = 123
    ActivityCompat.requestPermissions(requireActivity(), callsPermissions, CALL_PERMISSIONS_REQUEST)
}
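The tutorial fires the permission request and moves on without checking the result. If you later want to guard enableMedia() against the case where the user denied the permission, a minimal check could look like the sketch below (hasAudioPermission is a hypothetical helper, not part of the tutorial code):

import android.Manifest
import android.content.pm.PackageManager
import androidx.core.content.ContextCompat

// Returns true when RECORD_AUDIO has already been granted
private fun hasAudioPermission(): Boolean =
    ContextCompat.checkSelfPermission(requireContext(), Manifest.permission.RECORD_AUDIO) ==
        PackageManager.PERMISSION_GRANTED

You could then call hasAudioPermission() before invoking viewModel.enableMedia() and re-request the permission if it returns false.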

Call the requestCallPermissions method inside the onViewCreated method:

override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
    // ...

    requestCallPermissions()
}

Add audio UI

You will now need to add two buttons for the user to enable and disable audio. Open the app/src/main/res/layout/fragment_chat.xml file and add two new buttons (enableMediaButton and disableMediaButton) just below the sendMessageButton:

        <!--...-->

        <Button
                android:id="@+id/sendMessageButton"
                android:layout_width="wrap_content"
                android:layout_height="0dp"
                android:text="@string/send"
                app:layout_constraintBottom_toBottomOf="parent"
                app:layout_constraintLeft_toRightOf="@id/messageEditText"
                app:layout_constraintRight_toRightOf="parent"
                app:layout_constraintTop_toBottomOf="@+id/conversationEventsScrollView" />

        <Button
                android:id="@+id/enableMediaButton"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                app:layout_constraintBottom_toTopOf="@id/conversationEventsScrollView"
                app:layout_constraintLeft_toLeftOf="parent"
                app:layout_constraintTop_toTopOf="parent"
                android:text="Enable Audio" />

        <Button
                android:id="@+id/disableMediaButton"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                app:layout_constraintBottom_toTopOf="@id/conversationEventsScrollView"
                app:layout_constraintLeft_toLeftOf="parent"
                app:layout_constraintTop_toTopOf="parent"
                android:visibility="gone"
                android:text="Disable Audio"
                tools:visibility="visible"/>

    </androidx.constraintlayout.widget.ConstraintLayout>

</androidx.constraintlayout.widget.ConstraintLayout>

Enable and disable audio

Add listeners to the buttons inside the onViewCreated method of ChatFragment:

enableMediaButton.setOnClickListener {
    viewModel.enableMedia()
    enableMediaButton.visibility = View.GONE
    disableMediaButton.visibility = View.VISIBLE
}

disableMediaButton.setOnClickListener {
    viewModel.disableMedia()
    enableMediaButton.visibility = View.VISIBLE
    disableMediaButton.visibility = View.GONE
}

Add two methods to ChatViewModel:

fun disableMedia() {
    conversation?.disableMedia()
}

@SuppressLint("MissingPermission")
fun enableMedia() {
    conversation?.enableMedia()
}

NOTE: Enabling audio in a conversation establishes an audio leg for a member of the conversation. The audio is only streamed to other members of the conversation who have also enabled audio.

Display audio events

When enabling media, NexmoMediaEvent events are sent to the conversation. To display these events you will need to add a NexmoMediaEventListener. Replace the whole getConversation method in the ChatViewModel:

private fun getConversation() {
    client.getConversation(Config.CONVERSATION_ID, object : NexmoRequestListener<NexmoConversation> {
        override fun onSuccess(conversation: NexmoConversation?) {
            this@ChatViewModel.conversation = conversation

            conversation?.let {
                getConversationEvents(it)
                it.addMessageEventListener(messageListener)

                it.addMediaEventListener(object : NexmoMediaEventListener {
                    override fun onMediaEnabled(mediaEvent: NexmoMediaEvent) {
                        updateConversation(mediaEvent)
                    }

                    override fun onMediaDisabled(mediaEvent: NexmoMediaEvent) {
                        updateConversation(mediaEvent)
                    }
                })
            }
        }

        override fun onError(apiError: NexmoApiError) {
            this@ChatViewModel.conversation = null
            _errorMessage.postValue("Error: Unable to load conversation ${apiError.message}")
        }
    })
}

The conversationEvents observer has to support the newly added NexmoMediaEvent type. Add a new branch to the when expression:

private var conversationEvents = Observer<List<NexmoEvent>?> { events ->
    val events = events?.mapNotNull {
        when (it) {
            is NexmoMemberEvent -> getConversationLine(it)
            is NexmoTextEvent -> getConversationLine(it)
            is NexmoMediaEvent -> getConversationLine(it)
            else -> null
        }
    }

    // ...

The getConversationLine method needs to support the NexmoMediaEvent type as well, so add an overload for it:

private fun getConversationLine(mediaEvent: NexmoMediaEvent): String? {
    val user = mediaEvent.fromMember.user.name
    return "$user media state: ${mediaEvent.mediaState}"
}

Build and run

Build and run again (Ctrl + R on macOS or Shift + F10 on Windows/Linux in Android Studio). Once logged in you can enable or disable audio. To test it out you can run the app on two different devices.

Enable Audio in your Application (Android, Java)

In this guide we'll cover adding audio events to the Conversation we have created in the creating a chat app tutorial guide. We'll deal with sending and receiving media events to and from the conversation.

Concepts

This guide will introduce you to the following concepts:

  • Audio Leg - A server-side API term. Legs are part of a conversation. When audio is enabled on a conversation, a leg is created
  • Media Event - a NexmoMediaEvent event that fires on a Conversation when the media state changes for a member

Before you begin

Run through the creating a chat app tutorial. You will be building on top of this project.

Add audio permissions

Since enabling audio uses the device microphone, you will need to ask the user for permission.

Add a new entry in the app/src/AndroidManifest.xml file (below the last <uses-permission> tag):

<uses-permission android:name="android.permission.RECORD_AUDIO" />

Request permission on application start

Add a requestCallPermissions method inside the LoginFragment class:

private void requestCallPermissions() {
    String[] callsPermissions = {Manifest.permission.RECORD_AUDIO};
    int CALL_PERMISSIONS_REQUEST = 123;

    ActivityCompat.requestPermissions(requireActivity(), callsPermissions, CALL_PERMISSIONS_REQUEST);
}
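The tutorial fires the permission request and moves on without checking the result. If you later want to guard enableMedia() against the case where the user denied the permission, a minimal check could look like the sketch below (hasAudioPermission is a hypothetical helper, not part of the tutorial code):

import android.Manifest;
import android.content.pm.PackageManager;
import androidx.core.content.ContextCompat;

// Returns true when RECORD_AUDIO has already been granted
private boolean hasAudioPermission() {
    return ContextCompat.checkSelfPermission(requireContext(), Manifest.permission.RECORD_AUDIO)
            == PackageManager.PERMISSION_GRANTED;
}

You could then call hasAudioPermission() before invoking viewModel.enableMedia() and re-request the permission if it returns false.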

Call the requestCallPermissions method inside the onViewCreated method:

@Override
public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) {
    // ...

    requestCallPermissions();
}

Add audio UI

You will now need to add two buttons for the user to enable and disable audio. Open the app/src/main/res/layout/fragment_chat.xml file and add two new buttons (enableMediaButton and disableMediaButton) just below the sendMessageButton:

        <!--...-->

        <Button
                android:id="@+id/sendMessageButton"
                android:layout_width="wrap_content"
                android:layout_height="0dp"
                android:text="@string/send"
                app:layout_constraintBottom_toBottomOf="parent"
                app:layout_constraintLeft_toRightOf="@id/messageEditText"
                app:layout_constraintRight_toRightOf="parent"
                app:layout_constraintTop_toBottomOf="@+id/conversationEventsScrollView" />

        <Button
                android:id="@+id/enableMediaButton"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                app:layout_constraintBottom_toTopOf="@id/conversationEventsScrollView"
                app:layout_constraintLeft_toLeftOf="parent"
                app:layout_constraintTop_toTopOf="parent"
                android:text="Enable Audio" />

        <Button
                android:id="@+id/disableMediaButton"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                app:layout_constraintBottom_toTopOf="@id/conversationEventsScrollView"
                app:layout_constraintLeft_toLeftOf="parent"
                app:layout_constraintTop_toTopOf="parent"
                android:visibility="gone"
                android:text="Disable Audio"
                tools:visibility="visible"/>

    </androidx.constraintlayout.widget.ConstraintLayout>

</androidx.constraintlayout.widget.ConstraintLayout>

Now you need to make sure that these buttons are accessible in the fragment. Add two new properties in the ChatFragment class:

Button enableMediaButton;
Button disableMediaButton;

Retrieve the button references by adding findViewById calls in the onViewCreated method:

public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) {

    //...
    enableMediaButton = view.findViewById(R.id.enableMediaButton);
    disableMediaButton = view.findViewById(R.id.disableMediaButton);
}

Enable and disable audio

Add listeners to the buttons inside the onViewCreated method of ChatFragment:

enableMediaButton.setOnClickListener(it -> {
    viewModel.enableMedia();
    enableMediaButton.setVisibility(View.GONE);
    disableMediaButton.setVisibility(View.VISIBLE);
});

disableMediaButton.setOnClickListener(it -> {
    viewModel.disableMedia();
    enableMediaButton.setVisibility(View.VISIBLE);
    disableMediaButton.setVisibility(View.GONE);
});

Add two methods to ChatViewModel:

public void disableMedia() {
    conversation.disableMedia();
}

@SuppressLint("MissingPermission")
public void enableMedia() {
    conversation.enableMedia();
}

NOTE: Enabling audio in a conversation establishes an audio leg for a member of the conversation. The audio is only streamed to other members of the conversation who have also enabled audio.

Display audio events

When enabling media, NexmoMediaEvent events are sent to the conversation. To display these events you will need to add a NexmoMediaEventListener. Replace the whole getConversation method in the ChatViewModel:

private void getConversation() {
    client.getConversation(Config.CONVERSATION_ID, new NexmoRequestListener<NexmoConversation>() {
        @Override
        public void onSuccess(@Nullable NexmoConversation conversation) {
            ChatViewModel.this.conversation = conversation;

            if (ChatViewModel.this.conversation != null) {
                getConversationEvents(ChatViewModel.this.conversation);
                ChatViewModel.this.conversation.addMessageEventListener(messageListener);

                ChatViewModel.this.conversation.addMediaEventListener(new NexmoMediaEventListener() {
                    @Override
                    public void onMediaEnabled(@NonNull NexmoMediaEvent nexmoMediaEvent) {
                        updateConversation(nexmoMediaEvent);
                    }

                    @Override
                    public void onMediaDisabled(@NonNull NexmoMediaEvent nexmoMediaEvent) {
                        updateConversation(nexmoMediaEvent);
                    }
                });
            }
        }

        @Override
        public void onError(@NonNull NexmoApiError apiError) {
            ChatViewModel.this.conversation = null;
            _errorMessage.postValue("Error: Unable to load conversation " + apiError.getMessage());
        }
    });
}

The conversationEvents observer has to support the newly added NexmoMediaEvent type. Add a new branch to the if statement:

private Observer<ArrayList<NexmoEvent>> conversationEvents = events -> {

        //...

        if (event instanceof NexmoMemberEvent) {
            line = getConversationLine((NexmoMemberEvent) event);
        } else if (event instanceof NexmoTextEvent) {
            line = getConversationLine((NexmoTextEvent) event);
        } else if (event instanceof NexmoMediaEvent) {
            line = getConversationLine((NexmoMediaEvent) event);
        }

        //...
    };

The getConversationLine method needs to support the NexmoMediaEvent type as well, so add an overload for it:

private String getConversationLine(NexmoMediaEvent mediaEvent) {
    String user = mediaEvent.getFromMember().getUser().getName();
    return user + " media state: " + mediaEvent.getMediaState();
}

Build and run

Build and run again (Ctrl + R on macOS or Shift + F10 on Windows/Linux in Android Studio). Once logged in you can enable or disable audio. To test it out you can run the app on two different devices.

Enable Audio in your Application (iOS, Swift)

In this guide we'll cover adding audio events to the Conversation we have created in the creating a chat app tutorial guide. We'll deal with sending and receiving media events to and from the conversation.

Concepts

This guide will introduce you to the following concepts:

  • Audio Leg - A server-side API term. Legs are part of a conversation. When audio is enabled on a conversation, a leg is created
  • Media Event - a NXMMediaEvent event that fires on a Conversation when the media state changes for a member

Before you begin

Run through the creating a chat app tutorial. You will be building on top of this project.

Add audio permissions

Since enabling audio uses the device microphone, you will need to ask the user for permission.

Info.plist

Every Xcode project contains an Info.plist file containing all the metadata required in each app or bundle - you will find the file inside the AppToAppChat group.

A new entry in the Info.plist file is required:

  1. Hover your mouse over the last entry in the list and click the little + button that appears.

  2. From the dropdown list, pick Privacy - Microphone Usage Description and add "Microphone access required in order to make and receive audio calls." for its value (the equivalent raw plist entry is shown below).
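If you edit the Info.plist as source code rather than through the property list editor, the same entry is stored under the NSMicrophoneUsageDescription key:

<key>NSMicrophoneUsageDescription</key>
<string>Microphone access required in order to make and receive audio calls.</string>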

Request permission on application start

Open AppDelegate.swift and import the AVFoundation library right after where UIKit is included.

import UIKit
import AVFoundation

Next, call requestRecordPermission: inside application:didFinishLaunchingWithOptions:.

func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    // Override point for customization after application launch.
    AVAudioSession.sharedInstance().requestRecordPermission { (granted:Bool) in
        NSLog("Allow microphone use. Response: %d", granted)
    }
    return true
}

Add audio UI

You will now need to add a button for the user to enable and disable audio. In the viewDidLoad function of the ChatViewController class (ChatViewController.swift), add a new bar button:

navigationItem.rightBarButtonItem = UIBarButtonItem(title: "Start Audio", style: .plain, target: self, action: #selector(self.toggleAudio))

Enable audio

The next step is to enable audio. Add a property to the ChatViewController class:

var audioEnabled = false

The bar button from the previous step calls a toggleAudio function when tapped, so add the following function to the ChatViewController class:

@objc func toggleAudio() {
    if audioEnabled {
        conversation?.disableMedia()
        navigationItem.rightBarButtonItem?.title = "Start Audio"
        audioEnabled = false
    } else {
        conversation?.enableMedia()
        navigationItem.rightBarButtonItem?.title = "Stop Audio"
        audioEnabled = true
    }
}

Note that enabling audio in a conversation establishes an audio leg for a member of the conversation. The audio is only streamed to other members of the conversation who have also enabled audio.
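Since enableMedia() uses the microphone, you may also want to check that the user actually granted record permission before enabling audio. One possible guard, using AVAudioSession (an addition to the tutorial code; remember to import AVFoundation in ChatViewController.swift):

if AVAudioSession.sharedInstance().recordPermission == .granted {
    conversation?.enableMedia()
} else {
    NSLog("Microphone permission not granted")
}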

Display audio events

When enabling media, NXMMediaEvent events are sent to the conversation. To display these, you will need to implement a function from NXMConversationDelegate that appends the media events to the events array for processing:

extension ChatViewController: NXMConversationDelegate {
  ...

  func conversation(_ conversation: NXMConversation, didReceive event: NXMMediaEvent) {
      self.events?.append(event)
  }
}

In the processEvents function, you will need to add a clause for NXMMediaEvent, which in turn calls showMediaEvent to display the audio events:

func processEvents() {
    DispatchQueue.main.async { [weak self] in
       ...
        self.events?.forEach { event in
            ...
            if let mediaEvent = event as? NXMMediaEvent {
                self.showMediaEvent(event: mediaEvent)
            }
        }
    }
}

func showMediaEvent(event: NXMMediaEvent) {
    if event.isEnabled {
        addConversationLine("\(event.fromMember?.user.name ?? "A user") enabled audio")
    } else {
        addConversationLine("\(event.fromMember?.user.name ?? "A user") disabled audio")
    }
}

Build and run

Press Cmd + R to build and run again. Once logged in you can enable or disable audio. To test it out you can run the app on two different simulators/devices.


Enable Audio in your Application (iOS, Objective-C)

In this guide we'll cover adding audio events to the Conversation we have created in the creating a chat app tutorial guide. We'll deal with sending and receiving media events to and from the conversation.

Concepts

This guide will introduce you to the following concepts:

  • Audio Leg - A server-side API term. Legs are part of a conversation. When audio is enabled on a conversation, a leg is created
  • Media Event - a NXMMediaEvent event that fires on a Conversation when the media state changes for a member

Before you begin

Run through the creating a chat app tutorial. You will be building on top of this project.

Add audio permissions

Since enabling audio uses the device microphone, you will need to ask the user for permission.

Info.plist

Every Xcode project contains an Info.plist file containing all the metadata required in each app or bundle - you will find the file inside the AppToAppChat group.

A new entry in the Info.plist file is required:

  1. Hover your mouse over the last entry in the list and click the little + button that appears.

  2. From the dropdown list, pick Privacy - Microphone Usage Description and add "Microphone access required in order to make and receive audio calls." for its value (the equivalent raw plist entry is shown below).
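If you edit the Info.plist as source code rather than through the property list editor, the same entry is stored under the NSMicrophoneUsageDescription key:

<key>NSMicrophoneUsageDescription</key>
<string>Microphone access required in order to make and receive audio calls.</string>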

Request permission on application start

Open AppDelegate.h and import the AVFoundation library right after where UIKit is included.

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

Next, call requestRecordPermission: inside application:didFinishLaunchingWithOptions: within AppDelegate.m.

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Override point for customization after application launch.
    [AVAudioSession.sharedInstance requestRecordPermission:^(BOOL granted) {
        NSLog(@"Allow microphone use. Response: %d", granted);
    }];
    return YES;
}

Add audio UI

You will now need to add a button for the user to enable and disable audio. In the viewDidLoad function of the ChatViewController class (ChatViewController.m), add a new bar button:

self.navigationItem.rightBarButtonItem = [[UIBarButtonItem alloc] initWithTitle:@"Start Audio" style:UIBarButtonItemStyleDone target:self action:@selector(toggleAudio)];

Enable audio

The next step is to enable audio. Add a property to the ChatViewController interface:

@interface ChatViewController () <UITextFieldDelegate, NXMConversationDelegate>
...
@property BOOL audioEnabled;
@end

The bar button from the previous step calls a toggleAudio function when tapped, so add the following function to the ChatViewController class:

- (void)toggleAudio {
    if (self.audioEnabled) {
        [self.conversation disableMedia];
        self.navigationItem.rightBarButtonItem.title = @"Start Audio";
        self.audioEnabled = NO;
    } else {
        [self.conversation enableMedia];
        self.navigationItem.rightBarButtonItem.title = @"Stop Audio";
        self.audioEnabled = YES;
    }
}

Note that enabling audio in a conversation establishes an audio leg for a member of the conversation. The audio is only streamed to other members of the conversation who have also enabled audio.
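Since enableMedia uses the microphone, you may also want to check that the user actually granted record permission before enabling audio. One possible guard, using AVAudioSession (an addition to the tutorial code; remember to #import <AVFoundation/AVFoundation.h> in ChatViewController.m):

if (AVAudioSession.sharedInstance.recordPermission == AVAudioSessionRecordPermissionGranted) {
    [self.conversation enableMedia];
} else {
    NSLog(@"Microphone permission not granted");
}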

Display audio events

When enabling media, NXMMediaEvent events are sent to the conversation. To display these, you will need to implement a function from NXMConversationDelegate that appends the media events to the events array for processing:

- (void)conversation:(NXMConversation *)conversation didReceiveMediaEvent:(NXMMediaEvent *)event {
    [self.events addObject:event];
    [self processEvents];
}

In the processEvents function, you will need to add a clause for NXMMediaEvent, which in turn calls showMediaEvent to display the audio events:

- (void)processEvents {
    dispatch_async(dispatch_get_main_queue(), ^{
        self.conversationTextView.text = @"";
        for (NXMEvent *event in self.events) {
            if ([event isMemberOfClass:[NXMMemberEvent class]]) {
                [self showMemberEvent:(NXMMemberEvent *)event];
            } else if ([event isMemberOfClass:[NXMTextEvent class]]) {
                [self showTextEvent:(NXMTextEvent *)event];
            } else if ([event isMemberOfClass:[NXMMediaEvent class]]) {
                [self showMediaEvent:(NXMMediaEvent *)event];
            }
        }
    });
}

- (void) showMediaEvent:(NXMMediaEvent *)event {
    if (event.isEnabled) {
        [self addConversationLine:[NSString stringWithFormat:@"%@ enabled audio", event.fromMember.user.name]];
    } else {
        [self addConversationLine:[NSString stringWithFormat:@"%@ disabled audio", event.fromMember.user.name]];
    }
}

Build and run

Press Cmd + R to build and run again. Once logged in you can enable or disable audio. To test it out you can run the app on two different simulators/devices.
