Introduction

Building a live streaming app for iOS follows a structured process. Developers first need to select a suitable streaming protocol, such as HLS (HTTP Live Streaming), to ensure compatibility with iOS devices. User interface design is equally important, incorporating features like live chat, reactions, and notifications to boost engagement. When choosing a streaming provider, it's essential to evaluate performance, reliability, ease of integration, and platform support.

Here, VideoSDK stands out as a comprehensive solution, offering SDKs for iOS, Android, Flutter, and React Native. Its key advantage is an intuitive, easy-to-integrate SDK tailored specifically for live streaming apps, delivering seamless performance, robust features, and excellent support across platforms.

Benefits of an iOS live streaming app:

  • Real-Time Engagement: Live streaming allows for instant interaction and engagement with the audience, fostering a sense of connection and community.
  • Increased Reach: With live streaming, you can reach a wider audience, as viewers can tune in from anywhere in the world.
  • Cost-Effective Marketing: Compared to traditional marketing methods, live streaming can be a cost-effective way to promote your brand, as it requires minimal equipment and can be done from anywhere.
  • Builds Trust and Authenticity: Live streaming provides an authentic and transparent way to connect with your audience, helping to build trust and credibility.

Use cases of an iOS live streaming app:

  • Live Events and Webinars: Host live events or webinars to engage with your audience in real time, share knowledge, and provide valuable information on specific topics. You can create a QR code to allow quick access to the event link or resources.
  • Product Launches: Live stream product launches to showcase new products or services, demonstrate features, and answer audience questions in real time.
  • Q&A Sessions: Host live Q&A sessions to interact with your audience, address their questions, and provide valuable insights or advice.

How to Develop a Live Streaming App with VideoSDK

Getting Started with VideoSDK

VideoSDK lets you integrate video & audio calling into Web, Android, and iOS applications across many different frameworks. It is an infrastructure solution that provides programmable SDKs and REST APIs for building scalable video conferencing applications. This guide will get you up and running with VideoSDK video & audio calling in minutes.

Create a VideoSDK Account

Go to your VideoSDK dashboard and sign up if you don't have an account. This account gives you access to the required Video SDK token, which acts as an authentication key that allows your application to interact with VideoSDK functionality.

Generate your Auth Token

Visit your VideoSDK dashboard and navigate to the "API Key" section to generate your auth token. This token is crucial in authorizing your application to use VideoSDK features. For a more visual understanding of the account creation and token generation process, consider referring to the provided tutorial.

Prerequisites and Setup

  • iOS 11.0+
  • Xcode 12.0+
  • Swift 5.0+

This app will contain two screens:

Join Screen: This screen allows the user to either create a new meeting or join an existing one.

Meeting Screen: This screen contains the local and remote participant views along with meeting controls, such as enabling/disabling the mic and camera and leaving the meeting.

Integrate VideoSDK​

To install VideoSDK, first initialize CocoaPods in your project by running the following command:

pod init

This creates a Podfile in your project folder. Open it and add the VideoSDK dependency as shown below:

pod 'VideoSDKRTC', :git => 'https://github.com/videosdk-live/videosdk-rtc-ios-sdk.git'

Then run the following command to install the pod:

pod install

Then declare the camera and microphone permissions in Info.plist:

<key>NSCameraUsageDescription</key>
<string>Camera permission description</string>
<key>NSMicrophoneUsageDescription</key>
<string>Microphone permission description</string>
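
Declaring these keys only supplies the prompt text; iOS still asks the user for permission the first time the camera or microphone is accessed. As an optional, illustrative sketch (the helper below is not part of the VideoSDK API), you can request both permissions up front before joining a meeting:

```swift
import AVFoundation

/// Requests camera and microphone access up front so the user is not
/// interrupted mid-meeting. Illustrative helper, not part of VideoSDK.
func requestMediaPermissions(completion: @escaping (Bool) -> Void) {
    AVCaptureDevice.requestAccess(for: .video) { cameraGranted in
        AVCaptureDevice.requestAccess(for: .audio) { micGranted in
            DispatchQueue.main.async {
                // true only when both camera and microphone were granted
                completion(cameraGranted && micGranted)
            }
        }
    }
}
```

Calling this from the join screen (before `meeting?.join()`) keeps the permission dialogs out of the meeting flow.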

Project Structure

iOSQuickStartDemo
├── Models
│   ├── RoomStruct.swift
│   └── MeetingData.swift
├── ViewControllers
│   ├── StartMeetingViewController.swift
│   └── MeetingViewController.swift
├── APIService
│   └── APIService.swift
├── AppDelegate.swift       // Default
├── SceneDelegate.swift     // Default
├── Main.storyboard         // Default
├── LaunchScreen.storyboard // Default
└── Info.plist              // Default
Pods
└── Podfile

Create models

Create Swift files for the MeetingData and RoomStruct models, which hold the meeting and room data in an object pattern.

import Foundation
struct MeetingData {
    let token: String
    let name: String
    let meetingId: String
    let micEnabled: Bool
    let cameraEnabled: Bool
}
MeetingData.swift
import Foundation
struct RoomsStruct: Codable {
    let createdAt, updatedAt, roomID: String?
    let links: Links?
    let id: String?
    enum CodingKeys: String, CodingKey {
        case createdAt, updatedAt
        case roomID = "roomId"
        case links, id
    }
}

// MARK: - Links
struct Links: Codable {
    let getRoom, getSession: String?
    enum CodingKeys: String, CodingKey {
        case getRoom = "get_room"
        case getSession = "get_session"
    }
}
RoomStruct.swift
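
As a quick sanity check, the model can be exercised against a hand-written JSON sample shaped like the room-creation response. The values below are made up for illustration, and the snippet assumes the `RoomsStruct` and `Links` types above are in scope:

```swift
import Foundation

// Made-up JSON matching the expected room response shape (not a real API response).
let sampleJSON = """
{
  "roomId": "abcd-efgh-ijkl",
  "createdAt": "2024-01-01T00:00:00.000Z",
  "updatedAt": "2024-01-01T00:00:00.000Z",
  "id": "507f1f77bcf86cd799439011",
  "links": {
    "get_room": "https://api.videosdk.live/v2/rooms/abcd-efgh-ijkl",
    "get_session": "https://api.videosdk.live/v2/sessions/"
  }
}
""".data(using: .utf8)!

do {
    let room = try JSONDecoder().decode(RoomsStruct.self, from: sampleJSON)
    print(room.roomID ?? "missing")          // decoded from the "roomId" key
    print(room.links?.getRoom ?? "missing")  // decoded from "get_room"
} catch {
    print("Decoding failed: \(error)")
}
```

Because every property is optional, unexpected or missing fields won't make decoding throw, which keeps the API call resilient to response changes.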

Essential Steps for Building the Video Calling Functionality

This guide is designed to walk you through the process of integrating live streaming with VideoSDK. We'll cover everything from setting up the SDK to incorporating the visual elements into your app's interface, ensuring a smooth and efficient implementation process.

Step 1: Get started with APIService

Before anything else, we have to write an API call to generate a unique meetingId. You will need an authentication token; you can generate one either using videosdk-server-api-example or from the VideoSDK Dashboard for developers.

import Foundation

let TOKEN_STRING: String = "<AUTH_TOKEN>"

class APIService {

  class func createMeeting(token: String, completion: @escaping (Result<String, Error>) -> Void) {

    let url = URL(string: "https://api.videosdk.live/v2/rooms")!

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.addValue(TOKEN_STRING, forHTTPHeaderField: "authorization")

    URLSession.shared.dataTask(
      with: request,
      completionHandler: { (data: Data?, response: URLResponse?, error: Error?) in

        DispatchQueue.main.async {

          if let data = data {
            do {
              let dataArray = try JSONDecoder().decode(RoomsStruct.self, from: data)

              completion(.success(dataArray.roomID ?? ""))
            } catch {
              print("Error while creating a meeting: \(error)")
              completion(.failure(error))
            }
          }
        }
      }
    ).resume()
  }
}
APIService.swift

Step 2: Implement Join Screen

The Join Screen will work as a medium to either schedule a new meeting or join an existing meeting.

import Foundation
import UIKit

class StartMeetingViewController: UIViewController, UITextFieldDelegate {

  private var serverToken = ""

  // MARK: outlet for create meeting button
  @IBOutlet weak var btnCreateMeeting: UIButton!

  // MARK: outlet for join meeting button
  @IBOutlet weak var btnJoinMeeting: UIButton!

  // MARK: outlet for meetingId textfield
  @IBOutlet weak var txtMeetingId: UITextField!

  // MARK: initialize the private variable with TOKEN_STRING and
  // set the meeting id in the textfield
  override func viewDidLoad() {
    super.viewDidLoad()
    txtMeetingId.delegate = self
    serverToken = TOKEN_STRING
    txtMeetingId.text = "PROVIDE-STATIC-MEETING-ID"
  }

  // MARK: method for joining the meeting through the segue named "StartMeeting",
  // after validating that serverToken is not empty
  func joinMeeting() {

    txtMeetingId.resignFirstResponder()

    if !serverToken.isEmpty {
      DispatchQueue.main.async {
        self.dismiss(animated: true) {
          self.performSegue(withIdentifier: "StartMeeting", sender: nil)
        }
      }
    } else {
      print("Please provide auth token to start the meeting.")
    }
  }

  // MARK: action for create meeting button tap event
  @IBAction func btnCreateMeetingTapped(_ sender: Any) {
    print("show loader while meeting gets connected with server")
    joinRoom()
  }

  // MARK: action for join meeting button tap event
  @IBAction func btnJoinMeetingTapped(_ sender: Any) {
    if (txtMeetingId.text ?? "").isEmpty {

      print("Please provide meeting id to start the meeting.")
      txtMeetingId.resignFirstResponder()
    } else {
      joinMeeting()
    }
  }

  // MARK: - method for creating room api call and getting meetingId for joining meeting

  func joinRoom() {

    APIService.createMeeting(token: self.serverToken) { result in
      if case .success(let meetingId) = result {
        DispatchQueue.main.async {
          self.txtMeetingId.text = meetingId
          self.joinMeeting()
        }
      }
    }
  }

  // MARK: prepare to navigate to the MeetingViewController screen
  override func prepare(for segue: UIStoryboardSegue, sender: Any?) {

    guard let navigation = segue.destination as? UINavigationController,

      let meetingViewController = navigation.topViewController as? MeetingViewController
    else {
      return
    }

    meetingViewController.meetingData = MeetingData(
      token: serverToken,
      name: txtMeetingId.text ?? "Guest",
      meetingId: txtMeetingId.text ?? "",
      micEnabled: true,
      cameraEnabled: true
    )
  }
}
StartMeetingViewController.swift

Step 3: Initialize and Join Meeting

Using the provided token and meetingId, we will configure and initialize the meeting in viewDidLoad().

Then, we'll add @IBOutlet for localParticipantVideoView and remoteParticipantVideoView, which can render local and remote participant media, respectively.

import UIKit
import VideoSDKRTC
import WebRTC
import AVFoundation

class MeetingViewController: UIViewController {

    // MARK: - Properties

    // outlet for local participant container view
    @IBOutlet weak var localParticipantViewContainer: UIView!

    // outlet for label for meeting Id
    @IBOutlet weak var lblMeetingId: UILabel!

    // outlet for local participant video view
    @IBOutlet weak var localParticipantVideoView: RTCMTLVideoView!

    // outlet for remote participant video view
    @IBOutlet weak var remoteParticipantVideoView: RTCMTLVideoView!

    // outlet for remote participant no media label
    @IBOutlet weak var lblRemoteParticipantNoMedia: UILabel!

    // outlet for remote participant container view
    @IBOutlet weak var remoteParticipantViewContainer: UIView!

    // outlet for local participant no media label
    @IBOutlet weak var lblLocalParticipantNoMedia: UILabel!

    // Meeting data - required to start
    var meetingData: MeetingData!

    // current meeting reference
    private var meeting: Meeting?

    // MARK: - video participants including self to show in UI
    private var participants: [Participant] = []

    // MARK: - Lifecycle Events

    override func viewDidLoad() {
        super.viewDidLoad()

        // configure the VideoSDK with the token
        VideoSDK.config(token: meetingData.token)

        // initialize the meeting
        initializeMeeting()

        // set the meeting id in the label text
        lblMeetingId.text = "Meeting Id: \(meetingData.meetingId)"
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        navigationController?.navigationBar.isHidden = true
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        navigationController?.navigationBar.isHidden = false
        NotificationCenter.default.removeObserver(self)
    }

    // MARK: - Meeting

    private func initializeMeeting() {

        // initialize the meeting with VideoSDK
        meeting = VideoSDK.initMeeting(
            meetingId: meetingData.meetingId,
            participantName: meetingData.name,
            micEnabled: meetingData.micEnabled,
            webcamEnabled: meetingData.cameraEnabled
        )

        // add the event listener to the meeting
        meeting?.addEventListener(self)

        // join the meeting
        meeting?.join()
    }
}
MeetingViewController.swift

Step 4: Implement Controls

After initializing the meeting in the previous step, we will now add @IBOutlet for btnLeave, btnToggleVideo and btnToggleMic which can control the media in the meeting.

Along with that we'll add one more @IBOutlet button btnToggleHLS using which we will start/stop HLS.

class MeetingViewController: UIViewController {

...

    // outlet for leave button
    @IBOutlet weak var btnLeave: UIButton!

    // outlet for toggle video button
    @IBOutlet weak var btnToggleVideo: UIButton!

    // outlet for toggle audio button
    @IBOutlet weak var btnToggleMic: UIButton!
    
    // outlet for toggle HLS button
    @IBOutlet weak var btnToggleHLS: UIButton!

    // bool for mic
    var micEnabled = true
    // bool for video
    var videoEnabled = true
    // bool for hls state
    var hlsEnabled = false


    // outlet for leave button click event
    @IBAction func btnLeaveTapped(_ sender: Any) {
            DispatchQueue.main.async {
                self.meeting?.leave()
                self.dismiss(animated: true)
            }
        }

    // outlet for toggle mic button click event
    @IBAction func btnToggleMicTapped(_ sender: Any) {
        if micEnabled {
            micEnabled = !micEnabled // false
            self.meeting?.muteMic()
        } else {
            micEnabled = !micEnabled // true
            self.meeting?.unmuteMic()
        }
    }

    // outlet for toggle video button click event
    @IBAction func btnToggleVideoTapped(_ sender: Any) {
        if videoEnabled {
            videoEnabled = !videoEnabled // false
            self.meeting?.disableWebcam()
        } else {
            videoEnabled = !videoEnabled // true
            self.meeting?.enableWebcam()
        }
    }
    
    // outlet for toggle HLS button click event
    @IBAction func btnToggleHLSTapped(_ sender: Any) {
        if !hlsEnabled {
            // sample config
            let config = HLSConfig(
                layout: ConfigLayout(type: .GRID,
                                     priority: .SPEAKER,
                                     gridSize: 4),
                theme: .DARK,
                mode: .video_and_audio,
                quality: .high,
                orientation: .portrait)
            // start HLS; you can use your own custom config as per your needs
            self.meeting?.startHLS(config: config)
            self.hlsEnabled = true
        } else {
            self.meeting?.stopHLS()
            self.hlsEnabled = false
        }
    }

...

}
MeetingViewController.swift

Step 5: Implementing MeetingEventListener

In this step, we'll create an extension of MeetingViewController that conforms to MeetingEventListener, implementing methods such as onMeetingJoined, onMeetingLeft, onParticipantJoined, onParticipantLeft, onParticipantChanged, and onHlsStateChanged.

When HLS is started, it triggers the onHlsStateChanged event of the MeetingEventListener.


extension MeetingViewController: MeetingEventListener {

    /// Meeting started
    func onMeetingJoined() {

        // handle local participant on start
        guard let localParticipant = self.meeting?.localParticipant else { return }

        // add to list
        participants.append(localParticipant)

        // add event listener
        localParticipant.addEventListener(self)

        localParticipant.setQuality(.high)

        if localParticipant.isLocal {
            self.localParticipantViewContainer.isHidden = false
        } else {
            self.remoteParticipantViewContainer.isHidden = false
        }
    }

    /// Meeting ended
    func onMeetingLeft() {
        // remove listeners
        meeting?.localParticipant.removeEventListener(self)
        meeting?.removeEventListener(self)
    }

    /// A new participant joined
    func onParticipantJoined(_ participant: Participant) {
        participants.append(participant)

        // add listener
        participant.addEventListener(self)

        participant.setQuality(.high)

        if participant.isLocal {
            self.localParticipantViewContainer.isHidden = false
        } else {
            self.remoteParticipantViewContainer.isHidden = false
        }
    }

    /// A participant left the meeting
    /// - Parameter participant: participant object
    func onParticipantLeft(_ participant: Participant) {
        participant.removeEventListener(self)
        guard let index = self.participants.firstIndex(where: { $0.id == participant.id }) else {
            return
        }
        // remove participant from list
        participants.remove(at: index)
        // hide from UI
        UIView.animate(withDuration: 0.5) {
            if !participant.isLocal {
                self.remoteParticipantViewContainer.isHidden = true
            }
        }
    }

    /// HLS state changed event
    /// - Parameters:
    ///   - state: current HLSState
    ///   - hlsUrl: HLSUrl, if available
    func onHlsStateChanged(state: HLSState, hlsUrl: HLSUrl?) {
        switch state {
        case .HLS_STARTING:
            print("HLS Starting")
        case .HLS_STARTED:
            print("HLS Started")
        case .HLS_PLAYABLE:
            print("HLS Playable")
        case .HLS_STOPPING:
            print("HLS Stopping")
        case .HLS_STOPPED:
            print("HLS Stopped")
        }
    }
}
MeetingViewController.swift
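
Once onHlsStateChanged reports HLS_PLAYABLE, viewers can watch the stream with a standard AVPlayer, since iOS plays HLS (.m3u8) natively. Below is a minimal, illustrative viewer-side sketch; the helper is not part of the VideoSDK API, and you would pass it the downstream URL string exposed by the HLSUrl object:

```swift
import AVKit
import UIKit

/// Presents a player for the given HLS downstream URL.
/// Illustrative helper, not part of VideoSDK.
func playHLSStream(urlString: String, from presenter: UIViewController) {
    guard let url = URL(string: urlString) else { return }

    let player = AVPlayer(url: url)  // AVPlayer handles .m3u8 natively
    let playerViewController = AVPlayerViewController()
    playerViewController.player = player

    presenter.present(playerViewController, animated: true) {
        player.play()
    }
}
```

This is what makes HLS attractive on iOS: the broadcast side needs the SDK, but playback works with nothing beyond AVKit.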

Step 6: Implementing ParticipantEventListener

In this step, we'll add an extension of MeetingViewController that conforms to ParticipantEventListener, implementing the onStreamEnabled and onStreamDisabled methods, which fire when a participant's audio or video MediaStream is enabled or disabled.

The updateUI function is used throughout to control the user interface (showing or hiding the camera view and mic indicator) according to the MediaStream state.

extension MeetingViewController: ParticipantEventListener {

    /// Participant has enabled mic, video or screenshare
    /// - Parameters:
    ///   - stream: enabled stream object
    ///   - participant: participant object
    func onStreamEnabled(_ stream: MediaStream, forParticipant participant: Participant) {
        updateUI(participant: participant, forStream: stream, enabled: true)
    }

    /// Participant has disabled mic, video or screenshare
    /// - Parameters:
    ///   - stream: disabled stream object
    ///   - participant: participant object
    func onStreamDisabled(_ stream: MediaStream, forParticipant participant: Participant) {
        updateUI(participant: participant, forStream: stream, enabled: false)
    }
}

private extension MeetingViewController {

    func updateUI(participant: Participant, forStream stream: MediaStream, enabled: Bool) {
        switch stream.kind {
        case .state(value: .video):
            if let videotrack = stream.track as? RTCVideoTrack {
                if enabled {
                    DispatchQueue.main.async {
                        UIView.animate(withDuration: 0.5) {
                            if participant.isLocal {
                                self.localParticipantViewContainer.isHidden = false
                                self.localParticipantVideoView.isHidden = false
                                self.localParticipantVideoView.videoContentMode = .scaleAspectFill
                                self.localParticipantViewContainer.bringSubviewToFront(self.localParticipantVideoView)
                                videotrack.add(self.localParticipantVideoView)
                                self.lblLocalParticipantNoMedia.isHidden = true
                            } else {
                                self.remoteParticipantViewContainer.isHidden = false
                                self.remoteParticipantVideoView.isHidden = false
                                self.remoteParticipantVideoView.videoContentMode = .scaleAspectFill
                                self.remoteParticipantViewContainer.bringSubviewToFront(self.remoteParticipantVideoView)
                                videotrack.add(self.remoteParticipantVideoView)
                                self.lblRemoteParticipantNoMedia.isHidden = true
                            }
                        }
                    }
                } else {
                    UIView.animate(withDuration: 0.5) {
                        if participant.isLocal {
                            self.localParticipantViewContainer.isHidden = false
                            self.localParticipantVideoView.isHidden = true
                            self.lblLocalParticipantNoMedia.isHidden = false
                            videotrack.remove(self.localParticipantVideoView)
                        } else {
                            self.remoteParticipantViewContainer.isHidden = false
                            self.remoteParticipantVideoView.isHidden = true
                            self.lblRemoteParticipantNoMedia.isHidden = false
                            videotrack.remove(self.remoteParticipantVideoView)
                        }
                    }
                }
            }

        case .state(value: .audio):
            if participant.isLocal {
                localParticipantViewContainer.layer.borderWidth = 4.0
                localParticipantViewContainer.layer.borderColor = enabled ? UIColor.clear.cgColor : UIColor.red.cgColor
            } else {
                remoteParticipantViewContainer.layer.borderWidth = 4.0
                remoteParticipantViewContainer.layer.borderColor = enabled ? UIColor.clear.cgColor : UIColor.red.cgColor
            }

        default:
            break
        }
    }
}
MeetingViewController.swift

Known Issue

If the video renders outside its container view, add the following lines to the viewDidLoad method in MeetingViewController.swift:

override func viewDidLoad() {

    localParticipantVideoView.frame = CGRect(x: 10, y: 0,
        width: localParticipantViewContainer.frame.width,
        height: localParticipantViewContainer.frame.height)

    localParticipantVideoView.bounds = CGRect(x: 10, y: 0,
        width: localParticipantViewContainer.frame.width,
        height: localParticipantViewContainer.frame.height)

    localParticipantVideoView.clipsToBounds = true

    remoteParticipantVideoView.frame = CGRect(x: 10, y: 0,
        width: remoteParticipantViewContainer.frame.width,
        height: remoteParticipantViewContainer.frame.height)

    remoteParticipantVideoView.bounds = CGRect(x: 10, y: 0,
        width: remoteParticipantViewContainer.frame.width,
        height: remoteParticipantViewContainer.frame.height)

    remoteParticipantVideoView.clipsToBounds = true
}
MeetingViewController.swift
TIP:
Stuck anywhere? Check out this example code on GitHub
GitHub - videosdk-live/videosdk-rtc-ios-sdk-example: WebRTC based video conferencing SDK for iOS (Swift / Objective C)

Now you have successfully created an iOS live streaming application with VideoSDK.

Conclusion

VideoSDK offers a seamless solution for building iOS live streaming applications. With its user-friendly interface and robust features, developers can efficiently deliver high-quality streaming experiences to users. Partnering with an iOS development agency like Simpalm can further streamline the development process and help ensure top-notch results.

Unlock the full potential of VideoSDK today and craft seamless video experiences! Sign up now to receive 10,000 free minutes and take your video app to new heights.