A week ago, new iPhones were released along with iOS 11 – a notable event. Among everything else, this release brought one more important thing to developers: the Safari browser received long-awaited support for WebRTC.

Think about it for a minute: millions of iPhones and iPads all over the world suddenly learned to play real-time audio and video in a browser. iOS and Mac users can now enjoy fully functional in-browser video chats, live broadcasts with sub-second real-time latency, calls, conferences, and more. The road was long, and now we are here.

Before

Previously, we wrote about a way to play video with minimum latency in iOS Safari, and this method is still relevant for iOS 9 and iOS 10, which lack support for WebRTC. We suggested an approach, code-named “WSPlayer”, that delivers a live video stream via the WebSocket protocol, decodes the stream in JavaScript, and renders the video onto an HTML5 Canvas element using WebGL. The received audio stream is played using the browser’s Web Audio API. Here is how it looked:

Playing audio through the browser’s Web Audio API
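The pipeline above can be sketched roughly as follows. This is an illustrative outline only, not the actual WSPlayer code: the wire format, the function name, and the URL are all our own assumptions.

```javascript
// Illustrative sketch of the WSPlayer pipeline. Assumed wire format:
// first byte is a type tag (0 = video), the rest is the payload.
function splitMessage(bytes) {
  var type = bytes[0] === 0 ? "video" : "audio";
  return { type: type, payload: bytes.slice(1) };
}

if (typeof document !== "undefined" && typeof WebSocket !== "undefined") {
  var ws = new WebSocket("wss://example.com/wsplayer"); // placeholder URL
  ws.binaryType = "arraybuffer";
  ws.onmessage = function (e) {
    var msg = splitMessage(new Uint8Array(e.data));
    if (msg.type === "video") {
      // decode the frame in JavaScript and render onto <canvas> via WebGL
    } else {
      // feed the decoded samples into the Web Audio API
    }
  };
}
```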

This approach allowed, and still allows, playing a stream on a page in iOS Safari with an overall latency of about 3 seconds, but it has its disadvantages:

1. Performance.

The video stream is decoded in JavaScript. This puts a high CPU load on the mobile device, prevents playback of higher resolutions, and drains the battery.

2. TCP.

Video and audio are transmitted over WebSocket / TCP. Because of this, latency cannot be pinned to a specific target and can still increase whenever network fluctuations occur.

Until iOS 11 was released, WSPlayer played video with relatively low latency (3 seconds) compared to HLS (20 seconds). Now things have improved, and the JavaScript player gives way to the native WebRTC technology, which does all the work by means of the browser itself, without JavaScript decoding or Canvas rendering.

 

Now

With the arrival of WebRTC, playing low-latency video in iOS Safari 11 became identical to other browsers that support WebRTC, namely Chrome, Firefox, and Edge.

Scheme of low-latency video playback in iOS Safari 11

 

Microphone and camera

So far we have only talked about playing real-time video. But you cannot run a video chat without a camera and a microphone, and this was a real headache for developers planning to support iOS Safari in their video chats or other live video projects. Thousands of man-hours were wasted searching for a solution in iOS Safari 9 and 10 that simply did not exist: Safari could not capture the camera or the microphone, and this “feature” was not fixed until iOS 11.

Run iOS 11 Safari and request access to the camera and the microphone. This is what we’ve been waiting for; the wait is over:

Enable or disable use of camera and microphone

The browser asks for access to the camera and the microphone, and can now both stream live video and play audio and video.
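In code, the permission request shown above is the standard getUserMedia call. A minimal sketch follows; the element id localVideo is our own example, not anything mandated by Safari.

```javascript
// Build a standard getUserMedia constraints object.
function buildConstraints(withVideo, withAudio) {
  return { video: withVideo, audio: withAudio };
}

if (typeof navigator !== "undefined" && navigator.mediaDevices) {
  navigator.mediaDevices.getUserMedia(buildConstraints(true, true))
    .then(function (stream) {
      // Safari shows its permission dialog before this promise resolves
      var video = document.getElementById("localVideo");
      video.srcObject = stream;    // show the captured camera
      return video.play();         // may still require a user gesture in Safari
    })
    .catch(function (err) {
      console.error("getUserMedia failed:", err);
    });
}
```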

You can also open Safari settings and turn the microphone on or off there:

Enabling / disabling the microphone in Safari settings

 

Displaying the camera and playing a video stream

Of course, there are specifics. The most notable one is that the video element must be tapped (clicked) before the video starts to play.

For developers, this is a limitation and a showstopper. Indeed, if a customer insists “I want this video to play automatically on load”, this trick will not work in iOS Safari, and the developer will have to explain that it is the fault of Safari and Apple’s strict security policy.

For users, however, this may be a good thing, because websites cannot play a video stream without the explicit consent of the user, who confirms the intent by clicking the element.

 

What about Mac OS?

There is good news for MacBook and Mac OS owners, too. After the update, Safari 11 on Mac also supports WebRTC. Previously, Safari on Mac used the old, reliable Flash Player, which served as a cheap replacement for WebRTC: it compressed and played audio and video via RTMP and RTMFP. Now that WebRTC is available, there is no need to use Flash Player for video chats anymore. So, we use WebRTC in Safari 11+ and keep Flash Player or WebRTC plugins as a fallback mechanism in Safari 10.

 

Summary

As you can see, Safari 11 got support for WebRTC, while Safari 9 and 10 remain on fallbacks: Flash Player and WebRTC plugins on Mac OS, and WSPlayer on iOS.

Mac, Safari 10: Flash Player, WebRTC plugins

iOS 9, 10, Safari: WSPlayer

Mac, Safari 11: WebRTC

iOS 11, Safari: WebRTC
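The compatibility summary above can be expressed as a simple provider-selection helper. This is our own sketch; the platform and version detection is assumed to exist elsewhere.

```javascript
// Pick a media provider per the compatibility summary above.
// 'platform' is "ios" or "mac"; safariMajor is the Safari major version.
function pickProvider(platform, safariMajor) {
  if (safariMajor >= 11) return "WebRTC";
  // Safari 10 and older: WSPlayer on iOS; Flash Player or a WebRTC
  // plugin on Mac (we return Flash Player here for simplicity)
  return platform === "ios" ? "WSPlayer" : "Flash Player";
}
```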

 

Testing browser-to-browser broadcasting

Now let’s test the main use cases, starting with the player. First of all, we need to install the latest iOS 11.0.2 with the new Safari.

Install the update iOS 11.0.2 with the new Safari

For the first test, we want Chrome on Windows to broadcast a video stream to the server, and a viewer on iOS Safari to play this stream via WebRTC.

Open the Two Way Streaming example in the Chrome browser and send a WebRTC video stream named 1ad5 to the server. Chrome captures the video from the camera, encodes it (to H.264 in our case), and sends the live video stream to the server for further distribution. Broadcasting the video stream looks as follows:

Testing stream from browser to browser

To play, specify the name of the video stream, and the player in iOS Safari starts playing the stream Chrome sent to the server. Playing the stream on an iPhone in Safari looks like this:

Playback of a stream on the iPhone in Safari

Latency is barely noticeable (less than a second). The video stream plays smoothly and without artifacts. The playback quality is good, as you can see on the screenshots.

And here is how playback of the same Two Way Streaming example looks in the Play block. Thus, you can broadcast one stream and play another on the same browser page. If users know each other’s stream names, that makes a simple video chat.

Simple video chat

 

Testing web camera and microphone broadcasting using iOS Safari

As mentioned above, the key feature of WebRTC is its ability to capture the camera and the microphone in a browser and send the stream to the network with targeted low latency. Let’s see if this works in iOS Safari 11.

Open in Safari the same streamer demo example we opened in Chrome and request access to the microphone and camera. Safari shows a dialog asking you to allow or disallow use of the camera and the microphone.

Enable or disable use of camera and microphone

After access to the camera and the microphone is granted, a red camera icon appears in the top left corner of the browser: Safari indicates that the camera is active and in use. The video stream is being sent to the server.

Camera activity indicator in Safari

We fetch this stream in another browser, for example Chrome. On playback, we see the stream sent from Safari, shot in the infamous vertical orientation.

The stream in the Chrome browser

After the iPhone is turned horizontally, the picture becomes normal:

Changing the iPhone’s orientation

Capturing and broadcasting video is always more interesting than mere playback, because the most important things happen there, including the RTCP feedback that sets the targets for latency and video quality.

At the time this article was written, we could not find any suitable tools for monitoring WebRTC in iOS Safari, similar to the webrtc-internals tool in Chrome. Instead, let’s see how the server sees the video stream captured from Safari. To do this, we enable monitoring and look at the main graphs describing the traffic coming from Safari.

The first set of graphs displays metrics such as NACK and PLI, which indicate UDP packet loss. For a network of normal quality, the NACK count shown on the graphs is considered low, around 15, so we can conclude the patient is doing well.

The FPS of the video stream stays around 29-31 and never drops to lower values (10-15). This means the iPhone’s hardware acceleration is sufficient to encode the video to H.264, and the processor is powerful enough to stream this video to the network. For this test we used an iPhone 6, 16 GB.
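In browsers that do expose statistics (such as Chrome), similar NACK and PLI counters can be read via RTCPeerConnection.getStats(). The summarizer below is our own sketch over the standard outbound-rtp stats fields, not a tool the article used.

```javascript
// Sum NACK and PLI counters from an array of WebRTC stats reports
// (outbound-rtp video entries, per the standard stats dictionaries).
function summarizeRtpStats(reports) {
  var nack = 0, pli = 0;
  reports.forEach(function (r) {
    if (r.type === "outbound-rtp" && r.kind === "video") {
      nack += r.nackCount || 0;
      pli += r.pliCount || 0;
    }
  });
  return { nack: nack, pli: pli };
}

// In the browser, with 'pc' an existing RTCPeerConnection:
// pc.getStats().then(function (stats) {
//   var reports = [];
//   stats.forEach(function (r) { reports.push(r); });
//   console.log(summarizeRtpStats(reports));
// });
```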

Graphs of video resolution and bitrate changes

The following graphs show how the resolution and bitrate of the video change over time. The video bitrate varies between 1.2 and 1.6 Mbps, while the resolution stays the same: 640×480. This means the bandwidth is sufficient, and Safari compresses the video at the maximum bitrate. Optionally, you can constrain the bitrate to specific limits.

The bitrate stays within the set limits
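One common way to constrain a publisher’s video bitrate on the client is SDP munging: inserting a b=AS bandwidth line into the video section before setLocalDescription. This is a general WebRTC technique we sketch here as an assumption; it is not necessarily how Safari or the server enforces limits internally.

```javascript
// Insert a bandwidth line (in kbps) right after the video m= line
// of an SDP offer/answer string.
function capVideoBitrate(sdp, kbps) {
  return sdp.replace(/(m=video .*\r\n)/, "$1b=AS:" + kbps + "\r\n");
}
```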

Next we check the bitrate of the audio part of the stream and the audio loss statistics. There are no lost audio packets; the counter is strictly zero. The audio bitrate is 30-34 kbps: this is the Opus codec, which Safari uses to compress the audio stream captured from the microphone.

 

Finally, the last graphs show timecodes. Timecodes allow evaluating whether video and audio are synchronized. A lack of synchronization leads to a noticeable discrepancy: the voice lags behind the lips or runs ahead of the video. In our case the stream from Safari is perfectly synchronized and progresses monotonically without deviations.

Timecode graphs

These graphs show behavior typical for WebRTC and very similar to that of Google Chrome: NACK and PLI feedback arrives, the FPS changes only slightly, and the bitrate varies. In other words, we’ve got the WebRTC we’ve been waiting for.

Note the changes in width and height. For example, when the orientation changes to horizontal, the stream resolution flips from 640×480 to 480×640, as shown below.

Changes in height and width

The orange line here is the width, and the cyan line is the height of the image. At 05:21:17 we turned the streaming iPhone horizontally, and the resolution of the stream changed accordingly: width 480, height 640.
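The swap itself is trivial to express in code; reading the live values in the browser would use MediaStreamTrack.getSettings(). Here localStream is an assumed variable holding the published stream.

```javascript
// On rotation, width and height simply trade places.
function rotatedResolution(width, height) {
  return { width: height, height: width };
}

if (typeof window !== "undefined") {
  window.addEventListener("orientationchange", function () {
    var track = localStream.getVideoTracks()[0]; // localStream: assumed
    var s = track.getSettings();
    console.log("now streaming at " + s.width + "x" + s.height);
  });
}
```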

 

Testing video playback from an IP camera via WebRTC in iOS Safari

An IP camera is essentially a small Linux server that sends streams via the RTSP protocol. In this test we fetch video from an IP camera that supports H.264 and play it in iOS Safari via WebRTC. To do this, we enter the RTSP address of the stream instead of its name in the player we used before.
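With the Web SDK call used elsewhere in this article, that means passing the RTSP URL in place of the stream name (the server restreams RTSP as WebRTC). In this sketch, session is an assumed, already-established Web SDK session, and the camera address is an example.

```javascript
// Detect whether a stream name is actually an RTSP address.
function isRtspUrl(name) {
  return name.indexOf("rtsp://") === 0;
}

if (typeof session !== "undefined") {
  // 'session' is an assumed existing session; the address is an example
  session.createStream({
    name: "rtsp://192.168.1.100:554/live.sdp",
    display: document.getElementById("myVideo")
  }).play();
}
```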

Playing a stream from the IP-camera in Safari via WebRTC looks as follows:

Playback of a stream from an IP camera via WebRTC in Safari

In this case the video plays smoothly, without problems or glitches. However, the source of the stream has a significant effect on playback: depending on how the video travels from the IP camera to the server, things can look different.

As a result, we successfully tested three cases:

  • Broadcasting from the Chrome browser to Safari
  • Capturing of the camera and the microphone and broadcasting from Safari to Chrome
  • Playing video from an IP-camera in iOS Safari

 

A few words about the code

To broadcast video streams we use the universal API (Web SDK), which for broadcasting looks like this:

session.createStream({name:'stream22',display:document.getElementById('myVideo')}).publish();

Here, we set a unique stream name, stream22, and use the div element

div id='myVideo'

to display the captured camera on the web page.

Playing the same stream in a browser works as follows:

session.createStream({name:'stream22',display:document.getElementById('myVideo')}).play();

That is, we set the name of the stream, specify the div element to play the video in, and then call the play() method.
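Putting the pieces together, a fuller playback flow looks roughly like this. The createSession pattern follows the vendor’s Web SDK; the server URL and element id are our own example assumptions.

```javascript
// Build the options object passed to createStream.
function buildStreamOptions(name, display) {
  return { name: name, display: display };
}

if (typeof Flashphoner !== "undefined") {
  Flashphoner.init({});
  Flashphoner.createSession({ urlServer: "wss://wcs.example.com:8443" })
    .on(Flashphoner.constants.SESSION_STATUS.ESTABLISHED, function (session) {
      session.createStream(
        buildStreamOptions("stream22", document.getElementById("myVideo"))
      ).play();
    });
}
```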

iOS Safari is currently the only browser that requires clicking the element before the video starts playing.

So we added a small piece of code specifically for iOS Safari that “activates” the video element before playing the stream:

if (Flashphoner.getMediaProviders()[0] === "WSPlayer") {
    Flashphoner.playFirstSound();
} else if ((Browser.isSafariWebRTC() && Flashphoner.getMediaProviders()[0] === "WebRTC") || Flashphoner.getMediaProviders()[0] === "MSE") {
    Flashphoner.playFirstVideo(remoteVideo);
}

This code runs in the standard player when the play button is clicked, so we fulfill Apple’s security requirements and correctly start video playback.


 

In conclusion

iOS 11 Safari has finally received support for WebRTC, and this support is not going to be removed in future releases. So we can use it to build real-time streaming video and browser calls. Install further iOS 11.x updates and watch for new fixes (and bugs). Happy streaming!

 

Links

WCS – the server we used to test broadcasting on iOS 11 Safari

Two Way Streaming – the example of a streamer

Source Two Way Streaming – streamer sources

Player – the example of a player

Source Player – player sources

WSPlayer – playing low latency video streams in iOS 9, 10 Safari