Captions give viewers information about what is happening in a video when they cannot hear or understand the language. Many streaming services, such as Netflix, YouTube, and Hulu, support captions to improve accessibility.
There are two main ways to support captions: Open Captions and Closed Captions.
Open captions, also known as burned-in, baked, burnt, or hard-coded captions, are rendered as part of the video frame, which means they cannot be turned on or off. …
For media streaming protocols such as HLS or MPEG-DASH, there is a mechanism called Adaptive Bitrate (ABR), which makes it possible to switch between multiple resolutions based on the available network bandwidth. It helps reduce network usage while keeping a video playing at the maximum resolution the player is currently capable of playing. However, which resolution should a player play first? If you have enough bandwidth to play the maximum resolution, you would not want to start at the minimum resolution and wait for the resolution to ramp up.
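The core idea behind ABR can be sketched independently of any particular player: given the bitrates of the available tracks and an estimate of the network bandwidth, pick the highest track that fits within a safety margin. The function below is an illustrative sketch, not ExoPlayer code; the names and the safety fraction are my own.

```kotlin
// Illustrative ABR sketch (not ExoPlayer code): choose the highest
// bitrate that fits within a fraction of the estimated bandwidth,
// leaving headroom so small bandwidth dips do not stall playback.
fun selectBitrate(
    availableBitrates: List<Int>,
    estimatedBandwidthBps: Long,
    safetyFraction: Double = 0.7,
): Int {
    val budget = (estimatedBandwidthBps * safetyFraction).toLong()
    return availableBitrates.filter { it <= budget }.maxOrNull()
        ?: availableBitrates.minOrNull() // nothing fits: fall back to the lowest track
        ?: error("no tracks available")
}
```

For example, with tracks at 300 kbps, 1 Mbps, and 3 Mbps and an estimated bandwidth of 2 Mbps, the budget is 1.4 Mbps, so the 1 Mbps track is chosen.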
Let’s look at how ExoPlayer decides the first resolution. Since ExoPlayer is…
With ExoPlayer, a seek operation can be performed with player.seekTo(positionMs). In this post, I will look into how seeking is achieved inside ExoPlayer. To be specific, I will address these two.
*) This post is based on version 2.11.0 of ExoPlayer.
*) I am not familiar with other players, so they may implement seeking differently.
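As a minimal sketch of the API surface (ExoPlayer 2.11-era API; a `context` is assumed to be in scope):

```kotlin
val player = SimpleExoPlayer.Builder(context).build()

// Seek to the 10-second mark in the current window.
player.seekTo(10_000L)

// Seek to the 5-second mark of the third playlist item (window index 2).
player.seekTo(2, 5_000L)
```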
To understand seek accuracy, you have to know about PTS. PTS stands for presentation timestamp, and it is the time at which a specific frame should be…
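As a rough illustration (not ExoPlayer code): for constant-frame-rate video, the PTS of a frame can be derived directly from its index.

```kotlin
// For constant-frame-rate video, the n-th frame should be presented
// at n / frameRate seconds; here expressed in microseconds.
fun ptsUs(frameIndex: Long, frameRate: Int): Long =
    frameIndex * 1_000_000L / frameRate
```

At 30 fps, for example, frame 90 has a PTS of 3,000,000 µs, i.e. the 3-second mark.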
Sometimes you may encounter situations where you want to limit the video resolution, for example, when a user explicitly sets a maximum resolution in the settings, or when the playback surface is too small to benefit from higher resolutions.
ExoPlayer provides a way of constraining track selection using
DefaultTrackSelector#Parameters. You can use
setMaxVideoSize to limit the maximum resolution (and setMaxVideoBitrate to limit the maximum bitrate).
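A minimal sketch of wiring this up (ExoPlayer 2.11-era API; `context` is assumed to be in scope):

```kotlin
val trackSelector = DefaultTrackSelector(context)
trackSelector.parameters = trackSelector.buildUponParameters()
    .setMaxVideoSize(1280, 720)    // never select tracks above 720p
    .setMaxVideoBitrate(2_000_000) // optionally cap the bitrate as well
    .build()

val player = SimpleExoPlayer.Builder(context)
    .setTrackSelector(trackSelector)
    .build()
```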
When playback fails with an ExoPlaybackException, it is relatively easy to tell what is wrong and how to fix it. But how do you solve non-crash problems, such as playback stalling for a very short time under certain conditions, or the output surface flickering on a specific device?
This is the story of how I debugged a non-crash playback issue, and I hope it helps someone solve these kinds of problems.
Audio and video stop for a very short amount of time (around 500 ms) when switching from a lower resolution to a higher one.
One day this ticket was assigned to me…
Decoders and encoders are essential components for handling media sources, and Android devices ship with multiple decoders/encoders. While Android supports various codecs and container formats, the Android API provides ways to decode/encode that data.
MediaCodec can be used to access a specific encoder/decoder component on an Android device. For example, you can feed it raw video data to encode into H.264, or pass AAC-encoded audio samples to decode and play the music.
However, how can you know what kinds of decoders/encoders are available on your device, or what a retrieved decoder/encoder is…
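One answer is MediaCodecList, which enumerates the codec components present on the device. A minimal sketch:

```kotlin
// List every codec component on the device along with the MIME types
// it supports (MediaCodecList with REGULAR_CODECS requires API 21+).
val codecList = MediaCodecList(MediaCodecList.REGULAR_CODECS)
for (info in codecList.codecInfos) {
    val role = if (info.isEncoder) "encoder" else "decoder"
    Log.d("Codecs", "$role ${info.name}: ${info.supportedTypes.joinToString()}")
}
```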
In the previous article, I wrote about what Multi-key is and how to create sample multi-key content. In this article, I will explain how to play it. If you have not read the previous article, I would recommend reading it first.
MediaDrmCallback.java is responsible for requesting keys, and you can simply return a key response immediately without making a request to a license server. (Of course, preparing your own license server works too.)
In this article, I will not build a license server, for simplicity.
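ExoPlayer ships a LocalMediaDrmCallback for exactly this "no license server" case (typically used with ClearKey), but the idea can be sketched by implementing MediaDrmCallback directly. The class name below is my own, the signatures follow the 2.11-era API, and keyResponse is a prepared response you supply yourself:

```kotlin
// A MediaDrmCallback that returns a locally prepared key response
// instead of contacting a license server.
class NoServerMediaDrmCallback(private val keyResponse: ByteArray) : MediaDrmCallback {

    override fun executeProvisionRequest(
        uuid: UUID,
        request: ExoMediaDrm.ProvisionRequest,
    ): ByteArray {
        throw UnsupportedOperationException("Provisioning is not handled in this sketch")
    }

    override fun executeKeyRequest(
        uuid: UUID,
        request: ExoMediaDrm.KeyRequest,
    ): ByteArray {
        // Return the prepared response immediately, skipping the network.
        return keyResponse
    }
}
```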
If you have multiple keys (like video/audio or HD/FHD), there could be two…
Digital rights management, commonly abbreviated as DRM, is a way of controlling what users can do with digital content.
To protect the content, the decryption key is stored on a license server, and the player obtains the license key from that server to decrypt the content. As you can see in the image below, a CDM (Content Decryption Module) is required to play DRM content.
Kotlin has a JvmStatic annotation, and in this article, I will go through what the JvmStatic annotation does and why you might need it.
If you add @JvmStatic to functions in a companion object scope, like this below, you can access them as Greeter.hello() from Java code.
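The code sample itself is not included in this excerpt; a minimal version of the example presumably looks like this:

```kotlin
class Greeter {
    companion object {
        // With @JvmStatic, Java callers can write Greeter.hello()
        // instead of Greeter.Companion.hello().
        @JvmStatic
        fun hello() {
            println("hello")
        }
    }
}
```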
To see what the JvmStatic annotation does, let's compare the Java bytecode generated without JvmStatic against the bytecode generated with it.
If you add a JvmStatic annotation, only the following bytecode is added and everything else stays the same.
// access flags 0x19
public final static hello()V
GETSTATIC com/takusemba/jvmstatic/Greeter.Companion : Lcom/takusemba/jvmstatic/Greeter$Companion;
INVOKEVIRTUAL com/takusemba/jvmstatic/Greeter$Companion.hello …
LiveData helps you observe data. Navigation helps you manage transitions. However, when you use them together, there is something you have to pay extra attention to.
Let’s consider a situation where you are developing a single-activity app with two fragments in it (MainActivity, Fragment1, Fragment2).
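As a sketch of that setup (all names here are hypothetical; the Navigation component and the lifecycle-livedata-ktx extensions are assumed), Fragment1 observes a LiveData and navigates to Fragment2:

```kotlin
class Fragment1 : Fragment(R.layout.fragment1) {

    private val viewModel: MainViewModel by activityViewModels()

    override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
        // The choice of lifecycle owner matters here, because Navigation
        // destroys a fragment's view (but not the fragment itself) when
        // the fragment is put on the back stack.
        viewModel.data.observe(viewLifecycleOwner) { value ->
            // Update the UI with the new value.
        }
        view.findViewById<Button>(R.id.next).setOnClickListener {
            findNavController().navigate(R.id.action_fragment1_to_fragment2)
        }
    }
}
```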