java - How does OpenGL render YUV video on Android?
I am working on WebRTC stuff on Android, trying to figure out how VideoRendererGui.java works. Unfortunately, I have trouble understanding how the following OpenGL code works:
private final String VERTEX_SHADER_STRING =
    "varying vec2 interp_tc;\n" +
    "attribute vec4 in_pos;\n" +
    "attribute vec2 in_tc;\n" +
    "\n" +
    "void main() {\n" +
    "  gl_Position = in_pos;\n" +
    "  interp_tc = in_tc;\n" +
    "}\n";

private final String YUV_FRAGMENT_SHADER_STRING =
    "precision mediump float;\n" +
    "varying vec2 interp_tc;\n" +
    "\n" +
    "uniform sampler2D y_tex;\n" +
    "uniform sampler2D u_tex;\n" +
    "uniform sampler2D v_tex;\n" +
    "\n" +
    "void main() {\n" +
    // CSC according to http://www.fourcc.org/fccyvrgb.php
    "  float y = texture2D(y_tex, interp_tc).r - 15.93;\n" +
    "  float u = texture2D(u_tex, interp_tc).r - 0.5;\n" +
    "  float v = texture2D(v_tex, interp_tc).r - 0.5;\n" +
    "  gl_FragColor = vec4(y + 1.403 * v, " +
    "                      y - 0.344 * u - 0.714 * v, " +
    "                      y + 1.77 * u, 1);\n" +
    "}\n";
I am wondering whether the code above converts YUV video to RGB and, if it does, whether it works for all video resolutions. Here is a link to VideoRendererGui.java.
It's using a fragment shader to perform the YUV conversion. The Y, U, and V values are passed to the shader in separate textures, then converted to RGB values for the fragment color. You can see the underlying math on Wikipedia.
The shader is sampling textures rather than performing a 1:1 pixel conversion, so differences in resolution between input and output are handled automatically. The VideoRendererGui code doesn't seem to have any fixed expectations about frame size, so I'd expect it to work at arbitrary resolutions.