MediaMetadataRetriever.getFrameAtTime() Returns Only First Frame
Solution 1:
MediaMetadataRetriever's getFrameAtTime() method takes its time argument in microseconds (1/1,000,000th of a second), not milliseconds, so passing millisecond values always rounds down to the first frame.
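A minimal sketch of the fix, assuming retriever is an already-initialized MediaMetadataRetriever and positionMs is the desired position in milliseconds (the names and values are illustrative, not from the original answer):
// Hypothetical example: convert milliseconds to microseconds before seeking.
long positionMs = 2500;                 // desired position: 2.5 seconds
long positionUs = positionMs * 1000;    // getFrameAtTime() expects microseconds
Bitmap frame = retriever.getFrameAtTime(positionUs,
        MediaMetadataRetriever.OPTION_CLOSEST_SYNC);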
Solution 2:
Just convert your milliseconds to microseconds, because getFrameAtTime() expects microseconds: 1 millisecond is 1000 microseconds.
for (int i = 1000000; i < millis * 1000; i += 1000000) {
    Bitmap bitmap = retriever.getFrameAtTime(i, MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
    rev.add(bitmap);
}
That solves the problem. Below is the code I wrote for my own needs.
public class MainActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        File videoFile = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + "/screenshots/", "myvideo.mp4");
        Uri videoFileUri = Uri.parse(videoFile.toString());

        MediaMetadataRetriever retriever = new MediaMetadataRetriever();
        retriever.setDataSource(videoFile.getAbsolutePath());
        ArrayList<Bitmap> rev = new ArrayList<Bitmap>();

        // Create a new MediaPlayer to read the video duration (in milliseconds).
        MediaPlayer mp = MediaPlayer.create(getBaseContext(), videoFileUri);
        int millis = mp.getDuration();

        for (int i = 1000000; i < millis * 1000; i += 1000000) {
            Bitmap bitmap = retriever.getFrameAtTime(i, MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
            rev.add(bitmap);
        }
    }
}
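One caveat worth adding (not part of the original answer): both the retriever and the MediaPlayer hold native resources and should be released once the frames have been extracted, roughly like this:
// Hypothetical cleanup after the extraction loop.
retriever.release();
mp.release();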
Solution 3:
An alternative to the getFrameAtTime() method of Android's MediaMetadataRetriever. AV_FrameCapture uses MediaCodec to decode a video frame and OpenGL to convert that frame into an RGB Bitmap. Since MediaMetadataRetriever does not guarantee a result when calling getFrameAtTime(), AV_FrameCapture can be used to extract a video frame accurately.
Follow the steps below to implement AV_FrameCapture.
1. Add the classes below to your project.
List of classes: AV_GLHelper.java, AV_GLUtil.java, AV_TextureRender.java, AV_FrameCapture.java, AV_VideoDecoder.java, AV_BitmapUtil.java
2. How to use
Snapshots of the video can be captured in two ways: set USE_MEDIA_META_DATA_RETRIEVER = true to capture with MediaMetadataRetriever; otherwise it captures using OpenGL.
Finally, call the captureFrame function with VIDEO_FILE_PATH, SNAPSHOT_DURATION_IN_MILLIS, SNAPSHOT_WIDTH and SNAPSHOT_HEIGHT. It will automatically capture a screenshot of that frame.
private AV_FrameCapture mFrameCapture = null;
boolean USE_MEDIA_META_DATA_RETRIEVER = false;

private void captureFrame(String VIDEO_FILE_PATH, long SNAPSHOT_DURATION_IN_MILLIS, int SNAPSHOT_WIDTH, int SNAPSHOT_HEIGHT) {
    // getFrameAtTimeByMMDR and getFrameAtTimeByFrameCapture take microseconds: 1 millisecond = 1000 microseconds.
    Bitmap bmp = USE_MEDIA_META_DATA_RETRIEVER
            ? getFrameAtTimeByMMDR(VIDEO_FILE_PATH, SNAPSHOT_DURATION_IN_MILLIS * 1000)
            : getFrameAtTimeByFrameCapture(VIDEO_FILE_PATH, SNAPSHOT_DURATION_IN_MILLIS * 1000, SNAPSHOT_WIDTH, SNAPSHOT_HEIGHT);
    String timeStamp = new SimpleDateFormat("ddMMyyyy_HHmmss", Locale.getDefault()).format(new Date());
    if (null != bmp) {
        AV_BitmapUtil.saveBitmap(bmp, String.format("/sdcard/read_%s.jpg", timeStamp));
    }
    if (mFrameCapture != null) {
        mFrameCapture.release();
    }
}

private Bitmap getFrameAtTimeByMMDR(String path, long time) {
    MediaMetadataRetriever mmr = new MediaMetadataRetriever();
    mmr.setDataSource(path);
    Bitmap bmp = mmr.getFrameAtTime(time, MediaMetadataRetriever.OPTION_CLOSEST);
    mmr.release();
    return bmp;
}

private Bitmap getFrameAtTimeByFrameCapture(String path, long time, int snapshot_width, int snapshot_height) {
    mFrameCapture = new AV_FrameCapture();
    mFrameCapture.setDataSource(path);
    mFrameCapture.setTargetSize(snapshot_width, snapshot_height);
    mFrameCapture.init();
    return mFrameCapture.getFrameAtTime(time);
}
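A quick usage sketch of the above (the file path, duration and target size are illustrative assumptions, not values from the original answer):
// Hypothetical call: grab the frame at 5 seconds, scaled to 640x360,
// using the OpenGL/MediaCodec path (USE_MEDIA_META_DATA_RETRIEVER = false).
captureFrame("/sdcard/Movies/myvideo.mp4", 5000L, 640, 360);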
List of classes.
1. AV_GLHelper.java
public class AV_GLHelper {
    private static final int EGL_RECORDABLE_ANDROID = 0x3142;
    private static final int EGL_OPENGL_ES2_BIT = 4;

    private SurfaceTexture mSurfaceTexture;
    private AV_TextureRender mTextureRender;
    private EGLDisplay mEglDisplay = EGL14.EGL_NO_DISPLAY;
    private EGLContext mEglContext = EGL14.EGL_NO_CONTEXT;
    private EGLSurface mEglSurface = EGL14.EGL_NO_SURFACE;

    public void init(SurfaceTexture st) {
        mSurfaceTexture = st;
        initGL();
        makeCurrent();
        mTextureRender = new AV_TextureRender();
    }

    private void initGL() {
        mEglDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
        if (mEglDisplay == EGL14.EGL_NO_DISPLAY) {
            throw new RuntimeException("eglGetDisplay failed : " +
                    GLUtils.getEGLErrorString(EGL14.eglGetError()));
        }
        int[] version = new int[2];
        if (!EGL14.eglInitialize(mEglDisplay, version, 0, version, 1)) {
            mEglDisplay = null;
            throw new RuntimeException("unable to initialize EGL14");
        }
        // Configure EGL for pbuffer and OpenGL ES 2.0. We want enough RGB bits
        // to be able to tell if the frame is reasonable.
        int[] attribList = {
                EGL14.EGL_RED_SIZE, 8,
                EGL14.EGL_GREEN_SIZE, 8,
                EGL14.EGL_BLUE_SIZE, 8,
                EGL14.EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
                EGL_RECORDABLE_ANDROID, 1,
                EGL14.EGL_NONE
        };
        EGLConfig[] configs = new EGLConfig[1];
        int[] numConfigs = new int[1];
        if (!EGL14.eglChooseConfig(mEglDisplay, attribList, 0, configs, 0, configs.length,
                numConfigs, 0)) {
            throw new RuntimeException("unable to find RGB888+recordable ES2 EGL config");
        }
        // Configure context for OpenGL ES 2.0.
        int[] attrib_list = {
                EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
                EGL14.EGL_NONE
        };
        mEglContext = EGL14.eglCreateContext(mEglDisplay, configs[0], EGL14.EGL_NO_CONTEXT,
                attrib_list, 0);
        AV_GLUtil.checkEglError("eglCreateContext");
        if (mEglContext == null) {
            throw new RuntimeException("null context");
        }
        // Create a window surface, and attach it to the Surface we received.
        int[] surfaceAttribs = {
                EGL14.EGL_NONE
        };
        mEglSurface = EGL14.eglCreateWindowSurface(mEglDisplay, configs[0], new Surface(mSurfaceTexture),
                surfaceAttribs, 0);
        AV_GLUtil.checkEglError("eglCreateWindowSurface");
        if (mEglSurface == null) {
            throw new RuntimeException("surface was null");
        }
    }

    public void release() {
        if (null != mSurfaceTexture)
            mSurfaceTexture.release();
    }

    public void makeCurrent() {
        if (!EGL14.eglMakeCurrent(mEglDisplay, mEglSurface, mEglSurface, mEglContext)) {
            throw new RuntimeException("eglMakeCurrent failed");
        }
    }

    public int createOESTexture() {
        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        int textureID = textures[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureID);
        AV_GLUtil.checkEglError("glBindTexture textureID");
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
                GLES20.GL_NEAREST);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER,
                GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S,
                GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T,
                GLES20.GL_CLAMP_TO_EDGE);
        AV_GLUtil.checkEglError("glTexParameter");
        return textureID;
    }

    public void drawFrame(SurfaceTexture st, int textureID) {
        st.updateTexImage();
        if (null != mTextureRender)
            mTextureRender.drawFrame(st, textureID);
    }

    public Bitmap readPixels(int width, int height) {
        ByteBuffer pixelBuffer = ByteBuffer.allocateDirect(4 * width * height);
        pixelBuffer.position(0);
        GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixelBuffer);
        Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
        pixelBuffer.position(0);
        bmp.copyPixelsFromBuffer(pixelBuffer);
        return bmp;
    }
}
2. AV_GLUtil.java
public class AV_GLUtil {
    /**
     * Checks for EGL errors.
     */
    public static void checkEglError(String msg) {
        boolean failed = false;
        int error;
        while ((error = EGL14.eglGetError()) != EGL14.EGL_SUCCESS) {
            Log.e("TAG", msg + ": EGL error: 0x" + Integer.toHexString(error));
            failed = true;
        }
        if (failed) {
            throw new RuntimeException("EGL error encountered (see log)");
        }
    }
}
3. AV_TextureRender.java
class AV_TextureRender {
    private static final String TAG = "TextureRender";
    private static final int FLOAT_SIZE_BYTES = 4;
    private static final int TRIANGLE_VERTICES_DATA_STRIDE_BYTES = 5 * FLOAT_SIZE_BYTES;
    private static final int TRIANGLE_VERTICES_DATA_POS_OFFSET = 0;
    private static final int TRIANGLE_VERTICES_DATA_UV_OFFSET = 3;

    private final float[] mTriangleVerticesData = {
            // X, Y, Z, U, V
            -1.0f, -1.0f, 0, 0.f, 1.f,
             1.0f, -1.0f, 0, 1.f, 1.f,
            -1.0f,  1.0f, 0, 0.f, 0.f,
             1.0f,  1.0f, 0, 1.f, 0.f,
    };
    private FloatBuffer mTriangleVertices;

    private static final String VERTEX_SHADER = "uniform mat4 uMVPMatrix;\n" +
            "uniform mat4 uSTMatrix;\n" +
            "attribute vec4 aPosition;\n" +
            "attribute vec4 aTextureCoord;\n" +
            "varying vec2 vTextureCoord;\n" +
            "void main() {\n" +
            "  gl_Position = uMVPMatrix * aPosition;\n" +
            "  vTextureCoord = (uSTMatrix * aTextureCoord).xy;\n" +
            "}\n";

    private static final String FRAGMENT_SHADER = "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" + // highp here doesn't seem to matter
            "varying vec2 vTextureCoord;\n" +
            "uniform samplerExternalOES sTexture;\n" +
            "void main() {\n" +
            "  vec2 texcoord = vTextureCoord;\n" +
            "  vec3 normalColor = texture2D(sTexture, texcoord).rgb;\n" +
            "  normalColor = vec3(normalColor.r, normalColor.g, normalColor.b);\n" +
            "  gl_FragColor = vec4(normalColor.r, normalColor.g, normalColor.b, 1); \n" +
            "}\n";

    private float[] mMVPMatrix = new float[16];
    private float[] mSTMatrix = new float[16];
    private int mProgram;
    private int muMVPMatrixHandle;
    private int muSTMatrixHandle;
    private int maPositionHandle;
    private int maTextureHandle;

    public AV_TextureRender() {
        mTriangleVertices = ByteBuffer.allocateDirect(
                mTriangleVerticesData.length * FLOAT_SIZE_BYTES)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        mTriangleVertices.put(mTriangleVerticesData).position(0);
        Matrix.setIdentityM(mSTMatrix, 0);
        init();
    }

    public void drawFrame(SurfaceTexture st, int textureID) {
        AV_GLUtil.checkEglError("onDrawFrame start");
        st.getTransformMatrix(mSTMatrix);
        GLES20.glClearColor(0.0f, 1.0f, 0.0f, 1.0f);
        GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
        if (!GLES20.glIsProgram(mProgram)) {
            reCreateProgram();
        }
        GLES20.glUseProgram(mProgram);
        AV_GLUtil.checkEglError("glUseProgram");
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureID);
        mTriangleVertices.position(TRIANGLE_VERTICES_DATA_POS_OFFSET);
        GLES20.glVertexAttribPointer(maPositionHandle, 3, GLES20.GL_FLOAT, false,
                TRIANGLE_VERTICES_DATA_STRIDE_BYTES, mTriangleVertices);
        AV_GLUtil.checkEglError("glVertexAttribPointer maPosition");
        GLES20.glEnableVertexAttribArray(maPositionHandle);
        AV_GLUtil.checkEglError("glEnableVertexAttribArray maPositionHandle");
        mTriangleVertices.position(TRIANGLE_VERTICES_DATA_UV_OFFSET);
        GLES20.glVertexAttribPointer(maTextureHandle, 3, GLES20.GL_FLOAT, false,
                TRIANGLE_VERTICES_DATA_STRIDE_BYTES, mTriangleVertices);
        AV_GLUtil.checkEglError("glVertexAttribPointer maTextureHandle");
        GLES20.glEnableVertexAttribArray(maTextureHandle);
        AV_GLUtil.checkEglError("glEnableVertexAttribArray maTextureHandle");
        Matrix.setIdentityM(mMVPMatrix, 0);
        GLES20.glUniformMatrix4fv(muMVPMatrixHandle, 1, false, mMVPMatrix, 0);
        GLES20.glUniformMatrix4fv(muSTMatrixHandle, 1, false, mSTMatrix, 0);
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
        AV_GLUtil.checkEglError("glDrawArrays");
        GLES20.glFinish();
    }

    /**
     * Initializes GL state. Call this after the EGL surface has been created and made current.
     */
    public void init() {
        mProgram = createProgram(VERTEX_SHADER, FRAGMENT_SHADER);
        if (mProgram == 0) {
            throw new RuntimeException("failed creating program");
        }
        maPositionHandle = GLES20.glGetAttribLocation(mProgram, "aPosition");
        AV_GLUtil.checkEglError("glGetAttribLocation aPosition");
        if (maPositionHandle == -1) {
            throw new RuntimeException("Could not get attrib location for aPosition");
        }
        maTextureHandle = GLES20.glGetAttribLocation(mProgram, "aTextureCoord");
        AV_GLUtil.checkEglError("glGetAttribLocation aTextureCoord");
        if (maTextureHandle == -1) {
            throw new RuntimeException("Could not get attrib location for aTextureCoord");
        }
        muMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
        AV_GLUtil.checkEglError("glGetUniformLocation uMVPMatrix");
        if (muMVPMatrixHandle == -1) {
            throw new RuntimeException("Could not get attrib location for uMVPMatrix");
        }
        muSTMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uSTMatrix");
        AV_GLUtil.checkEglError("glGetUniformLocation uSTMatrix");
        if (muSTMatrixHandle == -1) {
            throw new RuntimeException("Could not get attrib location for uSTMatrix");
        }
    }

    private void reCreateProgram() {
        // Rebuilds the program and re-queries the same attribute/uniform handles as init().
        init();
    }

    private int loadShader(int shaderType, String source) {
        int shader = GLES20.glCreateShader(shaderType);
        AV_GLUtil.checkEglError("glCreateShader type=" + shaderType);
        GLES20.glShaderSource(shader, source);
        GLES20.glCompileShader(shader);
        int[] compiled = new int[1];
        GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);
        if (compiled[0] == 0) {
            Log.e(TAG, "Could not compile shader " + shaderType + ":");
            Log.e(TAG, " " + GLES20.glGetShaderInfoLog(shader));
            GLES20.glDeleteShader(shader);
            shader = 0;
        }
        return shader;
    }

    private int createProgram(String vertexSource, String fragmentSource) {
        int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexSource);
        if (vertexShader == 0) {
            return 0;
        }
        int pixelShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentSource);
        if (pixelShader == 0) {
            return 0;
        }
        int program = GLES20.glCreateProgram();
        AV_GLUtil.checkEglError("glCreateProgram");
        if (program == 0) {
            Log.e(TAG, "Could not create program");
        }
        GLES20.glAttachShader(program, vertexShader);
        AV_GLUtil.checkEglError("glAttachShader");
        GLES20.glAttachShader(program, pixelShader);
        AV_GLUtil.checkEglError("glAttachShader");
        GLES20.glLinkProgram(program);
        int[] linkStatus = new int[1];
        GLES20.glGetProgramiv(program, GLES20.GL_LINK_STATUS, linkStatus, 0);
        if (linkStatus[0] != GLES20.GL_TRUE) {
            Log.e(TAG, "Could not link program: ");
            Log.e(TAG, GLES20.glGetProgramInfoLog(program));
            GLES20.glDeleteProgram(program);
            program = 0;
        }
        return program;
    }
}
4. AV_FrameCapture.java
public class AV_FrameCapture {
    final static String TAG = "AV_FrameCapture";

    private HandlerThread mGLThread = null;
    private Handler mGLHandler = null;
    private AV_GLHelper mGLHelper = null;

    private int mDefaultTextureID = 10001;
    private int mWidth = 1920;
    private int mHeight = 1080;
    private String mPath = null;

    public AV_FrameCapture() {
        mGLHelper = new AV_GLHelper();
        mGLThread = new HandlerThread("AV_FrameCapture");
        mGLThread.start();
        mGLHandler = new Handler(mGLThread.getLooper());
    }

    public void setDataSource(String path) {
        mPath = path;
    }

    public void setTargetSize(int width, int height) {
        mWidth = width;
        mHeight = height;
    }

    public void init() {
        mGLHandler.post(new Runnable() {
            @Override
            public void run() {
                SurfaceTexture st = new SurfaceTexture(mDefaultTextureID);
                st.setDefaultBufferSize(mWidth, mHeight);
                mGLHelper.init(st);
            }
        });
    }

    public void release() {
        mGLHandler.post(new Runnable() {
            @Override
            public void run() {
                mGLHelper.release();
                mGLThread.quit();
            }
        });
    }

    private Object mWaitBitmap = new Object();
    private Bitmap mBitmap = null;

    public Bitmap getFrameAtTime(final long frameTime) {
        if (null == mPath || mPath.isEmpty()) {
            throw new RuntimeException("Illegal State");
        }
        mGLHandler.post(new Runnable() {
            @Override
            public void run() {
                getFrameAtTimeImpl(frameTime);
            }
        });
        synchronized (mWaitBitmap) {
            try {
                mWaitBitmap.wait();
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
        return mBitmap;
    }

    @SuppressLint("SdCardPath")
    public void getFrameAtTimeImpl(long frameTime) {
        final int textureID = mGLHelper.createOESTexture();
        final SurfaceTexture st = new SurfaceTexture(textureID);
        final Surface surface = new Surface(st);
        final AV_VideoDecoder vd = new AV_VideoDecoder(mPath, surface);
        st.setOnFrameAvailableListener(new OnFrameAvailableListener() {
            @Override
            public void onFrameAvailable(SurfaceTexture surfaceTexture) {
                Log.i(TAG, "onFrameAvailable");
                mGLHelper.drawFrame(st, textureID);
                mBitmap = mGLHelper.readPixels(mWidth, mHeight);
                synchronized (mWaitBitmap) {
                    mWaitBitmap.notify();
                }
                vd.release();
                st.release();
                surface.release();
            }
        });
        if (!vd.prepare(frameTime)) {
            mBitmap = null;
            synchronized (mWaitBitmap) {
                mWaitBitmap.notify();
            }
        }
    }
}
5. AV_VideoDecoder.java
public class AV_VideoDecoder {
    final static String TAG = "VideoDecoder";
    final static String VIDEO_MIME_PREFIX = "video/";

    private MediaExtractor mMediaExtractor = null;
    private MediaCodec mMediaCodec = null;
    private Surface mSurface = null;
    private String mPath = null;
    private int mVideoTrackIndex = -1;

    public AV_VideoDecoder(String path, Surface surface) {
        mPath = path;
        mSurface = surface;
        initCodec();
    }

    public boolean prepare(long time) {
        return decodeFrameAt(time);
    }

    public void startDecode() {
    }

    public void release() {
        if (null != mMediaCodec) {
            mMediaCodec.stop();
            mMediaCodec.release();
        }
        if (null != mMediaExtractor) {
            mMediaExtractor.release();
        }
    }

    private boolean initCodec() {
        Log.i(TAG, "initCodec");
        mMediaExtractor = new MediaExtractor();
        try {
            mMediaExtractor.setDataSource(mPath);
        } catch (IOException e) {
            e.printStackTrace();
            return false;
        }
        int trackCount = mMediaExtractor.getTrackCount();
        for (int i = 0; i < trackCount; ++i) {
            MediaFormat mf = mMediaExtractor.getTrackFormat(i);
            String mime = mf.getString(MediaFormat.KEY_MIME);
            if (mime.startsWith(VIDEO_MIME_PREFIX)) {
                mVideoTrackIndex = i;
                break;
            }
        }
        if (mVideoTrackIndex < 0)
            return false;

        mMediaExtractor.selectTrack(mVideoTrackIndex);
        MediaFormat mf = mMediaExtractor.getTrackFormat(mVideoTrackIndex);
        String mime = mf.getString(MediaFormat.KEY_MIME);
        try {
            mMediaCodec = MediaCodec.createDecoderByType(mime);
        } catch (IOException e) {
            e.printStackTrace();
        }
        mMediaCodec.configure(mf, mSurface, null, 0);
        mMediaCodec.setVideoScalingMode(MediaCodec.VIDEO_SCALING_MODE_SCALE_TO_FIT_WITH_CROPPING);
        mMediaCodec.start();
        Log.i(TAG, "initCodec end");
        return true;
    }

    private boolean mIsInputEOS = false;

    private boolean decodeFrameAt(long timeUs) {
        Log.i(TAG, "decodeFrameAt " + timeUs);
        mMediaExtractor.seekTo(timeUs, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);
        mIsInputEOS = false;
        CodecState inputState = new CodecState();
        CodecState outState = new CodecState();
        boolean reachTarget = false;
        for (; ; ) {
            if (!inputState.EOS)
                handleCodecInput(inputState);
            if (inputState.outIndex < 0) {
                handleCodecOutput(outState);
                reachTarget = processOutputState(outState, timeUs);
            } else {
                reachTarget = processOutputState(inputState, timeUs);
            }
            if (reachTarget || outState.EOS) {
                Log.i(TAG, "decodeFrameAt " + timeUs + " reach target or EOS");
                break;
            }
            inputState.outIndex = -1;
            outState.outIndex = -1;
        }
        return reachTarget;
    }

    private boolean processOutputState(CodecState state, long timeUs) {
        if (state.outIndex < 0)
            return false;
        if (state.outIndex >= 0 && state.info.presentationTimeUs < timeUs) {
            Log.i(TAG, "processOutputState presentationTimeUs " + state.info.presentationTimeUs);
            mMediaCodec.releaseOutputBuffer(state.outIndex, false);
            return false;
        }
        if (state.outIndex >= 0) {
            Log.i(TAG, "processOutputState presentationTimeUs " + state.info.presentationTimeUs);
            mMediaCodec.releaseOutputBuffer(state.outIndex, true);
            return true;
        }
        return false;
    }

    private class CodecState {
        int outIndex = MediaCodec.INFO_TRY_AGAIN_LATER;
        BufferInfo info = new BufferInfo();
        boolean EOS = false;
    }

    private void handleCodecInput(CodecState state) {
        ByteBuffer[] inputBuffer = mMediaCodec.getInputBuffers();
        for (; !mIsInputEOS; ) {
            int inputBufferIndex = mMediaCodec.dequeueInputBuffer(10000);
            if (inputBufferIndex < 0)
                continue;
            ByteBuffer in = inputBuffer[inputBufferIndex];
            int readSize = mMediaExtractor.readSampleData(in, 0);
            long presentationTimeUs = mMediaExtractor.getSampleTime();
            int flags = mMediaExtractor.getSampleFlags();
            boolean EOS = !mMediaExtractor.advance();
            EOS |= (readSize <= 0);
            EOS |= ((flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) > 0);
            Log.i(TAG, "input presentationTimeUs " + presentationTimeUs + " isEOS " + EOS);
            if (EOS && readSize < 0)
                readSize = 0;
            if (readSize > 0 || EOS)
                mMediaCodec.queueInputBuffer(inputBufferIndex, 0, readSize, presentationTimeUs,
                        flags | (EOS ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0));
            if (EOS) {
                state.EOS = true;
                mIsInputEOS = true;
                break;
            }
            state.outIndex = mMediaCodec.dequeueOutputBuffer(state.info, 10000);
            if (state.outIndex >= 0)
                break;
        }
    }

    private void handleCodecOutput(CodecState state) {
        state.outIndex = mMediaCodec.dequeueOutputBuffer(state.info, 10000);
        if (state.outIndex < 0) {
            return;
        }
        if ((state.info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            state.EOS = true;
            Log.i(TAG, "reach output EOS " + state.info.presentationTimeUs);
        }
    }
}
6. AV_BitmapUtil.java
public class AV_BitmapUtil {
    public static void saveBitmap(Bitmap bmp, String path) {
        try {
            FileOutputStream fos = new FileOutputStream(path);
            bmp.compress(CompressFormat.JPEG, 100, fos);
            fos.close();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static Bitmap flip(Bitmap src) {
        Matrix matrix = new Matrix();
        matrix.preScale(1.0f, -1.0f);
        return Bitmap.createBitmap(src, 0, 0, src.getWidth(), src.getHeight(), matrix, true);
    }
}
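One note from my side (not stated in the original answer): glReadPixels in AV_GLHelper.readPixels returns rows bottom-up, so frames captured via the OpenGL path may come out vertically mirrored. If that happens, the flip() helper above can be applied before saving, roughly like this:
// Hypothetical post-processing step: un-mirror a frame captured via OpenGL.
Bitmap raw = mFrameCapture.getFrameAtTime(timeUs);
if (raw != null) {
    Bitmap upright = AV_BitmapUtil.flip(raw);   // mirrors vertically via preScale(1, -1)
    AV_BitmapUtil.saveBitmap(upright, "/sdcard/read_upright.jpg");
}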
Solution 4:
long time;
String formattedFileCount;
FileOutputStream fos;
BufferedOutputStream bos;
NumberFormat fileCountFormatter = new DecimalFormat("00000");
int fileCount = 0;
File jpegFile;
ArrayList<Bitmap> bArray = null;
Bitmap lastbitmap = null;
bArray = new ArrayList<Bitmap>();
I am working with the time in microseconds and getting the duration from the MediaPlayer like this:
try {
    time = mp.getDuration() * 1000;
    Log.e("Time", String.valueOf(time));
    bArray.clear();
    MediaMetadataRetriever mRetriever = new MediaMetadataRetriever();
    mRetriever.setDataSource(path);
    int j = 0;
My frame rate is 12 frames/sec, so one frame lasts 1/12 ≈ 0.083333 seconds; converted to microseconds that is 83333, which I use as the loop step:
    for (int i = 83333; i <= time; i = i + 83333) {
        bArray.add(mRetriever.getFrameAtTime(i, MediaMetadataRetriever.OPTION_CLOSEST_SYNC));
        formattedFileCount = fileCountFormatter.format(fileCount);
        lastbitmap = bArray.get(j);
        j++;
        // lastbitmap is the extracted frame
        jpegFile = new File(Environment.getExternalStorageDirectory().getPath() + "/frames/frame_" + formattedFileCount + ".jpg");
        fileCount++;
        try {
            fos = new FileOutputStream(jpegFile);
            bos = new BufferedOutputStream(fos);
            lastbitmap.compress(Bitmap.CompressFormat.JPEG, 15, bos);
            bos.flush();
            bos.close();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
} catch (Exception e) {
    e.printStackTrace();
}
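For other frame rates, the step can be derived instead of hard-coding 83333; a small sketch, where the fps value is an assumption used only for illustration:
// Hypothetical generalization: compute the per-frame step in microseconds from the frame rate.
double fps = 12.0;                                   // assumed source frame rate
long frameIntervalUs = (long) (1_000_000.0 / fps);   // ≈ 83333 µs for 12 fps
for (long t = frameIntervalUs; t <= time; t += frameIntervalUs) {
    Bitmap frame = mRetriever.getFrameAtTime(t, MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
    // ... save or collect the frame as in the loop above
}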
Solution 5:
Try this in your for loop:
Bitmap bitmap = retriever.getFrameAtTime(
        TimeUnit.MICROSECONDS.convert(i, TimeUnit.MILLISECONDS),
        MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
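Put into context, a minimal sketch of the surrounding loop, assuming durationMs holds the video duration in milliseconds (variable names are illustrative, not from the original answer):
// Hypothetical loop: one frame per second, positions kept in milliseconds
// and converted to microseconds via TimeUnit.
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
retriever.setDataSource(path);
List<Bitmap> frames = new ArrayList<>();
for (long ms = 0; ms < durationMs; ms += 1000) {
    Bitmap bitmap = retriever.getFrameAtTime(
            TimeUnit.MILLISECONDS.toMicros(ms),
            MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
    if (bitmap != null) {
        frames.add(bitmap);
    }
}
retriever.release();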