How to fit video in Live wallpaper, by center-crop and by fitting to width/height?

Tags: Android, Android MediaPlayer, Live Wallpaper

Android Problem Overview


Background

I'm making a live wallpaper that can show a video. In the beginning I thought this was going to be very hard, so some people suggested using OpenGL-based solutions or other, very complex ones (such as this one).

Anyway, I've found various places talking about it, and based on this GitHub library (which has some bugs), I finally got it to work.

The problem

While I've succeeded in showing a video, I can't find a way to control how it's scaled relative to the screen resolution.

Currently it always gets stretched to the screen size, meaning that this (video taken from here):

[screenshot: a frame of the original video]

ends up being shown like this:

[screenshot: the same frame stretched to the device screen]

The reason is the difference in aspect ratio: 560x320 (video resolution) vs 1080x1920 (device resolution). For example, 560/320 = 1.75 while 1080/1920 ≈ 0.56, so stretching the video to fill the screen distorts it considerably.

Note: I'm well aware of solutions for scaling videos that are available in various GitHub repositories (such as here), but I'm asking about a live wallpaper. As such, it doesn't have a View, so it's more limited in how it can do things. To be more specific, a solution can't use any kind of layout, TextureView, SurfaceView, or any other kind of View.

What I've tried

I tried to play with various fields and functions of the SurfaceHolder, but with no luck so far.

Here's the current code I've made (full project available here):

class MovieLiveWallpaperService : WallpaperService() {
    override fun onCreateEngine(): WallpaperService.Engine {
        return VideoLiveWallpaperEngine()
    }

    private enum class PlayerState {
        NONE, PREPARING, READY, PLAYING
    }

    inner class VideoLiveWallpaperEngine : WallpaperService.Engine() {
        private var mp: MediaPlayer? = null
        private var playerState: PlayerState = PlayerState.NONE

        override fun onSurfaceCreated(holder: SurfaceHolder) {
            super.onSurfaceCreated(holder)
            Log.d("AppLog", "onSurfaceCreated")
            mp = MediaPlayer()
            val mySurfaceHolder = MySurfaceHolder(holder)
            mp!!.setDisplay(mySurfaceHolder)
            mp!!.isLooping = true
            mp!!.setVolume(0.0f, 0.0f)
            mp!!.setOnPreparedListener { mp ->
                playerState = PlayerState.READY
                setPlay(true)
            }
            try {
                //mp!!.setDataSource(this@MovieLiveWallpaperService, Uri.parse("http://techslides.com/demos/sample-videos/small.mp4"))
                mp!!.setDataSource(this@MovieLiveWallpaperService, Uri.parse("android.resource://" + packageName + "/" + R.raw.small))
            } catch (e: Exception) {
                Log.e("AppLog", "failed to set the data source", e)
            }
        }

        override fun onDestroy() {
            super.onDestroy()
            Log.d("AppLog", "onDestroy")
            if (mp == null)
                return
            mp!!.stop()
            mp!!.release()
            playerState = PlayerState.NONE
        }

        private fun setPlay(play: Boolean) {
            if (mp == null)
                return
            if (play == mp!!.isPlaying)
                return
            when {
                !play -> {
                    mp!!.pause()
                    playerState = PlayerState.READY
                }
                mp!!.isPlaying -> return
                playerState == PlayerState.READY -> {
                    Log.d("AppLog", "ready, so starting to play")
                    mp!!.start()
                    playerState = PlayerState.PLAYING
                }
                playerState == PlayerState.NONE -> {
                    Log.d("AppLog", "not ready, so preparing")
                    mp!!.prepareAsync()
                    playerState = PlayerState.PREPARING
                }
            }
        }

        override fun onVisibilityChanged(visible: Boolean) {
            super.onVisibilityChanged(visible)
            Log.d("AppLog", "onVisibilityChanged:" + visible + " " + playerState)
            if (mp == null)
                return
            setPlay(visible)
        }

    }

    class MySurfaceHolder(private val surfaceHolder: SurfaceHolder) : SurfaceHolder {
        override fun addCallback(callback: SurfaceHolder.Callback) = surfaceHolder.addCallback(callback)

        override fun getSurface() = surfaceHolder.surface!!

        override fun getSurfaceFrame() = surfaceHolder.surfaceFrame

        override fun isCreating(): Boolean = surfaceHolder.isCreating

        override fun lockCanvas(): Canvas = surfaceHolder.lockCanvas()

        override fun lockCanvas(dirty: Rect): Canvas = surfaceHolder.lockCanvas(dirty)

        override fun removeCallback(callback: SurfaceHolder.Callback) = surfaceHolder.removeCallback(callback)

        override fun setFixedSize(width: Int, height: Int) = surfaceHolder.setFixedSize(width, height)

        override fun setFormat(format: Int) = surfaceHolder.setFormat(format)

        override fun setKeepScreenOn(screenOn: Boolean) {}

        override fun setSizeFromLayout() = surfaceHolder.setSizeFromLayout()

        override fun setType(type: Int) = surfaceHolder.setType(type)

        override fun unlockCanvasAndPost(canvas: Canvas) = surfaceHolder.unlockCanvasAndPost(canvas)
    }
}

The questions

I'd like to know how to adjust the scaling of the content, similar to what we have for ImageView, all while keeping the aspect ratio (a small Kotlin sketch of the corresponding math follows this list):

  1. center-crop - fills 100% of the container (the screen in this case), cropping on the sides (top & bottom or left & right) when needed. Nothing gets stretched, so the content looks correct, but not all of it may be shown.
  2. fit-center - scales to fit within the width/height while keeping the aspect ratio.
  3. center-inside - shows at the original size, centered, and scales down to fit the width/height only if the content is too large.
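
None of this is specific to views; each mode is just a scale factor plus a centering offset. Here is a minimal Kotlin sketch (not taken from the project above, and the function name is mine) of the math each mode implies when mapping a video of size videoW x videoH onto a surface of size surfaceW x surfaceH:

import android.graphics.Matrix

enum class WallpaperScaleType { CENTER_CROP, FIT_CENTER, CENTER_INSIDE }

// Returns a matrix that maps video pixel coordinates onto the surface for the given mode.
fun buildVideoMatrix(videoW: Float, videoH: Float,
                     surfaceW: Float, surfaceH: Float,
                     type: WallpaperScaleType): Matrix {
    val scale = when (type) {
        // Fill the whole surface, cropping whichever dimension overflows.
        WallpaperScaleType.CENTER_CROP -> maxOf(surfaceW / videoW, surfaceH / videoH)
        // Fit entirely inside the surface (letter/pillar-boxing the rest).
        WallpaperScaleType.FIT_CENTER -> minOf(surfaceW / videoW, surfaceH / videoH)
        // Keep the original size unless the video is larger than the surface.
        WallpaperScaleType.CENTER_INSIDE -> minOf(1f, minOf(surfaceW / videoW, surfaceH / videoH))
    }
    val dx = (surfaceW - videoW * scale) / 2f   // center horizontally
    val dy = (surfaceH - videoH * scale) / 2f   // center vertically
    return Matrix().apply {
        setScale(scale, scale)
        postTranslate(dx, dy)
    }
}

The open problem in the question is that with a plain MediaPlayer writing straight to the wallpaper's Surface there is no obvious place to apply such a matrix, which is what the answers below address.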

Android Solutions


Solution 1 - Android

You can achieve this with a TextureView (a plain SurfaceView won't work for this). I have found some code which will help you achieve it.
In this demo you can crop the video in three ways: center, top and bottom.

TextureVideoView.java

public class TextureVideoView extends TextureView implements TextureView.SurfaceTextureListener {

    // Indicate if logging is on
    public static final boolean LOG_ON = true;

    // Log tag
    private static final String TAG = TextureVideoView.class.getName();

    private MediaPlayer mMediaPlayer;

    private float mVideoHeight;
    private float mVideoWidth;

    private boolean mIsDataSourceSet;
    private boolean mIsViewAvailable;
    private boolean mIsVideoPrepared;
    private boolean mIsPlayCalled;

    private ScaleType mScaleType;
    private State mState;

    public enum ScaleType {
        CENTER_CROP, TOP, BOTTOM
    }

    public enum State {
        UNINITIALIZED, PLAY, STOP, PAUSE, END
    }

    public TextureVideoView(Context context) {
        super(context);
        initView();
    }

    public TextureVideoView(Context context, AttributeSet attrs) {
        super(context, attrs);
        initView();
    }

    public TextureVideoView(Context context, AttributeSet attrs, int defStyle) {
        super(context, attrs, defStyle);
        initView();
    }

    private void initView() {
        initPlayer();
        setScaleType(ScaleType.CENTER_CROP);
        setSurfaceTextureListener(this);
    }

    public void setScaleType(ScaleType scaleType) {
        mScaleType = scaleType;
    }

    private void updateTextureViewSize() {
        float viewWidth = getWidth();
        float viewHeight = getHeight();

        // These scale factors are applied on top of TextureView's default behaviour of
        // stretching the video buffer to fill the whole view.
        float scaleX = 1.0f;
        float scaleY = 1.0f;

        if (mVideoWidth > viewWidth && mVideoHeight > viewHeight) {
            scaleX = mVideoWidth / viewWidth;
            scaleY = mVideoHeight / viewHeight;
        } else if (mVideoWidth < viewWidth && mVideoHeight < viewHeight) {
            scaleY = viewWidth / mVideoWidth;
            scaleX = viewHeight / mVideoHeight;
        } else if (viewWidth > mVideoWidth) {
            scaleY = (viewWidth / mVideoWidth) / (viewHeight / mVideoHeight);
        } else if (viewHeight > mVideoHeight) {
            scaleX = (viewHeight / mVideoHeight) / (viewWidth / mVideoWidth);
        }

        // Calculate pivot points, in our case crop from center
        int pivotPointX;
        int pivotPointY;

        switch (mScaleType) {
            case TOP:
                pivotPointX = 0;
                pivotPointY = 0;
                break;
            case BOTTOM:
                pivotPointX = (int) (viewWidth);
                pivotPointY = (int) (viewHeight);
                break;
            case CENTER_CROP:
                pivotPointX = (int) (viewWidth / 2);
                pivotPointY = (int) (viewHeight / 2);
                break;
            default:
                pivotPointX = (int) (viewWidth / 2);
                pivotPointY = (int) (viewHeight / 2);
                break;
        }

        Matrix matrix = new Matrix();
        matrix.setScale(scaleX, scaleY, pivotPointX, pivotPointY);

        setTransform(matrix);
    }

    private void initPlayer() {
        if (mMediaPlayer == null) {
            mMediaPlayer = new MediaPlayer();
        } else {
            mMediaPlayer.reset();
        }
        mIsVideoPrepared = false;
        mIsPlayCalled = false;
        mState = State.UNINITIALIZED;
    }

    /**
     * @see MediaPlayer#setDataSource(String)
     */
    public void setDataSource(String path) {
        initPlayer();

        try {
            mMediaPlayer.setDataSource(path);
            mIsDataSourceSet = true;
            prepare();
        } catch (IOException e) {
            Log.d(TAG, e.getMessage());
        }
    }

    /**
     * @see MediaPlayer#setDataSource(Context, Uri)
     */
    public void setDataSource(Context context, Uri uri) {
        initPlayer();

        try {
            mMediaPlayer.setDataSource(context, uri);
            mIsDataSourceSet = true;
            prepare();
        } catch (IOException e) {
            Log.d(TAG, e.getMessage());
        }
    }

    /**
     * @see MediaPlayer#setDataSource(java.io.FileDescriptor)
     */
    public void setDataSource(AssetFileDescriptor afd) {
        initPlayer();

        try {
            long startOffset = afd.getStartOffset();
            long length = afd.getLength();
            mMediaPlayer.setDataSource(afd.getFileDescriptor(), startOffset, length);
            mIsDataSourceSet = true;
            prepare();
        } catch (IOException e) {
            Log.d(TAG, e.getMessage());
        }
    }

    private void prepare() {
        try {
            mMediaPlayer.setOnVideoSizeChangedListener(
                    new MediaPlayer.OnVideoSizeChangedListener() {
                        @Override
                        public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
                            mVideoWidth = width;
                            mVideoHeight = height;
                            updateTextureViewSize();
                        }
                    }
            );
            mMediaPlayer.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
                @Override
                public void onCompletion(MediaPlayer mp) {
                    mState = State.END;
                    log("Video has ended.");

                    if (mListener != null) {
                        mListener.onVideoEnd();
                    }
                }
            });

            // don't forget to call MediaPlayer.prepareAsync() method when you use constructor for
            // creating MediaPlayer
            mMediaPlayer.prepareAsync();

            // Play video when the media source is ready for playback.
            mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer mediaPlayer) {
                    mIsVideoPrepared = true;
                    if (mIsPlayCalled && mIsViewAvailable) {
                        log("Player is prepared and play() was called.");
                        play();
                    }

                    if (mListener != null) {
                        mListener.onVideoPrepared();
                    }
                }
            });

        } catch (IllegalArgumentException e) {
            Log.d(TAG, e.getMessage());
        } catch (SecurityException e) {
            Log.d(TAG, e.getMessage());
        } catch (IllegalStateException e) {
            Log.d(TAG, e.toString());
        }
    }

    /**
     * Play or resume video. Video will be played as soon as view is available and media player is
     * prepared.
     *
     * If video is stopped or ended and play() method was called, video will start over.
     */
    public void play() {
        if (!mIsDataSourceSet) {
            log("play() was called but data source was not set.");
            return;
        }

        mIsPlayCalled = true;

        if (!mIsVideoPrepared) {
            log("play() was called but video is not prepared yet, waiting.");
            return;
        }

        if (!mIsViewAvailable) {
            log("play() was called but view is not available yet, waiting.");
            return;
        }

        if (mState == State.PLAY) {
            log("play() was called but video is already playing.");
            return;
        }

        if (mState == State.PAUSE) {
            log("play() was called but video is paused, resuming.");
            mState = State.PLAY;
            mMediaPlayer.start();
            return;
        }

        if (mState == State.END || mState == State.STOP) {
            log("play() was called but video already ended, starting over.");
            mState = State.PLAY;
            mMediaPlayer.seekTo(0);
            mMediaPlayer.start();
            return;
        }

        mState = State.PLAY;
        mMediaPlayer.start();
    }

    /**
     * Pause video. If video is already paused, stopped or ended nothing will happen.
     */
    public void pause() {
        if (mState == State.PAUSE) {
            log("pause() was called but video already paused.");
            return;
        }

        if (mState == State.STOP) {
            log("pause() was called but video already stopped.");
            return;
        }

        if (mState == State.END) {
            log("pause() was called but video already ended.");
            return;
        }

        mState = State.PAUSE;
        if (mMediaPlayer.isPlaying()) {
            mMediaPlayer.pause();
        }
    }

    /**
     * Stop video (pause and seek to beginning). If video is already stopped or ended nothing will
     * happen.
     */
    public void stop() {
        if (mState == State.STOP) {
            log("stop() was called but video already stopped.");
            return;
        }

        if (mState == State.END) {
            log("stop() was called but video already ended.");
            return;
        }

        mState = State.STOP;
        if (mMediaPlayer.isPlaying()) {
            mMediaPlayer.pause();
            mMediaPlayer.seekTo(0);
        }
    }

    /**
     * @see MediaPlayer#setLooping(boolean)
     */
    public void setLooping(boolean looping) {
        mMediaPlayer.setLooping(looping);
    }

    /**
     * @see MediaPlayer#seekTo(int)
     */
    public void seekTo(int milliseconds) {
        mMediaPlayer.seekTo(milliseconds);
    }

    /**
     * @see MediaPlayer#getDuration()
     */
    public int getDuration() {
        return mMediaPlayer.getDuration();
    }

    static void log(String message) {
        if (LOG_ON) {
            Log.d(TAG, message);
        }
    }

    private MediaPlayerListener mListener;

    /**
     * Listener trigger 'onVideoPrepared' and `onVideoEnd` events
     */
    public void setListener(MediaPlayerListener listener) {
        mListener = listener;
    }

    public interface MediaPlayerListener {

        public void onVideoPrepared();

        public void onVideoEnd();
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
        Surface surface = new Surface(surfaceTexture);
        mMediaPlayer.setSurface(surface);
        mIsViewAvailable = true;
        if (mIsDataSourceSet && mIsPlayCalled && mIsVideoPrepared) {
            log("View is available and play() was called.");
            play();
        }
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {

    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        return false;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {

    }
}

After that, use this class as in the code below in MainActivity.java:

public class MainActivity extends AppCompatActivity implements View.OnClickListener,
        ActionBar.OnNavigationListener {

    // Video file url
    private static final String FILE_URL = "http://techslides.com/demos/sample-videos/small.mp4";
    private TextureVideoView mTextureVideoView;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        initView();
        initActionBar();

        if (!isWIFIOn(getBaseContext())) {
            Toast.makeText(getBaseContext(), "You need internet connection to stream video",
                    Toast.LENGTH_LONG).show();
        }
    }

    private void initActionBar() {
        ActionBar actionBar = getSupportActionBar();
        actionBar.setNavigationMode(ActionBar.NAVIGATION_MODE_LIST);
        actionBar.setDisplayShowTitleEnabled(false);

        SpinnerAdapter mSpinnerAdapter = ArrayAdapter.createFromResource(this, R.array.action_list,
                android.R.layout.simple_spinner_dropdown_item);
        actionBar.setListNavigationCallbacks(mSpinnerAdapter, this);
    }

    private void initView() {
        mTextureVideoView = (TextureVideoView) findViewById(R.id.cropTextureView);

        findViewById(R.id.btnPlay).setOnClickListener(this);
        findViewById(R.id.btnPause).setOnClickListener(this);
        findViewById(R.id.btnStop).setOnClickListener(this);
    }

    @Override
    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.btnPlay:
                mTextureVideoView.play();
                break;
            case R.id.btnPause:
                mTextureVideoView.pause();
                break;
            case R.id.btnStop:
                mTextureVideoView.stop();
                break;
        }
    }

    final int indexCropCenter = 0;
    final int indexCropTop = 1;
    final int indexCropBottom = 2;

    @Override
    public boolean onNavigationItemSelected(int itemPosition, long itemId) {
        switch (itemPosition) {
            case indexCropCenter:
                mTextureVideoView.stop();
                mTextureVideoView.setScaleType(TextureVideoView.ScaleType.CENTER_CROP);
                mTextureVideoView.setDataSource(FILE_URL);
                mTextureVideoView.play();
                break;
            case indexCropTop:
                mTextureVideoView.stop();
                mTextureVideoView.setScaleType(TextureVideoView.ScaleType.TOP);
                mTextureVideoView.setDataSource(FILE_URL);
                mTextureVideoView.play();
                break;
            case indexCropBottom:
                mTextureVideoView.stop();
                mTextureVideoView.setScaleType(TextureVideoView.ScaleType.BOTTOM);
                mTextureVideoView.setDataSource(FILE_URL);
                mTextureVideoView.play();
                break;
        }
        return true;
    }

    public static boolean isWIFIOn(Context context) {
        ConnectivityManager connMgr =
                (ConnectivityManager) context.getSystemService(Context.CONNECTIVITY_SERVICE);
        NetworkInfo networkInfo = connMgr.getNetworkInfo(ConnectivityManager.TYPE_WIFI);

        return (networkInfo != null && networkInfo.isConnected());
    }
}

The layout file activity_main.xml for it is below:

<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent">

    <com.example.videocropdemo.crop.TextureVideoView
        android:id="@+id/cropTextureView"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:layout_centerInParent="true" />

    <LinearLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_alignParentBottom="true"
        android:layout_margin="16dp"
        android:orientation="horizontal">

        <Button
            android:id="@+id/btnPlay"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="Play" />

        <Button
            android:id="@+id/btnPause"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="Pause" />

        <Button
            android:id="@+id/btnStop"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="Stop" />
    </LinearLayout>
</RelativeLayout>

The output of the code for center-crop looks like this:

[screenshot: center-crop output]

Solution 2 - Android

So I wasn't yet able to get all the scale types you've asked for, but I've been able to get fit-xy and center-crop working fairly easily using ExoPlayer. The full code can be seen at https://github.com/yperess/StackOverflow/tree/50091878 and I'll update it as I get more. Eventually I'll also fill in the MainActivity to allow you to choose the scaling type as a setting (I'll do this with a simple PreferenceActivity) and read the shared preferences value on the service side.

The overall idea is that deep down MediaCodec already implements both fit-xy and center-crop, which are really the only two modes you would need if you had access to a view hierarchy. That's because fit-center, fit-top, and fit-bottom would all really just be fit-xy applied to a surface that has a gravity and is scaled to match the video size times the minimum scaling factor. To get those working, what I believe needs to happen is that we'd need to create an OpenGL context and provide a SurfaceTexture. This SurfaceTexture can be wrapped with a stub Surface which can be passed to ExoPlayer. Once the video is loaded we can set the size of these, since we created them. We also have a callback on the SurfaceTexture to let us know when a frame is ready. At that point we should be able to modify the frame (hopefully just using a simple matrix scale and transform).
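
As a rough sketch of that SurfaceTexture plumbing (names such as textureId and requestRender are placeholders, not code from the linked repository), assuming ExoPlayer 2.x:

import android.graphics.SurfaceTexture
import android.view.Surface
import com.google.android.exoplayer2.SimpleExoPlayer

// textureId must be a GL_TEXTURE_EXTERNAL_OES texture created on the GL thread, and
// requestRender is whatever triggers a redraw in your render loop.
fun attachPlayerToTexture(player: SimpleExoPlayer, textureId: Int,
                          videoWidth: Int, videoHeight: Int,
                          requestRender: () -> Unit): SurfaceTexture {
    val videoTexture = SurfaceTexture(textureId)
    videoTexture.setDefaultBufferSize(videoWidth, videoHeight)
    videoTexture.setOnFrameAvailableListener {
        // A new video frame is ready; the GL thread should call updateTexImage() and redraw.
        requestRender()
    }
    player.setVideoSurface(Surface(videoTexture))   // ExoPlayer renders into our texture
    return videoTexture
}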

The key component here is creating the ExoPlayer instance:

    private fun initExoMediaPlayer(): SimpleExoPlayer {
        val videoTrackSelectionFactory = AdaptiveTrackSelection.Factory(bandwidthMeter)
        val trackSelector = DefaultTrackSelector(videoTrackSelectionFactory)
        val player = ExoPlayerFactory.newSimpleInstance(this@MovieLiveWallpaperService,
                trackSelector)
        player.playWhenReady = true
        player.repeatMode = Player.REPEAT_MODE_ONE
        player.volume = 0f
        if (mode == Mode.CENTER_CROP) {
            player.videoScalingMode = C.VIDEO_SCALING_MODE_SCALE_TO_FIT_WITH_CROPPING
        } else {
            player.videoScalingMode = C.VIDEO_SCALING_MODE_SCALE_TO_FIT
        }
        if (mode == Mode.FIT_CENTER) {
            player.addVideoListener(this)
        }
        return player
    }

Then loading the video:

    override fun onSurfaceCreated(holder: SurfaceHolder) {
        super.onSurfaceCreated(holder)
        if (mode == Mode.FIT_CENTER) {
            // We need to somehow wrap the surface or set some scale factor on exo player here.
            // Most likely this will require creating a SurfaceTexture and attaching it to an
            // OpenGL context. Then for each frame, writing it to the original surface but with
            // an offset
            exoMediaPlayer.setVideoSurface(holder.surface)
        } else {
            exoMediaPlayer.setVideoSurfaceHolder(holder)
        }

        val videoUri = RawResourceDataSource.buildRawResourceUri(R.raw.small)
        val dataSourceFactory = DataSource.Factory { RawResourceDataSource(context) }
        val mediaSourceFactory = ExtractorMediaSource.Factory(dataSourceFactory)
        exoMediaPlayer.prepare(mediaSourceFactory.createMediaSource(videoUri))
    }

UPDATE:

Got it working. I'll need tomorrow to clean it up before I post the code, but here's a sneak preview of the fit-center result:

[screenshot: fit-center]

What I ended up doing is basically taking GLSurfaceView and ripping it apart. If you look at its source, the only thing that makes it impossible to use in a wallpaper is the fact that it only starts the GLThread when attached to a window. So if you replicate the same code but allow the GLThread to be started manually, you can go ahead. After that you just need to keep track of how big your screen is versus the video after scaling it to the minimum scale that would fit, and shift the quad you draw onto accordingly.
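
For the "shift the quad" part, the math is just the minimum scale that fits plus the resulting quad size expressed as a fraction of the surface. A small sketch (my own helper, not the posted code):

// Returns the x/y factors to apply to a full-screen quad so the video is fit-centered.
fun fitCenterQuadScale(videoW: Float, videoH: Float,
                       surfaceW: Float, surfaceH: Float): Pair<Float, Float> {
    val scale = minOf(surfaceW / videoW, surfaceH / videoH)   // largest scale that still fits
    val quadScaleX = videoW * scale / surfaceW                // fraction of the surface width used
    val quadScaleY = videoH * scale / surfaceH                // fraction of the surface height used
    return quadScaleX to quadScaleY
}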

Known issues with the code:

  1. There's a small bug with the GLThread I haven't been able to fish out. It seems like there's a simple timing issue where, when the thread pauses, I get a call to signalAll() that's not actually waiting on anything.
  2. I didn't bother dynamically modifying the mode in the renderer. It shouldn't be too hard: add a preference listener when creating the Engine, then update the renderer when scale_type changes (a small sketch follows this list).
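
A hedged sketch of that listener, using android.preference.PreferenceManager (the "scale_type" key and renderer.setScaleType(...) are assumptions, not the repository's actual API). Keep the returned listener in a field, because SharedPreferences only holds listeners weakly:

// Inside the Engine: call from onCreate() and unregister the returned listener in onDestroy().
private fun registerScaleTypeListener(): SharedPreferences.OnSharedPreferenceChangeListener {
    val prefs = PreferenceManager.getDefaultSharedPreferences(this@MovieLiveWallpaperService)
    val listener = SharedPreferences.OnSharedPreferenceChangeListener { sp, key ->
        if (key == "scale_type") {
            renderer.setScaleType(sp.getString("scale_type", "fit_center"))  // renderer: the GL renderer instance
        }
    }
    prefs.registerOnSharedPreferenceChangeListener(listener)
    return listener
}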

UPDATE: All issues have been resolved. signalAll() was throwing because I missed a check that we actually hold the lock. I also added a listener to update the scale type dynamically, so now all scale types use the GlEngine.

ENJOY!

Solution 3 - Android

I found this article: How to set video as live wallpaper and keep video aspect ratio (width and height).

The above article has a simple source example; just click the "set wallpaper" button. If you want a full-featured app, see https://github.com/AlynxZhou/alynx-live-wallpaper

The key point is to use a GLSurfaceView instead of the WallpaperService's default SurfaceView, with a custom GLSurfaceView renderer. A GLSurfaceView can display content using OpenGL, so the question becomes "how to play video with a GLSurfaceView", or "how to play video with OpenGL".

How to use a GLSurfaceView instead of the WallpaperService's default SurfaceView:

public class GLWallpaperService extends WallpaperService {
...

    class GLWallpaperEngine extends Engine {
...

        private class GLWallpaperSurfaceView extends GLSurfaceView {
            @SuppressWarnings("unused")
            private static final String TAG = "GLWallpaperSurface";

            public GLWallpaperSurfaceView(Context context) {
                super(context);
            }

            /**
             * This is a hack. Because Android Live Wallpaper only has a Surface.
             * So we create a GLSurfaceView, and when drawing to its Surface,
             * we replace it with WallpaperEngine's Surface.
             */
            @Override
            public SurfaceHolder getHolder() {
                return getSurfaceHolder();
            }

            void onDestroy() {
                super.onDetachedFromWindow();
            }
        }
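
In Kotlin, the Engine side of that project boils down to something like the following sketch (simplified; VideoRenderer and glSurfaceView are placeholder names, see the linked repository for the real implementation):

// Inside the Engine of a Kotlin GLWallpaperService:
override fun onCreate(surfaceHolder: SurfaceHolder) {
    super.onCreate(surfaceHolder)
    glSurfaceView = GLWallpaperSurfaceView(this@GLWallpaperService).apply {
        setEGLContextClientVersion(2)                     // the renderer uses OpenGL ES 2.0
        setRenderer(VideoRenderer())                      // draws the video via an external texture
        renderMode = GLSurfaceView.RENDERMODE_WHEN_DIRTY  // redraw only when a new frame arrives
    }
}

override fun onVisibilityChanged(visible: Boolean) {
    super.onVisibilityChanged(visible)
    if (visible) glSurfaceView?.onResume() else glSurfaceView?.onPause()
}

Because getHolder() is overridden to return the wallpaper engine's SurfaceHolder, the GLSurfaceView's render thread ends up drawing directly onto the wallpaper surface.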

Solution 4 - Android

My solution is to use a GIF (same size and FPS as the video) instead of a video in the live wallpaper.

See my answer: https://stackoverflow.com/a/60425717/6011193 - a WallpaperService can fit a GIF well.
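
One common way to implement a GIF live wallpaper is to decode the GIF with android.graphics.Movie (deprecated on newer API levels in favour of AnimatedImageDrawable) and draw it frame by frame onto the wallpaper's own Canvas. A hedged sketch of the center-crop drawing step (the raw resource name and the redraw scheduling are assumptions, not the linked answer's code):

// Inside the Engine; uses android.graphics.Movie and android.os.SystemClock.
private val movie: Movie = requireNotNull(Movie.decodeStream(resources.openRawResource(R.raw.wallpaper_gif)))

private fun drawGifFrame(holder: SurfaceHolder) {
    val canvas = holder.lockCanvas() ?: return
    try {
        // Center-crop: scale so the GIF covers the whole canvas, then center it.
        val scale = maxOf(canvas.width.toFloat() / movie.width(),
                          canvas.height.toFloat() / movie.height())
        val duration = if (movie.duration() > 0) movie.duration() else 1000
        movie.setTime((SystemClock.uptimeMillis() % duration).toInt())
        canvas.save()
        canvas.scale(scale, scale)
        movie.draw(canvas,
                (canvas.width / scale - movie.width()) / 2f,    // center horizontally
                (canvas.height / scale - movie.height()) / 2f)  // center vertically
        canvas.restore()
    } finally {
        holder.unlockCanvasAndPost(canvas)
    }
    // Re-post this via a Handler (e.g. every 40 ms) while the wallpaper is visible.
}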

You can convert the video to a GIF on a computer with ffmpeg.
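
For example, a typical conversion (the frame rate and width here are just an illustration; match them to your source video):

ffmpeg -i input.mp4 -vf "fps=15,scale=560:-1" output.gif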

Or, on Android, the video can be converted to a GIF in code: see https://stackoverflow.com/a/16749143/6011193

Solution 5 - Android

You can use Glide for GIF and image loading, and it gives you scaling options as you'd like, based on its documentation at https://bumptech.github.io/glide/doc/targets.html#sizes-and-dimensions and https://futurestud.io/tutorials/glide-image-resizing-scaling.

Glide v4 requires Android Ice Cream Sandwich (API level 14) or higher.

For example:

public static void loadCircularImageGlide(String imagePath, ImageView view) {
    // Note: this uses Glide v3-style chaining; in Glide v4, asGif() is called on the
    // RequestManager before load() (see the v4 sketch below).
    Glide.with(view.getContext())
            .load(imagePath)
            .asGif()
            .override(600, 200) // resizes the image to these dimensions (in pixels); override() does not respect the aspect ratio
            .error(R.drawable.create_timeline_placeholder)
            .fitCenter() // scaling option
            .transform(new CircularTransformation(view.getContext())) // you can even apply an image transformation
            .into(view);
}
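
Since the snippet above uses Glide v3-style chaining, a roughly equivalent Glide v4 call would look like the sketch below (the custom CircularTransformation is omitted; the placeholder drawable name is kept from the snippet):

import android.widget.ImageView
import com.bumptech.glide.Glide
import com.bumptech.glide.request.RequestOptions

// Glide v4: asGif() is selected on the RequestManager, before load().
fun loadGifFitCenter(imagePath: String, view: ImageView) {
    Glide.with(view.context)
            .asGif()
            .load(imagePath)
            .apply(RequestOptions()
                    .override(600, 200)   // target dimensions in pixels (same values as the v3 snippet)
                    .fitCenter()          // scale to fit inside the target, keeping the aspect ratio
                    .error(R.drawable.create_timeline_placeholder))
            .into(view)
}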

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | android developer | View Question on Stackoverflow
Solution 1 - Android | Maraj Hussain | View Answer on Stackoverflow
Solution 2 - Android | TheHebrewHammer | View Answer on Stackoverflow
Solution 3 - Android | chikadance | View Answer on Stackoverflow
Solution 4 - Android | chikadance | View Answer on Stackoverflow
Solution 5 - Android | Mayur Patel | View Answer on Stackoverflow