[PATCH v2 0/4] media: meson: Add support for the Amlogic GE2D Accelerator Unit

Neil Armstrong narmstrong at baylibre.com
Thu Nov 5 03:46:20 EST 2020


Hi Linus,

On 05/11/2020 08:53, Linus Walleij wrote:
> Hi Neil,
> 
> this is just a drive-by question and I'm looping in Todd in the hopes for
> a discussion or clarification.
> 
> On Fri, Oct 30, 2020 at 3:37 PM Neil Armstrong <narmstrong at baylibre.com> wrote:
> 
>> The GE2D is a 2D accelerator with various features like configurable blitter
>> with alpha blending, frame rotation, scaling, format conversion and colorspace
>> conversion.
>>
>> The driver implements a Memory2Memory VB2 V4L2 streaming device permitting:
>> - 0, 90, 180, 270deg rotation
>> - horizontal/vertical flipping
>> - source cropping
>> - destination compositing
>> - 32bit/24bit/16bit format conversion
>>
>> This adds the support for the GE2D version found in the AXG SoCs Family.
> 
> We are starting to see a bunch of these really nicely abstracted blitters
> and other 2D-accelerators now.

The actual blitting functionality is limited to non-alpha blitting, since
no standard CIDs are available for this, but we could definitely try to find
common CIDs to describe the possible alpha-blending properties of these
2D accelerators.
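
For reference, rotation and flipping already map to existing standard controls,
so userspace can do something like this today (untested sketch, /dev/video0 is
just a placeholder for the GE2D M2M node):

#include <fcntl.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

static int set_ctrl(int fd, unsigned int id, int value)
{
	/* VIDIOC_S_CTRL with a standard control ID */
	struct v4l2_control ctrl = { .id = id, .value = value };

	return ioctl(fd, VIDIOC_S_CTRL, &ctrl);
}

int main(void)
{
	int fd = open("/dev/video0", O_RDWR);

	if (fd < 0)
		return 1;

	/* Rotation and flipping are covered by standard CIDs... */
	set_ctrl(fd, V4L2_CID_ROTATE, 90);
	set_ctrl(fd, V4L2_CID_HFLIP, 1);
	set_ctrl(fd, V4L2_CID_VFLIP, 0);

	/* ...but nothing standard exists yet to describe alpha blending. */
	return 0;
}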

> 
> Is stuff like Android going to pick up and use this to blit and blend
> generic buffers?

I'm not sure this is Google's plan right now, but it should be doable.

> 
> Or is this in essence a camera and/or video out accelerator thing?

No, it's really a blitter & scaler, rotator & format converter, like
the Samsung and Rockchip drivers, and somewhat like the Allwinner rotate driver.
Amlogic mainly uses it to copy frames being displayed for encoding,
or to convert frames from the HDMI RX on their TV SoCs.
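
To give an idea of what the blitter part looks like from userspace, source
cropping and destination compositing go through the standard selection API on
the two M2M queues. A rough sketch (rectangle values are arbitrary examples,
fd is the open M2M node):

#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/*
 * Crop a 128x128 region out of the source frame and composite it at
 * (64, 64) in the destination frame.
 */
static int setup_blit(int fd)
{
	struct v4l2_selection sel;

	memset(&sel, 0, sizeof(sel));
	sel.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
	sel.target = V4L2_SEL_TGT_CROP;
	sel.r.left = 0;
	sel.r.top = 0;
	sel.r.width = 128;
	sel.r.height = 128;
	if (ioctl(fd, VIDIOC_S_SELECTION, &sel))
		return -1;

	memset(&sel, 0, sizeof(sel));
	sel.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	sel.target = V4L2_SEL_TGT_COMPOSE;
	sel.r.left = 64;
	sel.r.top = 64;
	sel.r.width = 128;
	sel.r.height = 128;
	return ioctl(fd, VIDIOC_S_SELECTION, &sel);
}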

> 
> The placement of this driver in drivers/media makes me think that
> it is for cameras or video output, but the functionality is actually
> quite generic.

It's really a memory-to-memory driver, like the video codecs; it belongs to a
separate class from the camera & video output drivers.
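
Concretely, the same video node exposes an OUTPUT queue (source frames in) and
a CAPTURE queue (processed frames out). A rough, untested sketch of a format
conversion setup (formats and resolution are just examples):

#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

static int setup_conversion(int fd)
{
	struct v4l2_format fmt;

	/* OUTPUT queue: the source frames fed to the hardware */
	memset(&fmt, 0, sizeof(fmt));
	fmt.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
	fmt.fmt.pix.width = 1280;
	fmt.fmt.pix.height = 720;
	fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_XBGR32;
	if (ioctl(fd, VIDIOC_S_FMT, &fmt))
		return -1;

	/* CAPTURE queue: the converted frames written back to memory */
	memset(&fmt, 0, sizeof(fmt));
	fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	fmt.fmt.pix.width = 1280;
	fmt.fmt.pix.height = 720;
	fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_RGB565;
	return ioctl(fd, VIDIOC_S_FMT, &fmt);
}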

> 
> I've been half-guessing that userspace like Android actually mostly
> use GPUs to composit their graphics, but IIUC this can sometimes be
> used for 2D compositing, and when used will often be quicker and/or
> more energy efficient than using a GPU for the same task.

Well, drm-hwcomposer can already use the DRM universal planes and the virtual
writeback connector for compositing when they are available.
But this kind of driver can be really useful for display rotation, for example,
when the DRM driver doesn't support it.

Honestly, I don't understand the Android graphics stack well enough to answer
this question definitively, but if it can be used, this kind of driver is much
faster and much simpler than a GPU for simple blitting and rotation.
And since these devices support DMA-BUF, they can certainly be used in a modern graphics pipeline.
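
As an illustration of the DMA-BUF point, queueing a buffer that was allocated
elsewhere (e.g. exported by a DRM/GBM allocator) is just a matter of using
V4L2_MEMORY_DMABUF. A rough sketch, where dmabuf_fd comes from whatever
allocator the pipeline uses:

#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

static int queue_dmabuf(int fd, int dmabuf_fd)
{
	struct v4l2_requestbuffers req;
	struct v4l2_buffer buf;

	/* Ask for one DMABUF-backed buffer on the CAPTURE queue */
	memset(&req, 0, sizeof(req));
	req.count = 1;
	req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	req.memory = V4L2_MEMORY_DMABUF;
	if (ioctl(fd, VIDIOC_REQBUFS, &req))
		return -1;

	/* Queue the imported fd so the blitter writes straight into it */
	memset(&buf, 0, sizeof(buf));
	buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	buf.memory = V4L2_MEMORY_DMABUF;
	buf.index = 0;
	buf.m.fd = dmabuf_fd;
	return ioctl(fd, VIDIOC_QBUF, &buf);
}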

Maybe someone else could answer? Maybe drm-hwcomposer could be extended for that?

I know there is an open issue about this in GloDroid:
https://github.com/GloDroid/glodroid_manifest/issues/66

For the record, I use this driver to accelerate the LVGL flush-to-display path on the
AXG SoCs that lack a GPU, using DMA-BUF and DRM atomic modesetting with the DRM
LVGL display driver I submitted (and tweaked for V4L2 M2M):
https://github.com/lvgl/lv_drivers/blob/master/display/drm.c
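
Roughly, the flush callback ends up looking like this (a sketch only;
ge2d_m2m_blit() is a hypothetical wrapper around the QBUF/STREAMON/DQBUF
sequence shown above, not something that exists in lv_drivers):

#include "lvgl/lvgl.h"

/* Hypothetical wrapper around the V4L2 M2M queue/dequeue sequence */
int ge2d_m2m_blit(const lv_color_t *src, int x, int y, int w, int h);

static void ge2d_drm_flush(lv_disp_drv_t *drv, const lv_area_t *area,
			   lv_color_t *color_p)
{
	/* Let the GE2D blit the rendered area into the DRM framebuffer
	 * instead of copying it with the CPU. */
	ge2d_m2m_blit(color_p, area->x1, area->y1,
		      lv_area_get_width(area), lv_area_get_height(area));

	/* Tell LVGL the draw buffer can be reused */
	lv_disp_flush_ready(drv);
}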

Neil

> 
> Yours,
> Linus Walleij
> 



