[RFC V2 0/5] arm64: dts: imx8mm: Enable CSI and OV5640 Camera

Adam Ford aford173 at gmail.com
Sat Oct 23 13:34:51 PDT 2021


The imx8mm appears to have drivers for both its CSI bridge and MIPI CSI-2
receiver.  With both enabled, the imx8mm-evk and imx8mm-beacon boards should
be able to use an OV5640 camera.
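For reference, the driver enablement can also be applied on top of an
existing .config with the scripts/config helper (a sketch only; the series
itself edits arch/arm64/configs/defconfig, and VIDEO_OV5640 is assumed to be
the sensor driver symbol):

    # Enable the imx media (CSI bridge / MIPI CSI-2) driver and the OV5640
    # sensor driver, then resolve any new dependencies.
    # (Built-in here; building as modules with --module works the same way.)
    scripts/config --file .config --enable VIDEO_IMX_MEDIA --enable VIDEO_OV5640
    make olddefconfig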

The mipi-csi2 driver sets the clock frequency to 333 MHz, so the CSI1 clock
must be reparented to a faster source.  The downstream NXP kernel uses
IMX8MM_SYS_PLL2_1000M, so the device tree here does the same to match.
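A quick runtime sanity check is the common clock framework's debugfs summary
(assuming debugfs is mounted at /sys/kernel/debug); the csi1 clocks should
show a rate of roughly 333 MHz once the reparenting is in place:

    # clk_summary lists each clock indented under its parent, with the
    # current rate in the adjacent column.
    grep -i csi1 /sys/kernel/debug/clk/clk_summary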

With the CSI and mipi_csi2 nodes wired to an OV5640 camera in the device
tree, the media pipeline link can be configured with the following:

    media-ctl --links "'ov5640 1-003c':0->'imx7-mipi-csis.0':0[1]"

The camera and various nodes in the pipeline can be configured for UYVY:
    media-ctl -v -V "'ov5640 1-003c':0 [fmt:UYVY8_1X16/640x480 field:none]"
    media-ctl -v -V "'csi':0 [fmt:UYVY8_1X16/640x480 field:none]"

With that, the media pipeline looks like:


Media controller API version 5.15.0

Media device information
------------------------
driver          imx7-csi
model           imx-media
serial          
bus info        platform:32e20000.csi
hw revision     0x0
driver version  5.15.0

Device topology
- entity 1: csi (2 pads, 2 links)
            type V4L2 subdev subtype Unknown flags 0
            device node name /dev/v4l-subdev0
	pad0: Sink
		[fmt:UYVY8_1X16/640x480 field:none colorspace:srgb xfer:srgb ycbcr:601 quantization:lim-range]
		<- "imx7-mipi-csis.0":1 [ENABLED,IMMUTABLE]
	pad1: Source
		[fmt:UYVY8_1X16/640x480 field:none colorspace:srgb xfer:srgb ycbcr:601 quantization:lim-range]
		-> "csi capture":0 [ENABLED,IMMUTABLE]

- entity 4: csi capture (1 pad, 1 link)
            type Node subtype V4L flags 0
            device node name /dev/video0
	pad0: Sink
		<- "csi":1 [ENABLED,IMMUTABLE]

- entity 10: imx7-mipi-csis.0 (2 pads, 2 links)
             type V4L2 subdev subtype Unknown flags 0
             device node name /dev/v4l-subdev1
	pad0: Sink
		[fmt:UYVY8_1X16/640x480 field:none colorspace:smpte170m xfer:709 ycbcr:601 quantization:lim-range]
		<- "ov5640 1-003c":0 [ENABLED]
	pad1: Source
		[fmt:UYVY8_1X16/640x480 field:none colorspace:smpte170m xfer:709 ycbcr:601 quantization:lim-range]
		-> "csi":0 [ENABLED,IMMUTABLE]

- entity 15: ov5640 1-003c (1 pad, 1 link)
             type V4L2 subdev subtype Sensor flags 0
             device node name /dev/v4l-subdev2
	pad0: Source
		[fmt:UYVY8_1X16/640x480 at 1/30 field:none colorspace:srgb xfer:srgb ycbcr:601 quantization:full-range]
		-> "imx7-mipi-csis.0":0 [ENABLED]

Once configured, GStreamer can be used to capture a single frame and store it to a file:

gst-launch-1.0 -v v4l2src num-buffers=1 ! video/x-raw,format=UYVY,width=640,height=480,framerate=60/1 ! filesink location=test

Unfortunately, the video capture never appears to happen: no errors are
reported and no interrupts are recorded.

gst-launch-1.0 -v v4l2src num-buffers=1 ! video/x-raw,format=UYVY,width=640,height=480,framerate=60/1 ! filesink location=test
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to [  114.819632] v4l2_get_link_freq: Link frequency estimated using pixel rate: result might be inaccurate
PLAYING ...
New clock: GstSystem[  114.829203] v4l2_get_link_freq: Consider implementing support for V4L2_CID_LINK_FREQ in the transmitter driver
Clock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480, framerate=(fraction)60/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480, framerate=(fraction)60/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480, framerate=(fraction)60/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480, framerate=(fraction)60/1, interlace-mode=(string)progressive, colorimetry=(string)bt709

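One way to take GStreamer out of the picture (a debugging sketch, not
something tried above; the output filename is arbitrary) is to request a
single frame directly with v4l2-ctl:

    # Set the capture format, queue one mmap buffer and write it to a file;
    # if this also never completes, the stall is below the GStreamer layer.
    v4l2-ctl --device /dev/video0 \
             --set-fmt-video=width=640,height=480,pixelformat=UYVY \
             --stream-mmap --stream-count=1 --stream-to=frame.raw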

If anyone has any insight into what might be wrong, I'd appreciate the
feedback.  The series includes a device tree change that I believe applies to
the newer imx8mm-evk, but I do not have that hardware, so I cannot test it.

Adam Ford (5):
  arm64: dts: imx8mm: Add CSI nodes
  arm64: defconfig: Enable VIDEO_IMX_MEDIA
  arm64: dts: imx8mm-beacon: Enable OV5640 Camera
  arm64: defconfig: Enable OV5640
  arm64: dts: imx8mm-evk: Enable OV5640 Camera

 .../freescale/imx8mm-beacon-baseboard.dtsi    | 55 +++++++++++++++++++
 arch/arm64/boot/dts/freescale/imx8mm-evk.dtsi | 44 +++++++++++++++
 arch/arm64/boot/dts/freescale/imx8mm.dtsi     | 55 +++++++++++++++++++
 arch/arm64/configs/defconfig                  |  2 +
 4 files changed, 156 insertions(+)

-- 
2.25.1



