[PATCH v4 62/78] drm/vc4: hdmi: Adjust HSM clock rate depending on pixel rate

Chanwoo Choi cw00.choi at samsung.com
Tue Sep 1 06:48:41 EDT 2020


Hi Maxime,

On 9/1/20 6:45 PM, Maxime Ripard wrote:
> Hi Chanwoo,
> 
> On Tue, Sep 01, 2020 at 01:36:17PM +0900, Chanwoo Choi wrote:
>> On 7/9/20 2:42 AM, Maxime Ripard wrote:
>>> The HSM clock needs to be set up at around 101% of the pixel rate. This
>>> was done previously by setting the clock rate to 163.7MHz at probe time
>>> and only checking in mode_valid whether the mode pixel clock plus 1% was
>>> under that rate or not.
>>>
>>> However, with 4k we need a frequency higher than 163.7MHz, and yet we
>>> want to keep the clock as low as possible to get decent power savings.
>>>
>>> Let's change that logic a bit by setting the HSM clock rate based on
>>> the pixel rate at encoder_enable time. This works for the BCM2711,
>>> which supports 4k resolutions and has a clock that can provide it, but
>>> we still have to take care of a 4k panel plugged into a BCM283x SoC
>>> that wouldn't be able to use those modes, so let's define the limit in
>>> the variant.
>>>
>>> Signed-off-by: Maxime Ripard <maxime@cerno.tech>
>>> ---
>>>  drivers/gpu/drm/vc4/vc4_hdmi.c | 79 ++++++++++++++++-------------------
>>>  drivers/gpu/drm/vc4/vc4_hdmi.h |  3 +-
>>>  2 files changed, 41 insertions(+), 41 deletions(-)
>>>
>>> diff --git a/drivers/gpu/drm/vc4/vc4_hdmi.c b/drivers/gpu/drm/vc4/vc4_hdmi.c
>>> index 17797b14cde4..9f30fab744f2 100644
>>> --- a/drivers/gpu/drm/vc4/vc4_hdmi.c
>>> +++ b/drivers/gpu/drm/vc4/vc4_hdmi.c
>>> @@ -53,7 +53,6 @@
>>>  #include "vc4_hdmi_regs.h"
>>>  #include "vc4_regs.h"
>>>  
>>> -#define HSM_CLOCK_FREQ 163682864
>>>  #define CEC_CLOCK_FREQ 40000
>>>  
>>>  static int vc4_hdmi_debugfs_regs(struct seq_file *m, void *unused)
>>> @@ -326,6 +325,7 @@ static void vc4_hdmi_encoder_disable(struct drm_encoder *encoder)
>>>  	HDMI_WRITE(HDMI_VID_CTL,
>>>  		   HDMI_READ(HDMI_VID_CTL) & ~VC4_HD_VID_CTL_ENABLE);
>>>  
>>> +	clk_disable_unprepare(vc4_hdmi->hsm_clock);
>>>  	clk_disable_unprepare(vc4_hdmi->pixel_clock);
>>>  
>>>  	ret = pm_runtime_put(&vc4_hdmi->pdev->dev);
>>> @@ -423,6 +423,7 @@ static void vc4_hdmi_encoder_enable(struct drm_encoder *encoder)
>>>  	struct vc4_hdmi *vc4_hdmi = encoder_to_vc4_hdmi(encoder);
>>>  	struct vc4_hdmi_encoder *vc4_encoder = to_vc4_hdmi_encoder(encoder);
>>>  	bool debug_dump_regs = false;
>>> +	unsigned long pixel_rate, hsm_rate;
>>>  	int ret;
>>>  
>>>  	ret = pm_runtime_get_sync(&vc4_hdmi->pdev->dev);
>>> @@ -431,9 +432,8 @@ static void vc4_hdmi_encoder_enable(struct drm_encoder *encoder)
>>>  		return;
>>>  	}
>>>  
>>> -	ret = clk_set_rate(vc4_hdmi->pixel_clock,
>>> -			   mode->clock * 1000 *
>>> -			   ((mode->flags & DRM_MODE_FLAG_DBLCLK) ? 2 : 1));
>>> +	pixel_rate = mode->clock * 1000 * ((mode->flags & DRM_MODE_FLAG_DBLCLK) ? 2 : 1);
>>> +	ret = clk_set_rate(vc4_hdmi->pixel_clock, pixel_rate);
>>>  	if (ret) {
>>>  		DRM_ERROR("Failed to set pixel clock rate: %d\n", ret);
>>>  		return;
>>> @@ -445,6 +445,36 @@ static void vc4_hdmi_encoder_enable(struct drm_encoder *encoder)
>>>  		return;
>>>  	}
>>>  
>>> +	/*
>>> +	 * As stated in RPi's vc4 firmware "HDMI state machine (HSM) clock must
>>> +	 * be faster than pixel clock, infinitesimally faster, tested in
>>> +	 * simulation. Otherwise, exact value is unimportant for HDMI
>>> +	 * operation." This conflicts with bcm2835's vc4 documentation, which
>>> +	 * states HSM's clock has to be at least 108% of the pixel clock.
>>> +	 *
>>> +	 * Real life tests reveal that vc4's firmware statement holds up, and
>>> +	 * users are able to use pixel clocks closer to HSM's, namely for
>>> +	 * 1920x1200@60Hz. So it was decided to leave a 1% margin between
>>> +	 * both clocks, which, for RPi0-3, implies a maximum pixel clock of
>>> +	 * 162MHz.
>>> +	 *
>>> +	 * Additionally, the AXI clock needs to be at least 25% of
>>> +	 * pixel clock, but HSM ends up being the limiting factor.
>>> +	 */
>>> +	hsm_rate = max_t(unsigned long, 120000000, (pixel_rate / 100) * 101);
>>> +	ret = clk_set_rate(vc4_hdmi->hsm_clock, hsm_rate);
>>> +	if (ret) {
>>> +		DRM_ERROR("Failed to set HSM clock rate: %d\n", ret);
>>> +		return;
>>> +	}
>>> +
>>> +	ret = clk_prepare_enable(vc4_hdmi->hsm_clock);
>>> +	if (ret) {
>>> +		DRM_ERROR("Failed to turn on HSM clock: %d\n", ret);
>>> +		clk_disable_unprepare(vc4_hdmi->pixel_clock);
>>> +		return;
>>> +	}
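
As an aside, I assume the per-variant limit mentioned in the commit message
ends up being enforced in mode_valid() roughly like the sketch below;
"max_pixel_clock" and the "variant" pointer are assumed names here, since
the vc4_hdmi.h hunk isn't quoted above:

	static enum drm_mode_status
	vc4_hdmi_encoder_mode_valid(struct drm_encoder *encoder,
				    const struct drm_display_mode *mode)
	{
		struct vc4_hdmi *vc4_hdmi = encoder_to_vc4_hdmi(encoder);
		/* mode->clock is in kHz; double it for DBLCLK modes */
		unsigned long long rate = mode->clock * 1000ULL *
			((mode->flags & DRM_MODE_FLAG_DBLCLK) ? 2 : 1);

		if (rate > vc4_hdmi->variant->max_pixel_clock)
			return MODE_CLOCK_HIGH;

		return MODE_OK;
	}
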
>>
>> Regarding the vc4_hdmi->hsm_clock instance: usually we need to enable the
>> clock with clk_prepare_enable() first and only then touch it with calls
>> like clk_set_rate(). I think the clock needs to be enabled before calling
>> clk_set_rate().
>>
>> When I tested this patchset, it worked well, but I think that's only
>> because vc4_hdmi->hsm_clock was already enabled somewhere else.
> 
> There's no clear rule here on the ordering (at least none enforced by the
> framework). There are clocks that need to be disabled to change their rate
> (CLK_SET_RATE_GATE) and some that need to be enabled to change their
> rate (CLK_SET_RATE_UNGATE).
> 
> Generally speaking, it seems more logical to me to change the rate first
> and then enable the clock, since that won't create any "hiccup", but I
> could very well see the opposite being preferred.

If it doesn't cause any problem on the h/w, I agree. Thanks for the explanation.
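
For reference, a rough illustration of what you describe, on the provider
side; the names below are made up, only the flags and the clk_init_data
fields are the real clk framework ones:

	#include <linux/clk-provider.h>

	/* Placeholder ops for the example; a real provider fills these in. */
	static const struct clk_ops example_hsm_ops;

	/*
	 * A clock registered with CLK_SET_RATE_GATE must be gated while its
	 * rate is changed; one registered with CLK_SET_RATE_UNGATE must be
	 * running. A consumer like vc4_hdmi only sees a struct clk handle,
	 * so the framework can't enforce a single enable/set_rate ordering.
	 */
	static const struct clk_init_data example_hsm_init = {
		.name	= "example-hsm",
		.ops	= &example_hsm_ops,
		.flags	= CLK_SET_RATE_GATE,	/* or CLK_SET_RATE_UNGATE */
	};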

-- 
Best Regards,
Chanwoo Choi
Samsung Electronics


