[PATCH v4 3/8] arm64/runtime-const: Introduce runtime_const_mask_32()
K Prateek Nayak
kprateek.nayak at amd.com
Thu Apr 30 02:47:25 PDT 2026
Futex hash computation requires a mask operation on read-only-after-init
data that will be converted to a runtime constant in a subsequent
commit.
Introduce runtime_const_mask_32() to further optimize the mask operation
in the futex hash computation hot path. GCC generates a:

    movz w1, #lo16, lsl #0   // w1 = bits [15:0]
    movk w1, #hi16, lsl #16  // w1 = full 32-bit value
    and  w0, w0, w1          // w0 = w0 & w1

pattern to handle arbitrary 32-bit masks; the same approach was also
suggested by Claude and is implemented here. The final (__ret & val)
operation is intentionally placed outside the asm block to allow the
compiler to optimize it further where possible.
__runtime_fixup_ptr() already patches a "movz + movk, lsl #16"
sequence; the same logic is reused to patch this sequence in
__runtime_fixup_mask().
Assisted-by: Claude:claude-sonnet-4-5
Signed-off-by: K Prateek Nayak <kprateek.nayak at amd.com>
---
changelog v3..v4:
o Reverted to using __ret as the macro variable to prevent collisions
  with local variables at the callsite. (Sashiko)
o Separated out the & operation to prevent any confusion with operator
  precedence if "val" is an expression. (Sashiko)
---
arch/arm64/include/asm/runtime-const.h | 19 +++++++++++++++++++
1 file changed, 19 insertions(+)
diff --git a/arch/arm64/include/asm/runtime-const.h b/arch/arm64/include/asm/runtime-const.h
index 838145bc289d2..1db4faac8c373 100644
--- a/arch/arm64/include/asm/runtime-const.h
+++ b/arch/arm64/include/asm/runtime-const.h
@@ -36,6 +36,18 @@
:"r" (0u+(val))); \
__ret; })
+#define runtime_const_mask_32(val, sym) ({ \
+ unsigned long __ret; \
+ asm_inline("1:\t" \
+ "movz %w0, #0xcdef\n\t" \
+ "movk %w0, #0x89ab, lsl #16\n\t" \
+ ".pushsection runtime_mask_" #sym ",\"a\"\n\t" \
+ ".long 1b - .\n\t" \
+ ".popsection" \
+ :"=r" (__ret)); \
+ __ret &= val; /* Allow compiler to optimize & op. */ \
+ __ret; })
+
#define runtime_const_init(type, sym) do { \
extern s32 __start_runtime_##type##_##sym[]; \
extern s32 __stop_runtime_##type##_##sym[]; \
@@ -73,6 +85,13 @@ static inline void __runtime_fixup_shift(void *where, unsigned long val)
aarch64_insn_patch_text_nosync(p, insn);
}
+static inline void __runtime_fixup_mask(void *where, unsigned long val)
+{
+ __le32 *p = where;
+ __runtime_fixup_16(p, val);
+ __runtime_fixup_16(p+1, val >> 16);
+}
+
static inline void runtime_const_fixup(void (*fn)(void *, unsigned long),
unsigned long val, s32 *start, s32 *end)
{
--
2.34.1