Most certainly, and that applies whether you’re on a monitor, projector, portable projector, or your trusty TV. Up until HDMI 1.3, the saying “all HDMI cables are made equal” was more or less correct, but since HDMI 1.4 and the advent of 4K, the data bandwidth each cable supports makes a huge difference.
To place things in perspective, HDMI 1.3 can pass 10.2Gbps (gigabits per second) and doesn’t support 4K at all. That version of HDMI is now retro, being a product of the 2000s and the 1080p era. HDMI 1.4 carries the same 10.2Gbps bandwidth; it was designed as a quick-fix update to HDMI 1.3, adding support for 4K at 30Hz but no HDR. Then a massive step up arrived with HDMI 2.0, which nearly doubled bandwidth to 18Gbps. That allows for 4K 60Hz (or 60 frames per second) plus HDR metadata, which is why HDMI 2.0 was so effective in popularizing 4K HDR video and, importantly, 4K HDR gaming. While 4K 30Hz may be OK for some game genres, 4K 60Hz offers good performance even in the most reflex-based titles.
More recently, along came HDMI 2.1, the biggest development in HDMI history. This monster more than doubles bandwidth again, going up to 48Gbps. HDMI 2.1 supports 4K 120Hz and 8K 60Hz, so it’s very future-proof. The spec even adds auto low latency mode (ALLM) and variable refresh rate (VRR), two features aimed squarely at high-end gaming. HDMI 2.1 further supports the next generation of HDR, known as dynamic HDR. As you may guess, dynamic HDR adjusts image parameters on the fly rather than outputting a fixed HDR profile. Amazingly, HDMI 2.1 even supports 10K resolution at 24Hz for cinematic and television content.
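The jump between these bandwidth tiers is easy to sanity-check with a quick back-of-the-envelope calculation. The sketch below (in Python) counts only the raw pixel payload; real HDMI links also spend bandwidth on blanking intervals and line-coding overhead, so the true requirements are somewhat higher. Even so, it shows why 4K 60Hz outgrows a 10.2Gbps link but fits within 18Gbps, and why 4K 120Hz pushes into HDMI 2.1 territory:

```python
# Back-of-the-envelope uncompressed video bandwidth.
# Raw pixel payload only: actual HDMI signalling adds blanking intervals
# and line-coding overhead, so treat these figures as lower bounds.

def raw_gbps(width, height, fps, bits_per_channel=8, channels=3):
    """Raw pixel data rate in gigabits per second."""
    return width * height * fps * bits_per_channel * channels / 1e9

modes = {
    "4K 30Hz 8-bit":   raw_gbps(3840, 2160, 30),       # ~6.0 Gbps  -> fits HDMI 1.4's 10.2
    "4K 60Hz 8-bit":   raw_gbps(3840, 2160, 60),       # ~11.9 Gbps -> needs HDMI 2.0's 18
    "4K 120Hz 10-bit": raw_gbps(3840, 2160, 120, 10),  # ~29.9 Gbps -> needs HDMI 2.1's 48
}

for name, gbps in modes.items():
    print(f"{name}: {gbps:.1f} Gbps")
```

The pattern is clear: each resolution or refresh-rate step multiplies the data rate, which is exactly why each HDMI generation had to raise the bandwidth ceiling.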
So yes, HDMI bandwidth definitely makes a difference. The days of buying just any cable are long behind us.