The setup: a Sony A8G TV (circa 2019, running an ancient Android 9 system that Sony doesn't seem to be updating), an M1 MacBook Air (circa 2021, running Ventura 13.4.1), and a no-name Thunderbolt-to-HDMI adapter.

With my previous Intel Mac, I could plug all of this together and everything worked. I'd never bothered trying it with the shiny new M1 MacBook Air until today. The screen would connect for a few seconds and then disconnect. I tried a variety of configurations: different HDMI cables, passing through my AV receiver or not. Same problem. I also tried an M2 MacBook Air. Same problem.

Also of note: this TV seems to be too old to speak Apple AirPlay, so it doesn't show up for any sort of wireless sharing. (Yes, I "authorized" my laptop. That didn't help.)
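
One way to sanity-check the AirPlay side is to look for the `_airplay._tcp` Bonjour service on the network, which an AirPlay-capable TV should advertise. Here's a minimal sketch, assuming Python 3 on the Mac and using the stock macOS `dns-sd` tool; the 10-second timeout is an arbitrary choice:

```python
# Rough sketch: browse the LAN for AirPlay receivers using the stock
# macOS dns-sd tool. dns-sd keeps browsing until it is interrupted,
# so we stop it ourselves after the timeout.
import subprocess

try:
    subprocess.run(["dns-sd", "-B", "_airplay._tcp", "local"], timeout=10)
except subprocess.TimeoutExpired:
    pass  # expected: dns-sd never exits on its own
```

If the TV supported AirPlay, it should show up in that list within a second or two.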

My workaround was to use the Chrome browser to "cast" the screen. This worked, but it's a kludge, and for some reason it doesn't go full screen. You get a black border around whatever you're casting.

I'm assuming that the root problem here is HDCP shenanigans, but I don't know how to determine this definitively. I've read many, many web pages, and the best advice seems to be to buy an HDMI splitter (1 in, 2 out), typically $15–$25 on Amazon. These apparently filter out the HDCP requests, and then things magically start working again. Again, this seems like a kludge, and I suspect it will fail in some important way when I want something that "just works".
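
For what it's worth, I haven't found a way to query HDCP state directly from macOS, but you can at least capture what macOS thinks of the display during the few seconds it stays connected. A minimal sketch, assuming Python 3, that just shells out to the stock `system_profiler` tool:

```python
# Rough sketch: dump macOS's view of all attached displays while the
# TV is (briefly) connected. SPDisplaysDataType lists each external
# display with its resolution and connection details.
import subprocess

result = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)
```

Running that while the picture is up, and again after it drops, should at least tell you whether macOS is losing the display entirely or just blanking the output.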

The goal:

Once in a while, I like to use my TV as if it were a computer monitor. Maybe that's to show or practice a PowerPoint talk. Maybe I just want to sit on the couch and work on the giant screen as if it were a regular computer monitor.

So, am I missing something, or is this fundamentally going to be a pain to resolve?