Samsung 4K TV stuttering video

Code:
<Video targetContainer="mpegts" targetVCodec="mpeg2video" targetACodec="ac3" aBitrate="192" forceInheritance="true">
  <Matches container="*" vCodec="h264" profile="high_10"/>
  <Matches container="*" vCodec="h264" profile="high" levelGreaterThan="4.2"/>
  <Matches container="*" vCodec="h264" profile="main" levelGreaterThan="4.2"/>
  <Matches container="mp4" vCodec="mpeg4" aCodec="lpcm"/>
  <Matches container="mp4" vCodec="dvvideo"/>
  <Matches container="mp4" vCodec="mjpeg"/>
  <Matches container="matroska" vCodec="mjpeg"/>
</Video>
Using the H Series profile provided by Karnith (it didn't work originally, but it does now), I can stream from Crackle with no issues. The TV is plugged into my wired gigabit network, so there are no bandwidth issues. I can also watch Netflix and play a 4K video with no problem. I don't know how much bandwidth that uses, but I'm sure it's a lot.
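This isn't Serviio's actual matching code, just a small Python sketch of what the Matches rules above appear to ask for: anything they catch gets re-encoded to MPEG-2 video with AC-3 audio in an MPEG-TS container, and anything else should be direct-streamed. The sample inputs at the bottom are made up for illustration.

Code:
# Rough model of the profile above; not how Serviio evaluates rules internally.
# Each tuple: (container, video codec, required profile, level must exceed).
RULES = [
    ("*",        "h264",    "high_10", None),
    ("*",        "h264",    "high",    4.2),
    ("*",        "h264",    "main",    4.2),
    ("mp4",      "mpeg4",   None,      None),  # real rule also requires aCodec="lpcm"
    ("mp4",      "dvvideo", None,      None),
    ("mp4",      "mjpeg",   None,      None),
    ("matroska", "mjpeg",   None,      None),
]

def needs_transcode(container, vcodec, profile=None, level=None):
    """Return True if the stream would hit one of the Matches rules."""
    for rule_container, rule_vcodec, rule_profile, level_gt in RULES:
        if rule_container != "*" and rule_container != container:
            continue
        if rule_vcodec != vcodec:
            continue
        if rule_profile is not None and rule_profile != profile:
            continue
        if level_gt is not None and (level is None or level <= level_gt):
            continue
        return True
    return False

# Typical 1080p rip (H.264 High@4.1): no rule matches, so it direct-streams.
print(needs_transcode("matroska", "h264", "high", 4.1))     # False
# 10-bit encode (High 10): matched, so it gets transcoded to MPEG-2/AC-3.
print(needs_transcode("matroska", "h264", "high_10", 4.1))  # True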
If I connect to the same computer via Windows Media sharing rather than Serviio, the SD video stutter is reduced a little but still there. That kind of points to the TV itself, but why on earth would the HD video play fine? It has to take more power to process the HD content; the video bitrate is 3 to 4 times higher for the HD files, with a couple of them touching 50 Mbps.
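To pin down what's different about the stuttering SD files, something like the Python helper below can dump the codec, profile, level and bit rate of the first video stream so the SD and HD files can be compared side by side. It assumes ffprobe (part of FFmpeg) is on the PATH, and the file paths are just placeholders.

Code:
import json
import subprocess

def probe(path):
    """Report codec, profile, level and bit rate for the first video stream."""
    out = subprocess.run(
        ["ffprobe", "-v", "error",
         "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,profile,level,bit_rate",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)["streams"][0]

# Placeholder paths; point these at one stuttering SD file and one good HD file.
for f in [r"D:\Video\stuttering_sd_episode.mkv", r"D:\Video\fine_hd_movie.mkv"]:
    print(f, probe(f))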
Has any other Samsung user encountered such an issue, and is there perhaps a setting somewhere in the menus that might be having an effect on DLNA playback?

LG NANO85 4K TV, Samsung JU7100 4K TV, Sony BDP-S3500, Sharp 4K Roku TV, Insignia Roku TV, Roku Ultra, Premiere and Stick, Nvidia Shield, Yamaha RX-V583 AVR.
Primary server: AMD Ryzen 5 5600GT, 32 GB RAM, Windows 11 Pro, 22 TB hard drive space | Test server: Intel i5-6400, 16 GB RAM, Windows 10 Pro