My hub is set to auto-adjust the HDMI output and auto-detect the frame rate.
But when I check the actual rate on my TV, there is a discrepancy between the file's resolution and the resolution the hub actually plays.
For example:
I tried an MKV H.264 (avc1) video that is true 1080p at 30 fps (checked with VLC on my PC). The hub plays it at 720p.
I tried an MKV file in 720p, and the hub played it at 1080p/24.
I tried the sample files included with the hub, and they are played at 1080p/24… (I don't know the actual resolution of these files.)
I tried a 624x352 Xvid file at 24 fps, and it was played at 1080p/24.
A 608x352 DivX file at 25 fps is played at 720p.
I don't really understand how the hub is handling this.
Should I change the HDMI output to a fixed value? If yes, which value would be best for all files? My TV is 1080p/24 compatible but does not have a strong upscaling engine (LG 50PK550 plasma).