Frames per second influences (Live TV)

bagz

Newbie
What's the biggest influence on FPS for IPTV live TV streams? VOD isn't a problem and looks spot on!!

Is it the IPTV provider's line?
The app being used? (I use TiviMate Premium)
The channel being watched? (I find US channels offer 50fps, whereas most UK ones are 25fps)
Internet provider? (Sly with VPN - decent speeds, no buffering etc.)
The device being used? (I use a Firestick 4K)

Or something else?

I appreciate IPTV is a fraction of the cost of other TV packages, so I accept it is what it is. But what's everyone else's experience of FPS?
 
You can only get 50fps if the stream source provides it. Pretty much all broadcasters in the UK use 25fps.

The USA uses 30fps and 60fps. You can watch most sports in 50fps on various other channels.

Internet speed and device have little to do with it.
 
It's not all about FPS. Sure, on an original broadcast you could say it is, and that's correct, but with IPTV it's also about the compression and codec used, which affect the picture quality compared with the original source, regardless of FPS.
 
I'm using a Firestick and always had to select 1080 in the display settings instead of Auto to stop the picture jumping on certain channels due to frame rates. I don't know if Amazon have made some changes recently, as the picture started stuttering again on 1080. I tried everything, even going down to 25Hz. Then I set it to Auto with match frame rate on, and it's all good. I don't know why it's changed.
 
Is it true that most broadcasters in the UK use 25fps? BBC, ITV, Channel 4/5, etc.?

Is this true for your standard Sky channels as well?

I'm asking because I'm with two providers at the moment; one streams these channels at 50fps, the other at 25fps.

If these channels are only broadcast at 25fps how does the supplier provide them at 50? What impact does it have?

Thanks for any advice! Apologies for bumping an old thread
 
It's not necessarily 25 frames per second in the way most people think of the streams.

It often means 25/50 fields per second, which is just the way interlaced streams are layered.

You can't have what is not broadcast; it's as simple as that, in a nutshell.
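To illustrate the fields-vs-frames point above: in interlaced PAL broadcasts, two consecutive fields (one carrying the odd scan lines, one the even) weave together into a single full frame, so "50" and "25" can describe the same UK stream. A minimal sketch of that arithmetic (toy numbers, just for illustration):

```python
# PAL interlaced broadcast: 50 fields per second.
FIELDS_PER_SECOND = 50

# Each full frame is woven from two consecutive fields
# (one with the odd scan lines, one with the even),
# so the effective full-frame rate is half the field rate.
frames_per_second = FIELDS_PER_SECOND // 2
print(frames_per_second)  # 25
```

So a channel reported as "50" may really be 50 fields/s, i.e. 25 interlaced frames/s.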

 
The UK and Ireland use PAL, which is 25 frames per second; if the source is 50fps, motion should in theory be smoother. The US uses NTSC, which is 29.97fps (though sometimes 30fps), so 60fps would in theory give a smoother picture. If your TV or STB doesn't do frame rate switching to match the source, you will get a juddery picture, as it's sending (or displaying, in the case of the TV) an NTSC (~30fps) source at a PAL (25fps) frame rate, so some frames get lost. TiviMate and IMPlayer will tell you on the info screen what fps the stream you are watching is in. UK and Irish users should set their device to 50fps: a 25fps stream will just display each frame twice (unnoticeable) and a 50fps stream will display properly, so either way playback will be as smooth as possible.

On my Shield and Chromecast I've set the frame rate on the box to 50fps and playback is fine on all UK and Irish streams. I use Kodi to play back downloaded stuff, which is often NTSC; however, Kodi matches the source frame rate and sends the correct (~30fps) signal to the TV, so playback is smooth (the TV auto-switches to the NTSC frame rate).
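The judder described above comes from an uneven repeat cadence. A rough sketch, assuming an idealised display that shows the nearest available source frame at each refresh (the function name and model are mine, not from any player): a 25fps source on a 50Hz panel repeats every frame exactly twice, while a 30fps source on the same panel repeats frames unevenly, which the eye sees as judder in pans.

```python
from collections import Counter

def repeat_pattern(source_fps, display_hz, seconds=1):
    """Count how many refreshes each source frame stays on screen."""
    shown = [int(n * source_fps / display_hz)       # which source frame is
             for n in range(display_hz * seconds)]  # visible at refresh n
    return Counter(shown)

# 25fps on 50Hz: every frame held for exactly 2 refreshes (smooth).
print(sorted(repeat_pattern(25, 50).values()))
# 30fps on 50Hz: a mix of 1- and 2-refresh holds (uneven cadence, judder).
print(sorted(repeat_pattern(30, 50).values()))
```

That uneven 1/2/1/2 hold pattern is exactly what frame rate matching (on the Firestick, Shield, or in Kodi) avoids by switching the display to the source rate.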
 
Appreciate the detailed responses so far, but I'm still struggling with the core of my question.

What do UK channels (ITV, BBC, etc.) broadcast in, for the most part: 50fps or 25fps?

To expand on this: if we're watching the exact same channel, the exact same show, at the exact same time, but one provider's stream is 25fps and the other's is 50fps, what's the actual difference? Is the provider at 25fps dropping frames? Is the provider at 50fps duplicating frames?
 
It's all broadcast at 25fps; however, IPTV re-encodes the original stream, which is why most of the time it's not as sharp as a broadcast received directly via aerial or satellite dish.
My provider has a World Cup folder, and the same channel in there has far better quality than in the "entertainment" folder despite both being 1080p feeds; he's clearly prioritising bandwidth for the soccer viewers. Similarly, I've noticed quality differences in the real 4K/UHD feeds (as opposed to ones labelled that while actually being 1080p); IMPlayer also reports higher bandwidth for the higher-quality feeds. Bottom line: in theory, if it's being re-encoded at 50fps, that should be the higher-quality feed, as twice as much info/frames are coming down the line. At least that's my understanding of it, though I'm no expert. If you're using an Android box like a Shield or Chromecast, set its refresh rate to 50fps (the default is usually 59.94, which won't display PAL properly, especially noticeable in action scenes or where the camera is panning, like during a soccer match).
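On the earlier question of how a provider can offer a 25fps broadcast at 50fps: one plausible answer (my assumption about the re-encode, not something confirmed in the thread) is that the encoder simply emits each source frame twice. A minimal sketch of that idea, with hypothetical names:

```python
def duplicate_to_50fps(frames_25):
    """Naive 25fps -> 50fps conversion by repeating each frame."""
    out = []
    for f in frames_25:
        out.extend([f, f])  # each source frame emitted twice
    return out

src = ["f0", "f1", "f2"]  # stand-ins for three 25fps source frames
print(duplicate_to_50fps(src))  # ['f0', 'f0', 'f1', 'f1', 'f2', 'f2']
```

If that's what's happening, the 50fps stream carries no new picture information over the 25fps one; motion is no smoother, and any quality difference would come from the bitrate the encoder spends, not the frame rate label.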
 