When I connected my RPi board to a TV via the HDMI port, the display extended outside the visible area of the screen. That meant I couldn't see the edges of the display, including the command line prompt and the LXDE toolbar, so I had to guess at what I was typing in order to log on and start up.
The fix is to create the file /boot/config.txt, containing the lines
overscan_top=50
overscan_bottom=50
overscan_left=50
overscan_right=50
(50 pixels is a little large for my display, but good enough.)
Can I suggest that the pre-compiled distros contain this file, so that when the RPi finally arrives in people's hands, they can plug it into a TV and have it work? (If people are annoyed by the wasted space around the edge of a display, they can always reduce the overscans later.)
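For reference, here is a sketch of what the file might look like once tuned down — the values below are purely illustrative, every TV needs its own:
# /boot/config.txt - illustrative values only; shrink or grow per TV
overscan_top=16
overscan_bottom=16
overscan_left=24
overscan_right=24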
Re: Fitting the display on a TV: config.txt and overscan_*
If your TV has a special mode for playing video games then try that. I think that in that mode the overscan is switched off.
Martyn.
Re: Fitting the display on a TV: config.txt and overscan_*
I've seen up to 10% (5% on either side) overscan on typical TVs. But it can be worse: some TVs try to detect any borders in the video image and stretch the image to remove them (with a tiny amount of overscan). This can produce funky results when a device is displaying only a login prompt.
You might be able to turn all these features off, but not all TVs allow that.
To fix this you would need to display an animated icon in each of the four corners and keep any important content well away from the borders of the screen. And yes... that's ugly.
Re: Fitting the display on a TV: config.txt and overscan_*
Martyn said:
If your TV has a special mode for playing video games then try that. I think that in that mode the overscan is switched off.
Martyn.
Or search AVSforum for tips and tricks.
All TVs are different, and they have different ways to turn off overscan.
On mine, you switch it to "zoom mode" and turn it all the way down. [So it'll actually zoom out in this case.]
Re: Fitting the display on a TV: config.txt and overscan_*
I don't understand: with digital output, if the screen has the native resolution of the output, shouldn't it give you one-to-one pixel matching?
i.e. fill the screen automatically, like a DVI monitor or a laptop.
Re: Fitting the display on a TV: config.txt and overscan_*
arm2 said:
I don't understand: with digital output, if the screen has the native resolution of the output, shouldn't it give you one-to-one pixel matching?
i.e. fill the screen automatically, like a DVI monitor or a laptop.
You'd certainly hope so! Sadly, this is not always the case...
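If the TV does offer a true 1:1 / "just scan" mode, the compensation on the Pi side can be switched off entirely — a minimal sketch, assuming the firmware build supports the disable_overscan option:
# /boot/config.txt - assuming the TV is set to a 1:1 / "just scan" mode
disable_overscan=1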
Re: Fitting the display on a TV: config.txt and overscan_*
The intent with pixels used in video processing is that they are NOT individually 'visible' or isolatable - they are predictable from the values of adjacent samples.
[This is also the basis of JPEG compression, which was designed for camera-generated, sampled images]
'Unfortunately', those in the 'computer industry' don't seem to understand sampling theory, and assume 'square pixels' and that any pixel is unrelated to its neighbours ... this cannot occur with a correctly sampled camera output!
(Note that RISC OS computers did/do still offer anti-aliased text - which gives the effect of being able to sub-pixel position characters on the screen - long before Mickysoft attempted it - allowing quality judgement in DTP layout programs)
However, 'ignoring' anti-aliasing makes the processing so much easier for games writers etc. ... hence the need to synchronise display pixels with the dot-clock to avoid moiré effects, for computer displays.
However - the resolution of a screen in pixels may vary considerably from the input format it is sent. E.g. an HD 1080-line picture could be generated from 1920 pixels across, or perhaps 1440 from an HDV camera, but viewed physically on an 800x480 panel, or a 1200-line panel.
As a former broadcast engineer and current RISC OS user, my preference is NOT to see individual pixels, but correctly anti-aliased displays with sub-pixel positioning!
[Sub-pixel positioning requires the use of 'grey-scales' - i.e. intermediate values between fully on and fully off - unlike the binary RGB nature of a Teletext display, or non-RISC OS (i.e. inferior) systems, where characters can only be moved in whole-pixel quantities.] The computer industry's response is to use ever higher resolution displays, resulting in smaller and smaller text, and more and more memory required for updating: true bloatware.
Re: Fitting the display on a TV: config.txt and overscan_*
Phil Spiegel said:
The computer industry's response is to use ever higher resolution displays, resulting in smaller and smaller text, ...
Presumably this is why the text output from a Raspberry Pi looks so small when viewed on a large(ish) computer monitor?
Re: Fitting the display on a TV: config.txt and overscan_*
Montala said:
Presumably this is why the text output from a Raspberry Pi looks so small when viewed on a large(ish) computer monitor?
I take it, you've seen an actual Pi, then? I noticed that when running the Pi-emulator (QEMU), the console output is the tiniest print/font I've ever seen. I assume, then, that this is that to which you refer? Anyone know, specifically, why it is so small?
Re: Fitting the display on a TV: config.txt and overscan_*
arm2 said:
I don't understand: with digital output, if the screen has the native resolution of the output, shouldn't it give you one-to-one pixel matching?
i.e. fill the screen automatically, like a DVI monitor or a laptop.
That's one of the differences between a TV and a monitor. Even with a digital signal (from a set-top box?) the TV broadcast can contain a border, and TV sets do better to cut off the overscan area instead of showing ugly borders.
With HDTV the borders are disappearing, but studios know the overscan area is traditionally removed by TV sets, so they keep anything important out of this area anyway.
So your TV set is perfectly correct in removing the borders.
Re: Fitting the display on a TV: config.txt and overscan_*
With HDTV the borders are disappearing, but studios know the overscan area is traditionally removed by TV sets, so they keep anything important out of this area anyway.
Watch BBC Breakfast any morning (need to double-check now they're at the Salford Media Centre) – but you'll see blanking coming and going as they select different sources – I suspect some sources may still be being ARC'ed into (almost) widescreen, whilst others fill the screen.
The vertical drop, and showing either timecode or the widescreen-switching signal, is a different effect … frequently seen in the past with news editing on Betacams and the playout machine still being locked to its input (as when editing) … and therefore the output delayed by a couple of lines - but still seen each day on Breakfast/opt-outs.
In the mixed world of 4x3 (12x9) and 16x9 there is also '14x9', viewed on either 16x9 or 12x9 screens, let alone all the times that '4x3' channels show widescreen 16x9 (or 14x9 copies?) in differing ratios, including letterboxed, which are then viewed on 16x9 receivers. Within any/all of those are 'standards' for 'safe action' and 'safe caption' areas. Adverts also have a requirement of a minimum line-height for 'disclaimer' text etc, I believe … (11 scan lines???)
And that's ignoring the 'Smart' (shades of 1984?) and other distorting modes used by TV manufacturers to fill the screen. Earlier someone mentioned the 'Auto' mode, as used in a Philips TV we had some years ago – it could, if allowed, auto-change after measuring the blanking at the top and bottom of the screen … about 20-30 seconds after returning from the adverts, it would 'correct' itself … when playing back a 21x9 cinema film, it would try to zoom it up to fill the screen, and off VHS (a long time ago) it looked a Very Horrible System.
Historical note: although 16x9 should be considered the 'new standard', and 4x3 its predecessor, TV pictures used to be 5x4 in some monochrome days … but since sets were all analogue in those days, the height control was a fairly common user control.
TEXT SIZE ON PI: The default, I believe, is a standard 8x8 matrix, with a possible Linux font option of 8x14; therefore on the default 1080-line display this will be small! And the video is 'unknown' – we don't even know if it's a proper interlaced 625-line display (unlikely), or 2 non-interlaced fields as typical on most home computers (except the BBC Model B's Mode 7 teletext, which was interlaced).
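As a rough worked example (assuming a 1920x1080 framebuffer and the standard 8x8 matrix mentioned above):
1920 / 8 = 240 character columns
1080 / 8 = 135 character rows
That is three times the columns and over five times the rows of the familiar 80x25 text screen, which goes a long way towards explaining why the console text looks so tiny.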
Re: Fitting the display on a TV: config.txt and overscan_*
Joe Schmoe said:
Montala said:
Presumably this is why the text output from a Raspberry Pi looks so small when viewed on a large(ish) computer monitor?
I take it, you've seen an actual Pi, then? I noticed that when running the Pi-emulator (QEMU), the console output is the tiniest print/font I've ever seen. I assume, then, that this is that to which you refer? Anyone know, specifically, why it is so small?
I have, yes… at "The Gadget Show" in Birmingham (U.K.) a couple of days ago.
The Samsung monitor screen was full of text where people had tried to log in, and yes, the text was very small indeed!
Edit: Not being an expert on this sort of thing, I would imagine that the text would be "sharper", when displayed on a 19" or 23" computer monitor (fairly common sizes) as opposed to (say) a fairly modern (16:9) 19" TFT TV screen… or would it?
Re: Fitting the display on a TV: config.txt and overscan_*
(I was just running it under QEMU on a Windows laptop - and it was incredibly teeny tiny)
Re: Fitting the display on a TV: config.txt and overscan_*
Joe Schmoe said:
I take it, you've seen an actual Pi, then? I noticed that when running the Pi-emulator (QEMU),
QEMU is NOT a Pi emulator. It allows us to emulate a system with the same CPU as the Pi and the same amount of RAM as the Pi, but you cannot rely on anything else being the same.
the console output is the tiniest print/font I've ever seen. I assume, then, that this is that to which you refer? Anyone know, specifically, why it is so small?
Generally, ARM boards don't tend to support VGA text mode, so the kernel is configured to use a framebuffer console. Running Linux with a framebuffer console (you CAN do this on a PC too, but it's not generally the default setup) on a high-resolution framebuffer tends to look a bit weird, because people aren't expecting "text mode" to have text that size (though the text is actually comparable in size to text in most GUIs I've used).
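If you want to check what you've actually ended up with, a quick sketch (assuming a Linux console with sysfs mounted and the framebuffer on fb0):
cat /sys/class/graphics/fb0/virtual_size   # framebuffer resolution, e.g. 1920,1080
stty size                                  # rows and columns of the current console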
Re: Fitting the display on a TV: config.txt and overscan_*
Joe Schmoe said:
(I was just running it under QEMU on a Windows laptop - and it was incredibly teeny tiny)
I dunno how QEMU handles display output but I wonder if it was scaling down the image produced by the emulated system to fit in the window you gave it.
Re: Fitting the display on a TV: config.txt and overscan_*
Joe Schmoe said:
(I was just running it under QEMU on a Windows laptop - and it was incredibly teeny tiny)
Depending on the kernel you provide to qemu (!) your startup font may be either 8x16 pixels giving 80x30 characters, or 8x8 pixels giving 80x60. The latter is not only very small but also out of proportion and hard to read. "setfont Lat15-VGA14" will give something more sensible.
At 1080p on a real Pi you may want to try "setfont Lat15-Terminus32x16". See /usr/share/consolefonts for other options.
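To make the choice stick across reboots, one crude but widely used approach (a sketch, assuming the distro still executes /etc/rc.local at boot) is:
ls /usr/share/consolefonts                 # see which fonts are installed
# then add the setfont call to /etc/rc.local, before the final "exit 0":
setfont Lat15-Terminus32x16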
Re: Fitting the display on a TV: config.txt and overscan_*
redman684 said:
That's one of the differences between a TV and a monitor. Even with a digital signal (from a set-top box?) the TV broadcast can contain a border, and TV sets do better to cut off the overscan area instead of showing ugly borders.
Ugly borders! Well, I'd much rather have black borders on the left and right than the awful distortion of stretching to fit the screen. One widescreen TV I had had a mode that automatically gave me a 1:1 display, so all modes had round circles, AND it zoomed in a 14:9 signal to fill horizontally with a bit off the top and bottom, and part-zoomed a 4:3 signal similarly but with half the left/right black border. When buying a new TV a while back, I gave up trying to find another screen with the above sensible setting. The manufacturers' support lines I spoke to either couldn't grasp what I wanted (round circles, automatically) or had to admit their TVs couldn't do it :-(
Re: Fitting the display on a TV: config.txt and overscan_*
One of my monitors is basically a "digital TV" but without the tuner. This gives it a number of very bad habits.
The single worst one is that when connected to an Nvidia graphics card, it automatically applies overscan - even though the signal perfectly matches the native resolution, the display is set to not rescale, and so is the card. This results in a wide black mask around the edge of the desktop, obscuring the first column of icons and the taskbar, and the titlebar of open windows. WHY?
It does not do this with an ATI or AMD graphics card.
To fix the problem, I have to hack the driver so that it ignores the extended EDID attributes of the display and produces a "monitor" signal instead of a "TV" signal. I have to redo this every time the driver is installed, and for each such machine connected to it.
Another slightly less annoying problem is that for most 1920x1080 modes - as opposed to the native 1920x1200 - it will rescale the picture down slightly so that it does *not* fill the entire width of the display. Fortunately I can tell my graphics cards to convert all modes to rescaled and letterboxed 1920x1200, so I get pixel-accurate output.
I am increasingly convinced that the entire video industry is on crack.
Computers require pixel-perfect output. Advanced rendering algorithms explicitly assume that they have this, and even rely on the subpixel layout of the individual pixels, in order to enhance the perceived resolution of the displayed image *beyond* the physical resolution of the display. Any rescaling destroys that assumption and makes the image look blurry and sometimes even distorted.
Historically, TVs were set with relatively wide beam dispersion in order to hide imperfections in the transmission technology - not least interlacing. Algorithms which upscale old SD video have to take this into account. Fair enough.
But I fail to see why, with digital transmission and digital displays, rescaling is still considered the default or even ideal state of affairs just because it happens to involve a TV signal. If I have taken the trouble to obtain a 1080p signal and a 1920-wide display panel, why should I not see every single pixel I have paid for? And why should Bluray transfer technicians not be able to rely on such a system?
It's enough to make you go ARRRRRR!
The key to knowledge is not to rely on people to teach you it.
Re: Fitting the display on a TV: config.txt and overscan_*
To answer your 'computer point of view' misunderstanding about sampling....
The compression systems used to enable you to view a multiplicity of channels (whether MPEG2 or H264 etc) ALL REQUIRE that the images are 'photographic' - i.e. as seen by a camera - and that the intent is NOT to see individual pixels - because no single pixel is totally independent in value of its neighbours!!! i.e. there is some correlation between them - otherwise your sampling was too low and aliasing would result (or error correction would be impossible).
It is only 'computer' users - expecting to write to individual pixels, through an uncompressed wide-band communication stream in which errors are unexpected - that 'expect' to see each individual pixel: you're actually too close to your screen!
Re: Fitting the display on a TV: config.txt and overscan_*
Qemu probably defaults to the typical 8x6 pixel characters when outputting console text. I can imagine these look incredibly tiny on a 1080 pixel display.
Re: Fitting the display on a TV: config.txt and overscan_*
mahjongg said:
Qemu probably defaults to the typical 8x6 pixel characters when outputting console text. I can imagine these look incredibly tiny on a 1080 pixel display.
qemu-system-arm emulates a PL110 LCD driver, which does not have a hardware character generator. The font you get at boot is the one you chose to compile into your kernel. The font in use by the time getty runs is the one you configured your distro to load.
Presumably the situation on a Pi is the same; I do not imagine the VideoCore IV includes a character generator either.
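For the boot-time font, the choice is made when the kernel is built — a sketch of the relevant configuration options (names as in mainline Linux; which of them are set in any particular distro kernel is an assumption here):
CONFIG_FRAMEBUFFER_CONSOLE=y
CONFIG_FONTS=y
CONFIG_FONT_8x8=y     # the tiny font some kernels boot with
CONFIG_FONT_8x16=y    # the more readable 8x16 alternative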
Re: Fitting the display on a TV: config.txt and overscan_*
No; it lacks a BIOS to program a default font into the "video card" (GPU).
Such a function is probably done by the "binary blob" (start.arm) code.