#101 | TiagoTiago
I imagine the proper way of doing it for backwards compatibility would be to modify the video driver so that, when a program asks for a 16-bit screen mode, padding bits are added as the least significant bits of each color component to produce a 32-bit (or 24-bit, or whatever) image; programs would think they had set the display to 16 bits when in fact everything stays at the higher depth all the time. Unless, of course, it is actually easy and trouble-free to change the bit depth each time a program asks for a different one.
 

#102 | Bernard
When using OpenGL you can clearly see that the display depth is 16-bit: if you display a number of 32-bit textures with an alpha channel, there is a lot of color banding.

I would love to know how to set the bit depth to 32-bit when using a QGLWidget. But I think you would lose the ability to view a thumbnail of your application when multitasking, and the ability to see the alerts (like the yellow bar for the volume), since these are composited in 16-bit and I don't think you can mix the two.
 

#103 | javispedro
Originally Posted by Bernard
But I think you would lose the ability to view a thumbnail of your application when multitasking, and the ability to see the alerts (like the yellow bar for the volume), since these are composited in 16-bit and I don't think you can mix the two.
Oh, of course you can -- in fact, you can actually do 8+8+8+8 RGBA surfaces, and hildon-desktop will take proper care of the alpha values, up to the point that it will actually composite the background with both your actual window and the thumbnail.

I do not know how to do it with QGLWidget, though.

Last edited by javispedro; 2011-03-05 at 02:00.
 

#104 | Bernard
Originally Posted by javispedro
Oh, of course you can -- in fact, you can actually do 8+8+8+8 rgba surfaces and hildon desktop will take proper care of the alpha values up to the point it will actually composite the background with both your actual window and the thumbnail.

I do not know how to do it with QGLWidget, though.
I just found out how!!

And it is REALLY simple!
You can start your Qt application in 32-bit mode by adding the argument "-visual TrueColor".
Works perfectly, all the banding problems are GONE!!
This really should be known by anybody who is writing OpenGL games in Qt for Maemo.

http://doc.trolltech.com/latest/qapplication.html
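For reference, "-visual TrueColor" is one of the standard X11 command-line arguments that Qt 4's QApplication consumes itself, so no code changes are needed; the binary name below is just a placeholder:

```shell
# Qt 4 / X11: QApplication parses -visual itself before your code runs.
# "mygame" is a hypothetical binary name.
./mygame -visual TrueColor
```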
 

#105 | javispedro
The composited output is still 16bpp, and the banding is not gone (look closely).

Also, I suggest you don't do this, because it will be even slower.
 

#106 | Bernard
The composited output may be 16-bit, but my banding problem is definitely gone. It wasn't directly caused by the 16-bit output, but by the textures being loaded as 16-bit, which kills the alpha channel.

Just have a look at the blue nebula in the background:
http://dl.dropbox.com/u/6050252/16bitQGLN900.png
http://dl.dropbox.com/u/6050252/32bitQGLN900.png

So yes, it did solve my problem.
 

#107 | rsvr
Check the following link to get detailed specs about the N900:

http://pdadb.net/index.php?m=specs&i...00_nokia_rover

Hope this will be helpful to all...


 

#108 | shady
Hmm, I wonder if it can be applied to the kernel...? It has to be a battery issue...
 

#109 | demolition
I have a few questions about this...

It seems the screen is fixed to 18bpp and software is generally written for RGB565. Obviously a hardware limitation many devices are subject to. On the device that's fine, because the pixel density is quite high and everything generally looks quite good.

- I'm not clear on whether the frame buffer is similarly limited, or whether it can handle true color (or whatever it's called)? Indeed, is that what the much-touted specs really refer to: the SGX530 can render 16M colours, even if the screen can't?

- The reason I ask is: when an external screen, capable of displaying 24bpp, is used in lieu of the standard one, can the graphics hardware cope even if the built-in display cannot?

- Can data go straight from the frame buffer to the out channel, without going to/via the screen, thus no need to be interpolated down? (out channel could be any, not just the normal tv-out e.g. BT, wifi, etc).

- Can the frame buffer render different sizes/proportions to that of the screen? Again, if connected to a different screen, this would be ideal for running desktop applications. Not much more, say 1024 x 768. Though, the N900 might have to live in a bucket of dry ice to do this!

Incidentally, those green artifacts that appear on embedded video streams - do they occur because of the 888 - to - 565 conversion?
 

#110 | javispedro
Originally Posted by demolition
I have a few questions about this...

It seems the screen is fixed to 18bpp and software is generally written for RGB565. Obviously a hardware limitation many devices are subject to. On the device that's fine, because the pixel density is quite high and everything generally looks quite good.
Tbh, I'm still not sure what the screen depth is. The device software is configured for 16bpp, and possibly makes that assumption in some closed places no one knows about. But Hildon works at 24bpp; that's easily tested on a PC.
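Two quick command fragments for checking the configured depth yourself (standard X11/Linux tools, run on the device or a PC):

```shell
# Depth the X server actually configured for the root window:
xdpyinfo | grep -i depth

# Kernel framebuffer geometry and bits per pixel, if fbset is installed:
fbset -i
```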

Originally Posted by demolition
- I'm not clear on whether the frame buffer is similarly limited, or whether it can handle true color (or whatever it's called)? Indeed, is that what the much-touted specs really refer to: the SGX530 can render 16M colours, even if the screen can't?
Yes. Also, the Pre1 does 24bpp.

Originally Posted by demolition
- The reason I ask is: when an external screen, capable of displaying 24bpp, is used in lieu of the standard one, can the graphics hardware cope even if the built-in display cannot?
Well, no real idea here... it could probably be answered by a look at the schematics.

Originally Posted by demolition
- Can data go straight from the frame buffer to the out channel, without going to/via the screen, thus no need to be interpolated down? (The out channel could be any, not just the normal TV-out: e.g. BT, WiFi, etc.)
Yes -- though you're not transferring uncompressed 800x480 at 24bpp over WiFi. You will have to use some compression, like VNC or NX, in which case 24bpp is more of a nuisance...

Originally Posted by demolition
- Can the frame buffer render different sizes/proportions to that of the screen? Again, if connected to a different screen, this would be ideal for running desktop applications. Not much more, say 1024 x 768. Though, the N900 might have to live in a bucket of dry ice to do this!
Of course. And no bucket of ice needed. Remember, this is pretty much like a desktop GNU/Linux system, after all.

Originally Posted by demolition
Incidentally, those green artifacts that appear on embedded video streams - do they occur because of the 888-to-565 conversion?
Nah, video is not RGB888 but usually YUV.
 
