qole's Avatar
Moderator | Posts: 7,109 | Thanked: 8,820 times | Joined on Oct 2007 @ Vancouver, BC, Canada
#11
Originally Posted by daperl View Post
As is, it won't work on an n900, but the changes should be obvious.
If you make the "obvious" (to you, maybe! ) changes, I'll test it out for you and even time how long it takes to make the png. I'm curious to know how much faster the process is...
__________________
qole.org --- twitter --- Easy Debian wiki page
Please don't send me a private message, post to the appropriate thread.
Thank you all for your donations!
 
daperl's Avatar
Posts: 2,427 | Thanked: 2,986 times | Joined on Dec 2007
#12
Run this script and post the output. It won't save a picture; it should just print information about the captured buffer. Then I'll post back an n900 save solution while I work on a more general one. This is fun; I haven't flipped bits in a long while.

Code:
#! /usr/bin/env python

import platform
import gtk
import gst

class ShowMe:
    def __init__(self):
        window = gtk.Window(gtk.WINDOW_TOPLEVEL)
        window.set_title("Webcam-Viewer")
        window.connect("destroy", gtk.main_quit, "WM destroy")
        vbox = gtk.VBox()
        window.add(vbox)
        self.movie_window = gtk.DrawingArea()
        vbox.add(self.movie_window)
        hbox = gtk.HBox()
        vbox.pack_start(hbox, False)
        hbox.set_border_width(10)
        hbox.pack_start(gtk.Label())
        self.takePicture = 0
        self.button0 = gtk.Button("Snap")
        self.button0.connect("clicked", self.onTakePicture)
        hbox.pack_start(self.button0, False)
        self.button = gtk.Button("Start")
        self.button.connect("clicked", self.start_stop)
        hbox.pack_start(self.button, False)
        self.button2 = gtk.Button("Quit")
        self.button2.connect("clicked", self.exit)
        hbox.pack_start(self.button2, False)
        hbox.add(gtk.Label())
        window.show_all()
        self.machine = platform.uname()[4]

        if self.machine == 'armv6l':
            # Tablet (ARM): Maemo camera source, with explicit RGB565
            # caps on both sides of the colorspace converter.
            self.player = gst.Pipeline('ThePipe')
            src = gst.element_factory_make("gconfv4l2src", "src")
            self.player.add(src)
            for p in src.pads():
                print p.get_name()
            caps = gst.element_factory_make("capsfilter", "caps")
            caps.set_property('caps', gst.caps_from_string(
                'video/x-raw-rgb,width=352,height=288,bpp=16,depth=16,'
                'framerate=15/1'))
            self.player.add(caps)
            filt = gst.element_factory_make("ffmpegcolorspace", "filt")
            self.player.add(filt)
            caps2 = gst.element_factory_make("capsfilter", "caps2")
            caps2.set_property('caps', gst.caps_from_string(
                'video/x-raw-rgb,width=352,height=288,bpp=16,depth=16,'
                'framerate=15/1'))
            self.player.add(caps2)
            sink = gst.element_factory_make("autovideosink", "sink")
            self.player.add(sink)
            pad = src.get_pad('src')
            pad.add_buffer_probe(self.doBuffer)
            src.link(caps)
            caps.link(filt)
            filt.link(caps2)
            caps2.link(sink)
        else:
            # Desktop fallback: plain v4l2src straight to the sink.
            self.player = gst.Pipeline('ThePipe')
            src = gst.element_factory_make("v4l2src", "src")
            src.set_property('device', '/dev/video0')
            self.player.add(src)
            sink = gst.element_factory_make("autovideosink", "sink")
            self.player.add(sink)
            pad = src.get_pad('src')
            pad.add_buffer_probe(self.doBuffer)
            src.link(sink)

        bus = self.player.get_bus()
        bus.add_signal_watch()
        bus.enable_sync_message_emission()
        bus.connect("message", self.on_message)
        bus.connect("sync-message::element", self.on_sync_message)

    def onTakePicture(self, w):
        self.takePicture = 1

    def doBuffer(self, pad, buffer):
        if self.takePicture:
            self.takePicture = 0
            print 'buffer length =', len(buffer)
            caps = buffer.get_caps()
            struct = caps[0]
            print 'caps', caps
            for i in range(0, struct.n_fields()):
                fn = struct.nth_field_name(i)
                print '  ', fn, '=', struct[fn]
            # RGB565 channel masks: R=0xf800 (63488), G=0x07e0 (2016),
            # B=0x001f (31).
            # Diagnostics only for now: return early and skip the save
            # code below until the caps are confirmed.
            return True
            p = gtk.gdk.Pixbuf(gtk.gdk.COLORSPACE_RGB, False, 8, 352, 288)
            pa = p.get_pixels()
            pal = list(pa)
            # Unpack little-endian RGB565 into 8-bit RGB components.
            for i in range(0, len(buffer) / 2):
                pal[i*3] = "%c" % (0xf8 & ord(buffer[i*2+1]))
                pal[i*3+1] = "%c" % (((0x07 & ord(buffer[i*2+1])) << 5) |
                    ((0xe0 & ord(buffer[i*2])) >> 3))
                pal[i*3+2] = "%c" % ((0x1f & ord(buffer[i*2])) << 3)
            js = ''.join(pal)
            # rowstride = 352 pixels * 3 bytes = 1056
            pb = gtk.gdk.pixbuf_new_from_data(js, gtk.gdk.COLORSPACE_RGB,
                False, 8, 352, 288, 1056)
            pb.save('/home/user/MyDocs/.images/daperl00.png', 'png')
            print pb.get_width(), pb.get_height()
        return True

    def start_stop(self, w):
        if self.button.get_label() == "Start":
            self.button.set_label("Stop")
            self.player.set_state(gst.STATE_PLAYING)
        else:
            self.player.set_state(gst.STATE_NULL)
            self.button.set_label("Start")

    def exit(self, widget, data=None):
        gtk.main_quit()

    def on_message(self, bus, message):
        t = message.type
        if t == gst.MESSAGE_EOS:
            self.player.set_state(gst.STATE_NULL)
            self.button.set_label("Start")
        elif t == gst.MESSAGE_ERROR:
            err, debug = message.parse_error()
            print "Error: %s" % err, debug
            self.player.set_state(gst.STATE_NULL)
            self.button.set_label("Start")

    def on_sync_message(self, bus, message):
        if message.structure is None:
            return
        message_name = message.structure.get_name()
        if message_name == "prepare-xwindow-id":
            # Assign the viewport
            imagesink = message.src
            imagesink.set_property("force-aspect-ratio", True)
            imagesink.set_xwindow_id(self.movie_window.window.xid)

if __name__ == "__main__":
    gtk.gdk.threads_init()
    ShowMe()
    gtk.main()
__________________
N9: Go white or go home
 
daperl's Avatar
Posts: 2,427 | Thanked: 2,986 times | Joined on Dec 2007
#13
I've cut file creation down to under 10 seconds (from 13). It's just number crunching, so I'm guessing it should fly on the n900. Also, there should be other options, like the numpy module and GStreamer's jpegenc. jpegenc would be interesting 'cause it would force us to learn more about pipeline control flow. Which would be a good thing.

numpy could be just what the doctor ordered.
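The byte-at-a-time RGB565 unpacking above is exactly the kind of loop numpy is good at. A minimal sketch (the function name and layout are mine; it assumes the probe hands over a little-endian RGB565 buffer, which is what the n800 caps report):

```python
import numpy as np

def rgb565_to_rgb888(raw, width, height):
    """Expand a little-endian RGB565 byte buffer to packed RGB888."""
    pix = np.frombuffer(raw, dtype='<u2').reshape(height, width)
    rgb = np.empty((height, width, 3), dtype=np.uint8)
    rgb[..., 0] = ((pix >> 11) & 0x1f) << 3  # red:   5 bits -> 8
    rgb[..., 1] = ((pix >> 5) & 0x3f) << 2   # green: 6 bits -> 8
    rgb[..., 2] = (pix & 0x1f) << 3          # blue:  5 bits -> 8
    return rgb.tobytes()
```

The result could go straight into gtk.gdk.pixbuf_new_from_data with a rowstride of width*3.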
Attached Files
File Type: txt pygscam.txt (7.3 KB, 192 views)

Last edited by daperl; 2009-09-19 at 23:57.
 
qgil's Avatar
Posts: 3,105 | Thanked: 11,088 times | Joined on Jul 2007 @ Mountain View (CA, USA)
#14
If you want to go beyond http://wiki.maemo.org/Documentation/...mera_API_Usage then I recommend you ask on maemo-developers, where some of our Multimedia developers are following along.
 

daperl's Avatar
Posts: 2,427 | Thanked: 2,986 times | Joined on Dec 2007
#15
Code:
/* Initialize the GStreamer pipeline. Below is a diagram
 * of the pipeline that will be created:
 * 
 *                             |Screen|  |Screen|
 *                           ->|queue |->|sink  |-> Display
 * |Camera|  |CSP   |  |Tee|/
 * |src   |->|Filter|->|   |\  |Image|   |Image |  |Image|
 *                           ->|queue|-> |filter|->|sink |-> JPEG file
 */
So, this is the pipeline from example_camera.c. I basically implemented this in the attached Python text file, with the addition of scaling down the displayed dimensions (nearest-neighbour). This cuts file creation time in half, but as I feared, it also unnecessarily sucks on the CPU.
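For quick experiments, the same branch layout can be prototyped from the shell before writing any Python. A rough gst-launch equivalent (element choices and the output filename are illustrative, not the exact ones from example_camera.c):

```shell
gst-launch-0.10 v4l2src ! ffmpegcolorspace ! tee name=t \
    t. ! queue ! autovideosink \
    t. ! queue ! ffmpegcolorspace ! jpegenc ! filesink location=snap.jpg
```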

I quickly tried modifying things so that the image queue path doesn't run until a snapshot is requested, but I haven't had luck there yet. So, even though a file isn't created until the button's clicked, the Image queue and the Image filter are processing every frame. On my n800, that's 640x480 @ 15 fps. For the n900, that would be more like 2592x1944 @ 25 fps. According to the specs, the supplied video recorder does 848x480 @ 25 fps with unknown sound quality. Pretty good, and I'm guessing the supplied camera app doesn't use the above pipeline. But if it does, which I doubt, there seems to be room for improvement.

The problem here is that you really only want to be pushing the decimated view finder pixels until it's picture time, so I really just want some hardware decimated camera buffer. Instead, with a pipeline like this, I'm using the CPU to decimate the feed. Currently, I'm using a view window of 320x240. From the n900 demos, it looks like they're using close to the whole 800x480. And since I haven't figured out how to no-op the Image queue until photo request, I have two unnecessary loads.

My next two steps are to see if I can change the Camera src capabilities on-the-fly, and also see if I can dynamically link and unlink the Image branch from the tee when taking a picture. The first might need a pipeline start-and-stop, but the second might just need a simple switch that I haven't found yet.
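One candidate for that "simple switch" (a sketch on my part, not something from the attached file): gate the Image branch with a buffer probe. In the 0.10 Python bindings, returning False from an add_buffer_probe callback drops the buffer, so the Image queue and filter would sit idle until the Snap button sets a flag:

```python
class SnapshotGate(object):
    """Buffer-probe callback that forwards exactly one frame per request."""

    def __init__(self):
        self.pending = False

    def request(self):
        # Called from the Snap button handler.
        self.pending = True

    def __call__(self, pad, buf):
        if self.pending:
            self.pending = False
            return True   # pass this one frame on to the image branch
        return False      # drop: the image branch does no work
```

Attaching it would look like image_queue.get_pad('sink').add_buffer_probe(gate), where image_queue is whatever the Image queue element is named; I haven't timed this on the device.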

But most importantly, are the camera and video apps open source?
Attached Files
File Type: txt pygscam.txt (11.7 KB, 187 views)
 

daperl's Avatar
Posts: 2,427 | Thanked: 2,986 times | Joined on Dec 2007
#16
Attached is a hard-coded proof of concept that works on my n800. Look at it closely 'cause you'll probably have to change a few things to get it working on the n900. Including, but not limited to, anything that says:

Code:
if self.machine == 'armv6l':


More info about what I've been doing can be found here.
Attached Files
File Type: txt pygscam.txt (11.7 KB, 364 views)
 
Posts: 182 | Thanked: 540 times | Joined on Aug 2009 @ Finland
#17
I tried several Vala GStreamer samples from http://live.gnome.org/Vala/GStreamerSample -- the last one works well on the N900 when v4l2camsrc is used instead of videotestsrc.

While this is not Python, you can easily see how to use GStreamer correctly, and I can confirm that this approach works well on the device with both cameras (front and rear).
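For anyone who wants to try the same source from the command line first, something like the following should show the viewfinder (the caps are my guess, not values taken from the Vala sample):

```shell
# N900: the rear camera is reportedly /dev/video0, the front /dev/video1.
gst-launch-0.10 v4l2camsrc device=/dev/video0 ! \
    video/x-raw-yuv,width=640,height=480,framerate=25/1 ! autovideosink
```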
 

qole's Avatar
Moderator | Posts: 7,109 | Thanked: 8,820 times | Joined on Oct 2007 @ Vancouver, BC, Canada
#18
Just a note, guys. The original post (from the thread that this post was originally in before sjgadsby unceremoniously dumped everything in the right thread) is talking about Diablo and the tablets, not Fremantle and the N900...

I know that's where everyone's head is these days, but read closely...

Last edited by qole; 2009-09-21 at 20:22.
 
daperl's Avatar
Posts: 2,427 | Thanked: 2,986 times | Joined on Dec 2007
#19
Originally Posted by qole View Post
Just a note, guys. The original post is talking about Diablo and the tablets, not Fremantle and the N900...

I know that's where everyone's head is these days, but read closely...
Well, he should fire my sh*t up then. And where is my output from you? Did cut-and-paste stop working on your n900?
 
Posts: 182 | Thanked: 540 times | Joined on Aug 2009 @ Finland
#20
I wish I had looked at the original post's timestamp.
 


Tags
camera, fremantle, gstreamer, maemo 5


 