My first patch for Blender Game Engine
So I figured I should get myself a new challenge. I work for z25 and we are consolidating the software and tools we use. Blender is one of those beauties we completely embrace. However, for realtime usage Blender is not there yet. So why not help make it better?
First I need to get familiar with the source, so the first mission is to check out the repo and compile from source to see that Blender works before we make any changes.
I'm doing this on Ubuntu Lucid, which doesn't have python3.2 packages (later versions do, I presume). I found a PPA containing those packages, which I just added:
sudo add-apt-repository ppa:cheleb/blender-svn
So start by installing the dependencies:
#dependencies
sudo apt-get update; sudo apt-get install subversion build-essential gettext \
libxi-dev libsndfile1-dev \
libpng12-dev libfftw3-dev \
libopenexr-dev libopenjpeg-dev \
libopenal-dev libalut-dev libvorbis-dev \
libglu1-mesa-dev libsdl1.2-dev libfreetype6-dev \
libtiff4-dev libavdevice-dev \
libavformat-dev libavutil-dev libavcodec-dev libjack-dev \
libswscale-dev libx264-dev libmp3lame-dev python3.2-dev \
libspnav-dev cmake cmake-curses-gui
Then get all the Blender gear:
mkdir blender-svn
cd blender-svn
#blender source
svn co https://svn.blender.org/svnroot/bf-blender/trunk/blender
#precompiled libs
svn co https://svn.blender.org/svnroot/bf-blender/trunk/lib/linux64 lib/linux64
#compile
cd blender
make
After the compile, try to run your build:
cd ../build/linux/bin/
./blender
Crap! The build is without FFmpeg support. Fix this by editing ../build/linux/CMakeCache.txt and setting:
//Enable FFMPeg Support (http://ffmpeg.org)
WITH_CODEC_FFMPEG:BOOL=ON
Now run 'make' again.
OK, that seems to work just fine. Now I can safely edit the source knowing that everything that fails is my fault.
IDE
But before I start editing sources I would like to use an IDE. I'm pretty new to the C/C++ world, but since using openFrameworks I'm used to CodeBlocks, so I would like to use that.
Reading doc/build_systems/cmake.txt I came up with this:
cd ../
mkdir codeblocks
cd codeblocks
cmake -G "CodeBlocks - Unix Makefiles" ../blender/
ccmake ../blender/ #Configure your needs, look for ffmpeg here! (c - g to accept default)
#also pay attention to the FFMPEG setting in advanced mode you might need to add the path there
codeblocks Blender.cbp #This might take a while for codeblocks to load everything
Also check this for ffmpeg settings. I added:
FFMPEG:avformat;avcodec;avutil;avdevice;swscale;dirac_encoder;mp3lame;ogg;orc-0.4;schroedinger-1.0;theora;vorbis;vorbisenc;vpx;x264;xvidcore;faad;asound;jack
I don't really know why, but it works.
To build and run Blender, select the 'blender' target; by default it is set to 'all'. If it starts you'll notice it's completely crippled, because it cannot find all its extra files. You'll need to execute 'make install' before it'll work. What I've done is add the following command to the post-build phase in the build options:
make install
So it's executed every time your build finishes.
Phew, that wasn't too hard; even building Blender worked out of the box after some tweaking. So let's get to the hard work.
Note: Codeblocks (10.05 rev0) frequently freezes while working with the source. I reported this at the CB forum and it seems to be a bug that is already fixed in the source. See http://forums.codeblocks.org/index.php/topic,16149.new.html
What needs patching?
The Blender Game Engine (BGE) is able to use a webcam as a source for a texture; it uses FFmpeg to accomplish this. However, while using this feature I discovered that FFmpeg isn't used to its full potential: BGE is hardcoded to use vfwcap on Windows and video4linux or dv1394 on Linux. FFmpeg supports more formats, like:
- video4linux2
- bktr (BSD)
- dshow
- fbdev
- X11 grabbing
Even HTTP streaming could be supported (although, reading the source, that should already work). In theory the source should be modified so that two extra options can be added to the capture code (Python): we would essentially need to add a format option and an input option. Optionally we could add more options to support format-specific settings.
However since I'm new to this source let's start simple.
Line 585 in VideoFFmpeg.cpp reads:
void VideoFFmpeg::openCam (char * file, short camIdx)
{
	// open camera source
	AVInputFormat *inputFormat;
	AVFormatParameters formatParams;
	AVRational frameRate;
	char *p, filename[28], rateStr[20];

	do_init_ffmpeg();

	memset(&formatParams, 0, sizeof(formatParams));
#ifdef WIN32
	// video capture on windows only through Video For Windows driver
	inputFormat = av_find_input_format("vfwcap");
	if (!inputFormat)
		// Video For Windows not supported??
		return;
	sprintf(filename, "%d", camIdx);
#else
	// In Linux we support two types of devices: VideoForLinux and DV1394.
	// the user specify it with the filename:
	// [<device_type>][:<standard>]
	// <device_type> : 'v4l' for VideoForLinux, 'dv1394' for DV1394. By default 'v4l'
	// <standard>    : 'pal', 'secam' or 'ntsc'. By default 'ntsc'
	// The driver name is constructed automatically from the device type:
	// v4l   : /dev/video<camIdx>
	// dv1394: /dev/dv1394/<camIdx>
	// If you have different driver name, you can specify the driver name explicitly
	// instead of device type. Examples of valid filename:
	//    /dev/v4l/video0:pal
	//    /dev/ieee1394/1:ntsc
	//    dv1394:secam
	//    v4l:pal
	if (file && strstr(file, "1394") != NULL)
	{
		// the user specifies a driver, check if it is v4l or d41394
		inputFormat = av_find_input_format("dv1394");
		sprintf(filename, "/dev/dv1394/%d", camIdx);
	} else
	{
		inputFormat = av_find_input_format("video4linux");
		sprintf(filename, "/dev/video%d", camIdx);
	}
	if (!inputFormat)
		// these format should be supported, check ffmpeg compilation
		return;
	if (file && strncmp(file, "/dev", 4) == 0)
	{
		// user does not specify a driver
		strncpy(filename, file, sizeof(filename));
		filename[sizeof(filename)-1] = 0;
		if ((p = strchr(filename, ':')) != 0)
			*p = 0;
	}
	if (file && (p = strchr(file, ':')) != NULL)
		formatParams.standard = p+1;
#endif
	//frame rate
	if (m_captRate <= 0.f)
		m_captRate = defFrameRate;
	sprintf(rateStr, "%f", m_captRate);
	av_parse_video_rate(&frameRate, rateStr);
	// populate format parameters
	// need to specify the time base = inverse of rate
	formatParams.time_base.num = frameRate.den;
	formatParams.time_base.den = frameRate.num;
	formatParams.width = m_captWidth;
	formatParams.height = m_captHeight;

	if (openStream(filename, inputFormat, &formatParams) != 0)
		return;

	// for video capture it is important to do non blocking read
	m_formatCtx->flags |= AVFMT_FLAG_NONBLOCK;
	// open base class
	VideoBase::openCam(file, camIdx);
	// check if we should do multi-threading?
	if (BLI_system_thread_count() > 1)
	{
		// no need to thread if the system has a single core
		m_isThreaded = true;
	}
}
If you skim it quickly you'll see it's not the nicest way to determine the capture input, especially if you know FFmpeg a little.
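To make that branching easier to follow, here is the Linux side of the decision transcribed into Python. This is an illustration only; the function and variable names are mine, not BGE's or FFmpeg's:

```python
def resolve_capture_source(file, cam_idx):
    """Sketch of the Linux branch of VideoFFmpeg::openCam (illustrative names)."""
    # a filename containing "1394" selects the DV1394 input, anything else v4l
    if file and "1394" in file:
        fmt, filename = "dv1394", "/dev/dv1394/%d" % cam_idx
    else:
        fmt, filename = "video4linux", "/dev/video%d" % cam_idx
    # an explicit /dev path overrides the name constructed from cam_idx,
    # truncated at the optional ':' separator
    if file and file.startswith("/dev"):
        filename = file.split(":", 1)[0]
    # an optional ':<standard>' suffix selects pal/secam/ntsc
    standard = None
    if file and ":" in file:
        standard = file.split(":", 1)[1]
    return fmt, filename, standard
```

So "v4l:pal" and "dv1394:secam" pick the device node from camIdx, while "/dev/ieee1394/1:ntsc" uses the path as-is; in no case can you reach a format other than the two hardcoded ones.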
To test, I quickly changed line 625 to:
inputFormat = av_find_input_format("video4linux2");
And recompiled blender. I now have a working webcam. :)
We need some better code to select the webcam or camera. On the FFmpeg command line this works by selecting the format, i.e. video4linux2, and the input device. Optionally you set the width and height of the camera and the capture rate. For example:
ffmpeg -f video4linux2 -s vga -r 15 -i /dev/video0 capture.mp4
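To spell out how that command line maps onto the settings openCam prepares, here is a small hypothetical helper (the function name is mine, just for illustration):

```python
def capture_cmd(device, fmt, size, rate, outfile):
    """Build the ffmpeg command line equivalent to a capture setup (sketch)."""
    # device -> -i, input format -> -f, frame size -> -s, frame rate -> -r
    return ["ffmpeg", "-f", fmt, "-s", size, "-r", str(rate), "-i", device, outfile]
```

Each of the five pieces (device, format, size, rate, output) corresponds to one of the parameters the C++ code has to fill in before opening the stream.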
The VideoFFmpeg::openCam function just prepares all settings. It doesn't do any capturing itself.
It basically sets the input format:
inputFormat = av_find_input_format("video4linux");
and sets some format parameters:
formatParams.time_base.num = frameRate.den;
formatParams.time_base.den = frameRate.num;
formatParams.width = m_captWidth;
formatParams.height = m_captHeight;
Finally it passes a job to openStream:
if (openStream(filename, inputFormat, &formatParams) != 0)
return;
PS: what I found very weird is that openCam overwrites 'filename' with a string constructed from the camIdx variable. But that's a worry for later.
So there are five parameters:
1. device
2. input format
3. framerate
4. width
5. height
The python code controlling all this is:
bge.texture.VideoFFmpeg(file[, capture=-1, rate=25.0, width=0, height=0])
So in Python we have five parameters as well:
1. file (which is the device when capturing)
2. capture
3. rate
4. width
5. height
The 'capture' parameter is basically a switch that determines whether openCam or openFile should be used. It's later used to construct the filename parameter, which is IMHO a very weird construction.
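As a sketch, the switch works roughly like this (the stub class and method names are hypothetical, not the actual C++ code):

```python
class StubVideo:
    """Hypothetical stand-in for the C++ video source, for illustration only."""
    def __init__(self):
        self.opened = None

    def open_file(self, path):
        self.opened = ("file", path)

    def open_cam(self, device, cam_idx):
        self.opened = ("cam", device, cam_idx)


def video_open(video, file, capture=-1):
    # capture >= 0 means live capture; otherwise 'file' is a path or URL
    if capture >= 0:
        video.open_cam(file, capture)
    else:
        video.open_file(file)
```

With the default capture=-1 you get file/URL playback; any non-negative value routes to the camera path.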
Basically the best setup would be to use the filename and input format parameters together to determine the input. This is also how FFmpeg works. Let's see different ffmpeg input examples:
- video4linux2: ffmpeg -f video4linux2 -s vga -r 15 -i /dev/video0 capture.mp4 (tested)
- dshow: ffmpeg -f dshow -i video="Camera"
- dshow: ffmpeg -f dshow -video_device_number 1 -i video="Camera" (second camera, this won't work)
- vfwcap: ffmpeg -f vfwcap -i 1 (second camera)
- x11grab: ffmpeg -f x11grab -r 25 -s cif -i :0.0 (capture display 0.0)
In theory this should work as well: ffmpeg -i http://stream.some.site/webcam.mp4 (as I heard on IRC, http streaming already works in BGE; thanks Kassandra!)
If we just add an optional input-format parameter we can manage most setups. So I think the easiest for now is to expand the Python function like this:
bge.texture.VideoFFmpeg(file[, capture=-1, rate=25.0, width=0, height=0, capturefmt=""])
The VideoFFmpeg class will be expanded with a 'char * m_captFormat;' to hold the format, and 'initParams' gets an extra optional argument for it. We then have all the ingredients and only need to change the logic.
I'm not going to explain all the logic; you can see a diff of the VideoFFmpeg class here:
Index: VideoFFmpeg.h
===================================================================
--- VideoFFmpeg.h	(revision 44807)
+++ VideoFFmpeg.h	(working copy)
@@ -77,7 +77,7 @@
 	virtual ~VideoFFmpeg ();
 	/// set initial parameters
-	void initParams (short width, short height, float rate, bool image=false);
+	void initParams (short width, short height, float rate, bool image=false, char * format=NULL);
 	/// open video/image file
 	virtual void openFile (char * file);
 	/// open video capture device
@@ -150,6 +150,10 @@
 	/// frame rate of capture in frames per seconds
 	float m_captRate;
+	/// format short name of capture (dshow, vfwcap, video4linux2 etc)
+	/// see: http://ffmpeg.org/ffmpeg.html#Input-Devices
+	char * m_captFormat;
+
 	/// is file an image?
 	bool m_isImage;
Index: VideoFFmpeg.cpp
===================================================================
--- VideoFFmpeg.cpp	(revision 44807)
+++ VideoFFmpeg.cpp	(working copy)
@@ -62,8 +62,8 @@
 m_frame(NULL), m_frameDeinterlaced(NULL), m_frameRGB(NULL), m_imgConvertCtx(NULL),
 m_deinterlace(false), m_preseek(0), m_videoStream(-1), m_baseFrameRate(25.0),
 m_lastFrame(-1), m_eof(false), m_externTime(false), m_curPosition(-1), m_startTime(0),
-m_captWidth(0), m_captHeight(0), m_captRate(0.f), m_isImage(false),
-m_isThreaded(false), m_isStreaming(false), m_stopThread(false), m_cacheStarted(false)
+m_captWidth(0), m_captHeight(0), m_captRate(0.f), m_isImage(false), m_captFormat(NULL),
+m_isThreaded(false), m_isStreaming(false), m_stopThread(false), m_cacheStarted(false)
 {
 	// set video format
 	m_format = RGB24;
@@ -153,12 +153,13 @@
 }
 // set initial parameters
-void VideoFFmpeg::initParams (short width, short height, float rate, bool image)
+void VideoFFmpeg::initParams (short width, short height, float rate, bool image, char * format)
 {
 	m_captWidth = width;
 	m_captHeight = height;
 	m_captRate = rate;
 	m_isImage = image;
+	m_captFormat = format;
 }
@@ -593,52 +594,68 @@
 	do_init_ffmpeg();
 	memset(&formatParams, 0, sizeof(formatParams));
+
+	if (m_captFormat == NULL)
+	{
 #ifdef WIN32
-	// video capture on windows only through Video For Windows driver
-	inputFormat = av_find_input_format("vfwcap");
-	if (!inputFormat)
-		// Video For Windows not supported??
-		return;
-	sprintf(filename, "%d", camIdx);
-#else
-	// In Linux we support two types of devices: VideoForLinux and DV1394.
-	// the user specify it with the filename:
-	// [<device_type>][:<standard>]
-	// <device_type> : 'v4l' for VideoForLinux, 'dv1394' for DV1394. By default 'v4l'
-	// <standard>    : 'pal', 'secam' or 'ntsc'. By default 'ntsc'
-	// The driver name is constructed automatically from the device type:
-	// v4l   : /dev/video<camIdx>
-	// dv1394: /dev/dv1394/<camIdx>
-	// If you have different driver name, you can specify the driver name explicitly
-	// instead of device type. Examples of valid filename:
-	//    /dev/v4l/video0:pal
-	//    /dev/ieee1394/1:ntsc
-	//    dv1394:secam
-	//    v4l:pal
-	if (file && strstr(file, "1394") != NULL)
+		// video capture on windows only through Video For Windows driver
+		inputFormat = av_find_input_format("vfwcap");
+		if (!inputFormat)
+			// Video For Windows not supported??
+			return;
+		sprintf(filename, "%d", camIdx);
+#else
+		// In Linux we support two types of devices: VideoForLinux and DV1394.
+		// the user specify it with the filename:
+		// [<device_type>][:<standard>]
+		// <device_type> : 'v4l' for VideoForLinux, 'dv1394' for DV1394. By default 'v4l'
+		// <standard>    : 'pal', 'secam' or 'ntsc'. By default 'ntsc'
+		// The driver name is constructed automatically from the device type:
+		// v4l   : /dev/video<camIdx>
+		// dv1394: /dev/dv1394/<camIdx>
+		// If you have different driver name, you can specify the driver name explicitly
+		// instead of device type. Examples of valid filename:
+		//    /dev/v4l/video0:pal
+		//    /dev/ieee1394/1:ntsc
+		//    dv1394:secam
+		//    v4l:pal
+		if (file && strstr(file, "1394") != NULL)
+		{
+			// the user specifies a driver, check if it is v4l or d41394
+			inputFormat = av_find_input_format("dv1394");
+			sprintf(filename, "/dev/dv1394/%d", camIdx);
+		} else
+		{
+			inputFormat = av_find_input_format("video4linux");
+			sprintf(filename, "/dev/video%d", camIdx);
+		}
+		if (!inputFormat)
+			// these format should be supported, check ffmpeg compilation
+			return;
+		if (file && strncmp(file, "/dev", 4) == 0)
+		{
+			// user does not specify a driver
+			strncpy(filename, file, sizeof(filename));
+			filename[sizeof(filename)-1] = 0;
+			if ((p = strchr(filename, ':')) != 0)
+				*p = 0;
+		}
+		if (file && (p = strchr(file, ':')) != NULL)
+			formatParams.standard = p+1;
+#endif
+	} else
 	{
-		// the user specifies a driver, check if it is v4l or d41394
-		inputFormat = av_find_input_format("dv1394");
-		sprintf(filename, "/dev/dv1394/%d", camIdx);
-	} else
-	{
-		inputFormat = av_find_input_format("video4linux");
-		sprintf(filename, "/dev/video%d", camIdx);
+		inputFormat = av_find_input_format(m_captFormat);
+		strncpy(filename, file, sizeof(filename));
 	}
+
 	if (!inputFormat)
-		// these format should be supported, check ffmpeg compilation
+	{
+		// input format not supported??
+		printf("%s not supported\n", m_captFormat);
 		return;
-	if (file && strncmp(file, "/dev", 4) == 0)
-	{
-		// user does not specify a driver
-		strncpy(filename, file, sizeof(filename));
-		filename[sizeof(filename)-1] = 0;
-		if ((p = strchr(filename, ':')) != 0)
-			*p = 0;
 	}
-	if (file && (p = strchr(file, ':')) != NULL)
-		formatParams.standard = p+1;
-#endif
+
 	//frame rate
 	if (m_captRate <= 0.f)
 		m_captRate = defFrameRate;
@@ -1096,12 +1113,14 @@
 	short height = 0;
 	// capture rate, only if capt is >= 0
 	float rate = 25.f;
-	static const char *kwlist[] = {"file", "capture", "rate", "width", "height", NULL};
-
+	// capture format, only if capt is >= 0
+	char * capturefmt = NULL;
+
+	static const char *kwlist[] = {"file", "capture", "rate", "width", "height", "format", NULL};
 	// get parameters
-	if (!PyArg_ParseTupleAndKeywords(args, kwds, "s|hfhh",
-		const_cast<char**>(kwlist), &file, &capt, &rate, &width, &height))
+	if (!PyArg_ParseTupleAndKeywords(args, kwds, "s|hfhhs",
+		const_cast<char**>(kwlist), &file, &capt, &rate, &width, &height, &capturefmt))
 		return -1;
 	try
@@ -1110,7 +1129,7 @@
 		Video_init<VideoFFmpeg>(self);
 		// set thread usage
-		getVideoFFmpeg(self)->initParams(width, height, rate);
+		getVideoFFmpeg(self)->initParams(width, height, rate, false, capturefmt);
 		// open video source
 		Video_open(getVideo(self), file, capt);
So now let's see what this does.
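In short, the patched selection behaves like this Python sketch (the names are mine; the real logic is the C++ in the diff):

```python
def select_input(file, cam_idx, capt_format=None):
    """Sketch of the patched format/device selection (illustrative names)."""
    # with the patch: an explicit format wins, and `file` is passed straight through
    if capt_format is not None:
        return capt_format, file
    # otherwise fall back to the old hardcoded Linux heuristic
    if file and "1394" in file:
        return "dv1394", "/dev/dv1394/%d" % cam_idx
    return "video4linux", "/dev/video%d" % cam_idx
```

So passing "video4linux2" (or "x11grab", "dshow", ...) bypasses the hardcoded choices entirely, while omitting it keeps the old behaviour intact.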
The original code is still in place, but it always refused to work on my machine since it was hardcoded to use video4linux. In Blender I use the following code to test this:
import bge
from bge import texture
from bge import logic
cont = logic.getCurrentController()
obj = cont.owner
# the creation of the texture must be done once: saving the
# texture object in an attribute of the bge.logic module makes it persistent
if not 'video' in obj:
# identify a static texture by name
matID = texture.materialID(obj, 'IMRedellious.jpg')
# create a dynamic texture that will replace the static texture
obj['video'] = texture.Texture(obj, matID)
# define a source of image for the texture, here a movie
obj['video'].source = texture.VideoFFmpeg('/dev/video0', 0, 15, 320, 240)
obj['video'].source.scale = True
# kick off the movie, but it won't play in the background
obj['video'].source.play()
# you need to call this function every frame to ensure update of the texture.
obj['video'].refresh(True)
This is just code pasted from the documentation. As I said before it doesn't work, but if I had a camera with video4linux support it would. With my patch I can now change the VideoFFmpeg line like this:
obj['video'].source = texture.VideoFFmpeg('/dev/video0', 0, 15, 320, 240, "video4linux2")
That works like a charm. 8-)
In theory the following should work as well:
obj['video'].source = texture.VideoFFmpeg(':0.0', 0, 15, 320, 240, "x11grab")
But it doesn't. Reading lib/linux64/ffmpeg/Readme.txt I noticed the supplied ffmpeg is compiled without x11grab support:
--disable-x11grab
Too bad... but I could compile my own version of FFmpeg.
Other options could consist of:
#dshow
obj['video'].source = texture.VideoFFmpeg('video="Camera"', 0, 15, 320, 240, "dshow")
#video for windows
obj['video'].source = texture.VideoFFmpeg('0', 0, 15, 320, 240, "vfwcap")
# blackmagic linux (not sure about this)
obj['video'].source = texture.VideoFFmpeg('/dev/blackmagic/serial0', 0, 15, 320, 240, "rawvideo")
Anyway, enough for now. I think my first patch is ready.