Raspberry Pi, Soft-emu vs Hard float. [Updated]

I have had the chance to build LuxRender on two distributions: the official Debian build, which uses software-emulated floating point, and Raspbian, which uses hardware floating point and is compiled for the specific ARM chip in the Raspberry Pi.

For numerically intensive work, Raspbian should drastically improve performance, so how does it fare with Lux in practice?

I had two Raspberry Pis running Debian and one running Raspbian. In terms of peak samples per second, they break down as follows:
Debian 160 S/s
Raspbian 400 S/s
Raspbian 850 S/s in turbo mode…!!!!

This suggests around a 2.5x speed increase. After approximately one week of running, the total samples per pixel accumulated by two of the Pis are as follows.
Debian 107 S/p
Raspbian  257 S/p

This backs up the above, giving about a 2.4x speed increase. Merging the two flm files together, we get this result.
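The ratios quoted are easy to check from the raw numbers above with a quick awk one-liner:

```shell
# Speedup of hard-float Raspbian over soft-float Debian, from the figures above
awk 'BEGIN {
    printf "peak S/s: %.2fx\n", 400 / 160;
    printf "S/p week: %.2fx\n", 257 / 107;
}'
```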

For comparison, one thread of a 2.56 GHz Core 2 Duo on my Mac manages about 16,000 S/s, so the Raspberry Pi is indeed quite slow. But hey, anyone expecting it to be enormously speedy for stuff like this should have spent more time reading the FAQs.

A classroom of Raspberry Pis could make a nice lesson about cluster and parallel computing with LuxRender 🙂

| 1 Comment

LuxRender on the Raspberry Pi

[UPDATE] 15 April 2013 – Due to some drastic code changes in Lux, and the developers' recent work on making SLG an active render engine, it is likely that these instructions will not work for the latest versions of the code.
Version 1.1 of both LuxRays and LuxRender will compile, however. To get these, once you have cloned the repositories, use the following commands:
>hg update v11
>hg update luxrender_v1.1
Many thanks to Carlos for running through these instructions and informing me of the code break.



So, from start to finish, this is how to get both LuxRays and LuxRender working on the Raspberry Pi. You should note that performance is very, very slow; however, for simple scenes it can still produce nice results, and if a bunch of them were networked together into a cluster… dare I say 'bush'? … a classroom could give the same output as a slow modern PC. (The ARM is not optimised for multiple simultaneous operations, so it suffers a lot compared to a modern x86.)


1) Open a terminal and install *some* of the dependencies of LuxRender. Some of the packaged versions are too old, so we will have to build those ourselves.

>sudo apt-get install mercurial build-essential bison flex libopenexr-dev libtiff4-dev libpng12-dev freeglut3-dev qt4-dev-tools libxmu-dev libxi-dev libfreeimage-dev libbz2-dev

2) Make a dev directory to keep your home area tidy, and cd into it

>mkdir dev
>cd dev

3) Download, uncompress and compile cmake, as the one which installs through apt-get is too old.

>wget http://www.cmake.org/files/v2.8/cmake-2.8.7.tar.gz
>tar zxvf cmake-2.8.7.tar.gz
>cd cmake-2.8.7
>./bootstrap
>make
>sudo make install

4) Download and build Python 3.2, once again because the version that comes with the distribution is too old… ****This isn't strictly required, as it is only needed for pylux, but I'm including it because it works.

>wget http://www.python.org/ftp/python/3.2.2/Python-3.2.2.tar.bz2
>tar jxvf Python-3.2.2.tar.bz2
>mkdir python32_build
>cd Python-3.2.2
>./configure --enable-shared --prefix=/home/pi/dev/python32_build
>make install

There will be a few things that look like errors, but you can ignore them

5) Get Boost 1.47, then make a cup of tea and drink it.
>wget http://iweb.dl.sourceforge.net/project/boost/boost/1.47.0/boost_1_47_0.tar.gz
>tar zxvf boost_1_47_0.tar.gz
>cd boost_1_47_0

Next, edit the file project-config.jam with whatever text editor you like… I used emacs.
>emacs project-config.jam
Go to the line that says
using python : 2.6 : /usr ;

and add the following line under it:
using python : 3.2 ;

Be sure to use a ; at the end of the statement or it will not work. Now build boost as follows.

./bjam python=3.2 stage

Fingers crossed all targets should build *
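For reference, the python section of my project-config.jam ended up looking roughly like this (the second path is my assumption, pointing at the python prefix built in step 4; adjust it to wherever yours landed):

```
using python : 2.6 : /usr ;
using python : 3.2 : /home/pi/dev/python32_build/bin/python3.2 ;
```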

6) Get and build / install libglew

> wget http://downloads.sourceforge.net/project/glew/glew/1.7.0/glew-1.7.0.tgz
>tar zxvf glew-1.7.0.tgz
>cd glew-1.7.0
>sudo make install


7) Clone the luxrays repository and prepare the build with cmake

>hg clone http://src.luxrender.net/luxrays
>cd luxrays

8) Down to the nitty gritty
Lux and LuxRays won't compile out of the box because they use the SSE extensions of the x86 architecture, which ARM does not have. So we need to make some fairly invasive modifications to the code; unfortunately, a list of changes is the best I can offer at the moment. This brings a few limitations: SSE is used in the qbvh accelerator, so those parts of the code need to be commented out and the headers removed. If you try to make, you will get as far as qbvhaccel and it will fail with a bunch of errors about a missing header, "xmmintrin.h". Essentially, all the instructions below remove any and all references to qbvh and mqbvh from the code: first by removing the SSE compiler flags, second by making the builder ignore the corresponding headers and cpp files, and finally by cleaning up any references left in the code that stop it building. Not very clean, but there is no way around it (that I know of). Most of the lines I remove or comment were found by searching for qbvh, so if you comment rather than delete, your line numbers might not match those below, but they will give you a good idea of the location.

I) emacs cmake/PlatformSpecific.cmake
Around line 107, remove all references to msse, msse2, etc.
The purpose of this is so the generated makefiles won't include those compiler flags. If the build still complains about sse extensions, comb through that file and remove any remaining references to -msse, -msse2, etc.

II) emacs src/CMakeLists.txt
Line 30: remove mqbvh and qbvh from the SET(…
Then perform a search and remove all lines that contain qbvh, including the mqbvh lines.

III) emacs src/core/dataset.cpp
Comment out lines 31 and 32 (the qbvh headers).
Line 51: change accelType = ACCEL_QBVH to accelType = ACCEL_BVH.

Search the rest of the file and remove all further lines/sections containing qbvh or mqbvh.

IV) Configure and make
>cmake . -DBoost_INCLUDE_DIR=/home/pi/dev/boost_1_47_0 -DLUXRAYS_DISABLE_OPENCL=1
>make

After waiting for a while (maybe an hour or so) LuxRays will compile. I have tested this on my images and can say that it works. If you are in the luxrays head directory, you can test it by simply running the slg binary.

A window will open and slg will render a standard luxball scene. Performance is slow; if you use it to render anything, making sure it only uses a single thread would be an advantage, as it stops the ARM wasting cycles.


At this point we don't have to go off and grab any more dependencies; just grab LuxRender from the repo, make the modifications and build.

9) Clone the repo
>hg clone http://src.luxrender.net/lux
>cd lux

Make the following modifications

>emacs CMakeLists.txt
Search for and remove sse flags, specifically around line 309

>emacs cmake/liblux.cmake
Search for and remove references to the qbvh accelerator; just do a search for it and comment or remove the lines (I'm assuming you remove the lines, so line numbers might not match).

>cmake . -DLUXRAYS_DISABLE_OPENCL=1 -DBOOST_INCLUDEDIR=/home/pi/dev/boost_1_47_0 -DPYTHON_LIBRARY=/home/pi/dev/python32_build/lib/libpython3.2m.so -DPYTHON_INCLUDE_DIR=/home/pi/dev/python32_build/include/python3.2m

If everything went well and you followed along, you should now have a working build of LuxRender!

A few notes: if you try to run luxrender, you will get a complaint about boost. To get rid of this, add the boost libraries to your LD_LIBRARY_PATH environment variable. Alternatively, copy the libs to /usr/local/lib, which should allow them to be picked up without any fuss.
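For example, assuming boost was built in the directory used earlier in this guide (the stage/lib path is my assumption of where bjam put the libraries; adjust to your layout):

```shell
# Point the runtime linker at the boost libraries built earlier.
# The path below assumes the /home/pi/dev/boost_1_47_0 layout from this guide.
export LD_LIBRARY_PATH="/home/pi/dev/boost_1_47_0/stage/lib:${LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH"
```

Add the export line to your ~/.bashrc if you want it to persist between sessions.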

I will update with some pictures of luxrays and luxrender working, and scenes rendered on the Pi, when I get the chance.

Here is a scene with volumetric scattering at 25 S/p, rendered in about 12 hours.

Posted in LuxRays, LuxRender | 40 Comments

Raspberry Pi

The Raspberry Pi is a small, cheap, ARM-based bare-bones PC targeted at school kids and educators for teaching basic (and probably some more advanced) computer science in schools. I think it is an amazing project and something I have been saying is needed in schools for many, many years.

My statements always went along the lines of: "The problem with IT in schools is that they only teach the use of admin-based applications, and the level of information regarding how the computer works, its construction etc. is minimal and often incorrect. It breeds a sense of techno-fear where people are afraid to use a PC in case they break something."

Truth is, you don't need to be an enormous nerd or a geek to get a lot of use and flexibility out of a computer. Even a basic understanding of a terminal and the general workings of a PC would be 100x more useful than the "Double click on the Excel icon… don't touch anything else!" that kids appear to get today. (Not to poke at teachers; I believe it's the curriculum that needs to change, and I'm sure IT teachers find the level they have to teach soul-destroying.)

The R_Pi is an ARM system-on-a-chip running at 700MHz base (though I think it can be over-clocked… you naughty people) with 256MB RAM. It features network and USB connectivity, an RCA connection, audio and HDMI output, along with some other nice stuff. Full details here….

While many of you will take a breath and say "You can't do anything with that", the truth is, maybe you are correct. But for a mere £25 it will make you look at the CPU and memory usage of typical applications and see how needlessly bloated they all are, and, as a challenge, make you think more about efficiency and getting the most out of something rather than "Oh, it should be pretty more than functional".

Once in the hands of nerds and educators alike, this epic little PC will hopefully start an exciting step forward in IT education.

My own plans? Well… compile LuxRender for it, of course! I have managed to do this already on a Debian ARM distribution emulated in QEMU… however, I would like to do it again using the official R_Pi disk image.

Instructions to follow!


LuxRender 0.9dev – Normal Mapping Support

Thanks to the great efforts of LordCrc, the development builds of LuxRender now have Normal mapping support.

It can be accessed by defining a LuxRender texture type as a normal map in the luxblend25 exporter. Once defined like this, it is used in the bump map slot and… bam… awesome happens.

What does this mean? As far as I understand, a normal map is used in a similar way to a bump map and gives the appearance of a highly structured surface on one that is in fact flat or low-poly. So you might ask… what is different?

The best way I can explain it in terms of Lux is this: a bump map represents a height field, and Lux determines the normal of the surface from that, whereas a normal map gives the normal directly as a colour vector. So, in a roundabout way, they are identical but from different starting points. Normal maps are also relative in height rather than absolute, which changes how you manipulate them in Lux. Typically you bake the depth into the map and set the height to 1; the height cannot be set any higher than that, though it can be reduced if needed.
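As a sketch of the relationship (my own summary, not Lux's exact code): a bump map stores a height field h(x, y) from which the renderer derives the perturbed normal, while a normal map stores the normal itself, encoded in the texture's RGB channels:

```latex
% Bump map: normal derived from the slopes of the height field
\mathbf{n} \;\propto\; \left(-\frac{\partial h}{\partial x},\; -\frac{\partial h}{\partial y},\; 1\right)
% Normal map: normal decoded directly from the texel colour
\mathbf{n} \;=\; 2\,(R, G, B) - (1, 1, 1)
```

This is why the two look identical in principle but are manipulated differently: scaling a height field rescales the slopes, while a normal map already has its "depth" baked into the stored vectors.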

The other nice thing is that it is handled by the bump map code, which makes it possible to apply both a bump map and a normal map to an object via a mix texture in the way you would expect, where the normal map is relative and the bump map is absolute.

However, from my own experience playing with normal maps and Lux, normal maps give an altogether more appealing appearance and are a little easier to manipulate than bump maps. I think this is more to do with the image manipulation involved in generating the normal map than with forming a height map. It is possibly due to the heavy use of normal maps in game engines: the software is set up to produce the results you want as normal maps more than as displacement maps.

I made a simple scene with two cubes; both are UV-unwrapped and have a simple cloud texture, with a LuxRender logo set on a white border. Here I want the LuxRender logo to be raised on a flat panel, and the text stamped into the cube.

The first thing you will notice is the softness of the normal map, and yet it still achieves a good effect of depth (right). The bump-mapped cube is a lot rougher and doesn't quite give the same sense of depth. Again, I think this is more to do with generating the normal map than anything else.

Here I used GIMP to make the height map texture, and then CrazyBump to make the normal map.

In this next example, both cubes have normal maps with exactly the same properties, but in this case generated by GIMP with a height of 10. You can see a big, big difference, which goes to prove what I said: the difference between the two is most likely the image manipulation rather than any different treatment in Lux.

Here we still have a smooth surface, and the logo is far less pronounced.

Running the same map at a height of 30 in GIMP gives me this,

As expected, the surface is rougher, and the height of the LuxRender logo is altogether deeper.

The next example is the same map as above, put into CrazyBump with some settings changed. It is by doing this test that I see CrazyBump likes to produce rather blurry normal maps with exaggerated depth. This isn't totally a bad thing; it all depends what you want to do.
Here is the result,
The most obvious part of this is that the LuxRender logo is more visible, but softer.

Next, I did some blurring and re-toning in GIMP. I can't give exact instructions as it just involved two layers and a bit of blurring, simple as that. I then used the 9×9 filter in GIMP and generated a normal map.

The nice thing here is that the text and logo are well defined and the map isn't too sharp. I might have got it just how I want it… however, it is still a little too intense, so let's reduce the normal map height a little bit in luxblend.

Looks good.

I think in the end it is all about playing with the tools to make them do what you want; there is no magic combination. However, I still think normal maps rather than bump maps are a very powerful tool and a welcome addition to the features 😀

The ground was produced by manipulating a high-resolution plasma map generated in GIMP using CrazyBump. As you can see, close up it looks fairly poor, but at distance it looks quite impressive. Once again, I have NEVER achieved this kind of effect using a bump map. Maybe that is my own poor ability at generating height maps; either way, it's nice to see the enhancements.


LuxBlend in Blender 2.58

I have recently been trying to practice and get used to LuxBlend in the new-ish Blender 2.58. After using 2.49b for so long, the changes don't come naturally. That said, I need to move with the times and get used to the new.

To do this I took an old scene file of mine, the spider scene I think I have posted here before.

Here is a reminder

Let's start with the table. In the original, I also used an image map for lighting, but I have replaced the main light sources with two light planes. Not shown below is the glass, which doesn't change at all; it is simply glass2.

So far, so boring. I set the table to glossy and made the diffuse colour 0.26 across the board, and I set the specular colour to 0.24 across the board. The u and v roughness are both set to 0.05.

Let's add some texture. For the original scene I used a texture from the Indigo material database: the Elm wood texture. In Blender 2.58 I recommend splitting the material panel in half so you can see the textures and material settings next to each other.

To explain the panel a little, first look at the material panel. It is almost exactly the same as 2.49b. To give the diffuse colour a texture, I have pressed the T button… The nice thing about LuxBlend25 is that I have to make my textures in the texture panel before I can assign them; once a texture is defined, pressing T brings up a box with the names of all the textures defined for that material.

So look at the texture panel. Tex.001 is my diffuse colour; it is defined as a blender-type image map. The ElmWood texture is set as the file name, and I have defined it to be UV mapped and scaled down 5 times. This is because the textures are usually high-res, but not a high-res scan of a whole table!

NB: when defining textures, you can use the Blender type texture -or- the LuxRender type texture; for image maps they are equivalent, I think. As you can see, I have set the texture on the diffuse colour and checked the M button, which means Lux will multiply the colour of the texture by the colour defined (in this case 0.26), so it will appear dull.

Great, looks ok

I cannot remember if the packed texture contains a bump map or specular map; if it doesn't, I likely greyscaled and played with the diffuse map. Either way, I did the same for the specular map as I did with the glossy.

Here you can just start to see the difference made by the specular map in the reflection on the surface.

The fun part now is the bump map. In the original scene I only used the grey version of the elm wood texture and set an appropriate scale, which gives the slight bump you see above.

For this, I tried to do something extra… a scratch texture too, to give me a table that looks like it has seen some action, like you might find in a pub. Doing this is simple in either 2.49 or 2.58, but I found the new interface much more user-friendly, allowing me to define a few textures and have them all to play with, rather than the HUGE long vertical list it would be in 2.49, where each texture has to be defined by hand. Here things are defined and then put into slots; the system is wonderful.

I googled for a scratch texture; once more, I can't give the address because I cannot remember it, sorry. I defined the scratch texture in the same way as I did the glossy and diffuse. I then used a mix texture to give me a 50:50 merge of the scratch texture and the glossy texture. This time I did NOT press the M button on the textures, as I didn't want to weight them.

For this scene the approximate scale height for the bump map, as you can see from the panel above, is 0.0004 (0.4mm). I put the mix texture into the bump map slot and selected M, so the texture height is scaled appropriately. Looking back at the original image, I think this could be a little too high a scale for the bump map.

Either way – the final image, along with the spider, and another spider + beer mat is as follows.

The colours look a little different, mainly due to tonemapping. However, the extra experience I have gained with material setup between the original and now shows a little. I am very happy with the table and will likely feature it more often.

This is the final result. Sorry for not including more pictures of the panels; much of it is, I believe, self-explanatory from the images and text. I found that once I got started and defined a few textures, everything fell into place.

My only word of caution: it often appears that textures do not work when you hit preview. If this happens, it usually means you have mixed up a float texture with a colour. A float texture is one where a numerical value is given as the texture, such as in a LuxRender procedural map. The best solution is a mix texture: a two-step texture where you define the colours in the mixed textures and set the mix amount to be the procedural texture. Getting this right, and into your head in the right way, is a bit difficult; I have had a few occasions where textures seemed not to work at all.

This problem was taken care of in LuxBlend2.49, so it is something that was hidden to the user. I am not sure if there are plans to change this for LuxBlend25, but it isn’t too bad once you understand the differences.

Bump maps, for example, tend to cause the most heartache here… but, from advice, bump maps should be greyscale or floats in general, and things will be OK.


Python + Boost + LuxRender = working pylux?

Yesterday I tried to fix an issue, which I think I noted before, in my Linux environment: Python, Boost and LuxRender would compile seemingly fine, but when I tried to use pylux in Blender 2.5 I was left with an error in the console.

The error was this:
ImportError was: /usr/local/lib/libboost_python.so.1.43.0: undefined symbol: PyUnicodeUCS2_AsUTF8String
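For what it's worth, that particular undefined symbol is the classic sign of a narrow/wide unicode mismatch: libboost_python was built against a UCS2 (narrow) python, but the python loading it is UCS4 (wide), or vice versa. You can check which kind a given python build is; a narrow build prints 65535, a wide build prints 1114111 (Python 3.3 and later are always wide):

```shell
# sys.maxunicode reveals whether this python was built narrow (UCS2) or wide (UCS4)
python3 -c 'import sys; print(sys.maxunicode)'
```

Run the same one-liner with each python on your system to spot a mismatch.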

I was helped a lot by a member of the LuxRender IRC channel (THANKS ONCE AGAIN, Orbisvicis!) and would like to share what we think was the problem, and the fix, for my records and for anyone unlucky enough to see the same error. I say unlucky because, from experience, getting boost compiled and linking with everything can be extremely painful. I lost count of the number of times I compiled python, boost and luxrender yesterday.

OK, so it appears to be a unicode problem. The exact cause could be a number of things:

  • unicode support was not included with the python I compiled.
  • pylux is linked against an incorrect version of python.
  • the python implemented/packaged with blender is causing the issue.

The easiest of these to check is the linking in pylux.so: use ldd pylux.so to check the library dependencies, and look for things that look odd or out of place. This is where I had my problem: despite boost being built against the correct version of python, when make finally linked pylux.so it fell back to my system's pre-installed Python 2.6, and got confused by this.
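ldd works on any executable or shared object, so you can practise on something that definitely exists before pointing it at pylux.so (where you would pipe through grep to isolate the python line, e.g. ldd pylux.so | grep -i python):

```shell
# ldd lists the shared libraries a binary resolves at load time.
# Demonstrated on /bin/sh, which any Linux system has.
ldd /bin/sh
```

In the pylux.so output, check that the libpython line points at the python you actually built boost against, not the system one.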

HOWEVER, for completeness, I will detail the final full solution of building python correctly for the job, and luxrender.

First, I got the latest version of python, 3.2.1rc1, and applied a patch to it for the boost libraries.

Python was then built with the following command.

./configure --prefix=/usr --enable-shared --with-threads --with-computed-gotos --enable-ipv6 --with-valgrind --with-wide-unicode --with-system-expat --with-system-ffi; make

I changed --prefix to my desired build location, and did a make install after the above command. NB: you might get an error during the compile; however, these shouldn't affect anything you need (for LuxBlend or LuxRender). If you need to fix them, unfortunately that will be up to you, sorry.

I also got a later version of boost, 1.46; the build process is IDENTICAL to that discussed previously. However, be sure to check you used the correct include directories, as the normal directory will have a 'u' on the end (signifying unicode?).

Boost should (at least it did on mine) compile perfectly fine.

The next part is LuxRender. The catch here is that cmake will often default to system directories in case of issues, and unfortunately this causes a lot of problems. The cmake setup in luxrender was reorganised recently and is likely still undergoing some change; there is a directory, cmake, in the head directory of lux which contains a few useful files in which you can change cmake include directories and search paths.

In my case, because python was likely my issue, when I ran cmake I used the following options:

-DPYTHON_INCLUDE_DIR=<link to include dir>
-DPYTHON_LIBRARY=<link to actual lib file>

A full list of all options can be found in the CMakeCache.txt file which cmake generates… there are MANY, MANY of them, and with the new cmake rewrite, if you have problems you can write your own custom config file and load it. Once you have picked out the ones you need, you can use them like this:

cmake -DPYTHON_LIBRARY=<link to actual lib file> .
for example.
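CMakeCache.txt is plain key:TYPE=value text, so grep pulls out the entries you care about quickly. A sketch, demonstrated on a tiny two-line sample (in a real build you would run the grep inside the lux build directory against the real file):

```shell
# Build a miniature CMakeCache.txt and pull out the python-related entries,
# exactly as you would against the real generated file.
printf 'PYTHON_LIBRARY:FILEPATH=/usr/lib/libpython2.6.so\nBOOST_ROOT:PATH=/usr\n' > /tmp/CMakeCache.txt
grep '^PYTHON' /tmp/CMakeCache.txt
```

A wrong library path showing up here (like the system Python 2.6 above) is exactly the kind of fallback that caused my pylux problem.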

I was able to use Blender with pylux.so and all was right with the world. 😀

Once again, thanks for the help I was given in IRC, and I hope these notes come in handy to anyone suffering similar issues. Even if the error isn't exactly the same as shown here, the cmake notes at the end can fix many issues. Also, for library problems, ldd is your friend.


Simple is Beautiful

With the release of LuxRender 0.8 comes a new competition. The competition has been running since early this month and ends at the end of it (strangely enough!). The previous competition was to make a LuxRender example scene that could ship with the installer; I didn't enter that one due to time constraints.

However, I did enter the Simple is Beautiful competition. Long ago I attended a conference, and during the tea/coffee break there was a technology stand. It was quite striking and showed what appeared to be the fuel/control rods at the top of a reactor, surrounded in a bath of Cherenkov blue haze. It was very pretty, and since then I have thought about making something similar in Lux.

Back then, achieving such an effect was possible but somewhat limited, involving the bound volume interface, and I never got it to work in a satisfactory way. With the advent of homogeneous volumes in Lux, it became possible to do this far more easily and quickly.

I cannot go into a lot of detail, as the scene is in fact very simple. It consists of a long tube filled with smaller tubes of a similar length; the inner tubes are smaller and are supposed to represent the control/fuel rods. I then used the homogeneous volume settings to set the atmosphere to absorb, giving me a blue haze, and set the scattering value to something small, but large enough to give the desired brightness for my 'god rays'.

Physically this isn't really correct, as Cherenkov light is an emission rather than scattering, and the blue colour isn't from absorption at all… however, to fake the effect I was after, for this scene it is OK.

The part which really gives the cross-lighting effect in my scene is that the light is a disc placed at the bottom of the larger tube; it is small and only really covers the inner 4 tubes, so any other light that is seen is an effect of scattering or multiple bounces around the scene.

The scene was rendered for about 1 week on 6 CPUs and here is the result.


Change of Linux Distribution

At home my PC has 3 OSes installed: Windows 7, Windows XP and Ubuntu. I don't really use a bootloader; I use the motherboard boot device selector to switch between the two drives (7 has a drive to itself, and XP and Ubuntu share one).

I decided that I would switch over to just Windows 7 and Linux, as I haven't used XP in a long time. I would also change Linux distribution, away from Ubuntu to… Scientific Linux.

What? Why? What the Hell is that?

Scientific Linux was the first Linux I was introduced to during my PhD. Back in those days (despite it being only 5 years ago) SL 3 was on the systems at uni, and while functional it was pretty terrible; getting anything to work was extremely hard, and getting the correct versions of everything was a big task. SL is maintained mainly by the particle physics community, and as such it ships pre-loaded with useful development tools; however, like I said above, it was a little behind the pace back then.

Things are a bit better now with SL6, which on the whole is more up to date and compatible with hardware.

I have collected together all the helpful and handy hints I used after SL was installed in order to get LuxRender built, along with useful things like… graphics drivers.

Graphics Drivers (Nvidia): the original instructions are listed here, but I present a shortened version.

  1. Get the required dependencies: yum install kernel-devel gcc
  2. Grab the Nvidia driver of your choice
  3. Blacklist the nouveau driver
    Edit: /etc/modprobe.d/blacklist.conf
    Add: blacklist nouveau at the end
    Blacklist in grub by editing: /boot/grub/menu.lst
    Add: rdblacklist=nouveau at the end of the line starting with kernel, for the kernel you want to install the drivers against
  4. Set the default init level to 3 to stop the graphics drivers being loaded during start up
    Edit: /etc/inittab
    Replace id:5:initdefault: with id:3:initdefault:
  5. Reboot
  6. Log in as normal and run the nVidia installer
  7. Reboot
  8. Log in again, and type init 5
    If everything went well, the GUI will load and everything will work perfectly 😉
  9. If all is good, reverse step 4.
  10. If all is bad… follow all of this backwards
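Step 4 above can be done with a single sed command. A sketch, shown here on a throwaway copy of the line; against the real file you would run it as root on /etc/inittab (and keep the .bak backup sed makes):

```shell
# Make a demo copy of the inittab line, then flip the default runlevel 5 -> 3.
# sed -i.bak edits in place and leaves the original behind as *.bak.
printf 'id:5:initdefault:\n' > /tmp/inittab.demo
sed -i.bak 's/^id:5:initdefault:$/id:3:initdefault:/' /tmp/inittab.demo
cat /tmp/inittab.demo
```

Reversing the change (step 9) is the same command with the 5 and 3 swapped.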

My experience with this isn't great, so I take no responsibility if you brick your Linux… I have certainly tried to cut corners in the past and ended up in a total mess. I followed these instructions to the letter and successfully installed a v275 series driver on Scientific Linux 6 x86_64.

Building LuxRender

I used a guide posted on the LuxForum. If you follow it to the word, everything should work fine. You can use yum or apt-get in general to install all the dependencies you need, and in the case of boost and python, follow the guide to the letter and you should not get any problems.

Notes :

If you have installed the nVidia drivers, you will usually have OpenCL and CUDA libraries installed on your system, and thus not require the ATI Stream SDK listed in the guide. HOWEVER, you will need to make a /usr/include/CL folder and copy the OpenCL headers in there. These can be taken from the ATI Stream SDK or from the Khronos Group page.

The only thing I changed was that I didn't use the ATI Stream SDK, as explained. I also copied the compiled boost libraries onto my system and removed version 1.41, which shipped with it. Other than those, the one thing you might need to get hold of is libGLEW, which you must compile and install yourself; this isn't really listed as an instruction.

With a little discussion and following that guide, I was able to get LuxRays and LuxRender built and running on Scientific Linux… which was a nice moment.


[WIP] Downed Starship – Cleanup, Mountains and stones.

Working on the Downed Starship in Blender 2.57 has brought a few challenges, one of them being the placement of debris. Previously I used the game engine to simulate physics; however, with some of the simulation having been done in the older 2.49 engine, parts of the scene seemed to be suffering from caching issues, and other parts from being 'un-modifiable': that is, if I move an object to lay it better or put it somewhere a little different, when I hit the render button, blender resets the position of the object back to where it was in the animation.

I can't really complain, since in this respect blender isn't really performing an animation; it is the game engine. I finally found a way to remove the animation information, but it is somewhat tedious and I'm sure there is a better way. All you do is go to the Dope Sheet, select the object, and delete the position and rotation channels. For the most part this essentially clears all the information. For my scene it didn't reset the position or rotation, but left the object where it was on that frame. I should experiment more, but for my cleanup it worked out just fine.

To make my background a little more interesting, I added a second set of mountains/hills by instancing the original background, pushing it back a little and rotating it. I experimented with adding a few rocks; this looks reasonable, but I think I need to remove the texture from them, as they look like sandy blobs rather than rocks.

I also used arch glass rather than full glass refraction; this was an experiment to speed up the render. I also changed the volume integrator from multi to single, which boosted my render speed from about 500 S/s to 10k S/s, which is nice. I had to increase the scattering scale, however, to get a scene similar to what I had before.

Here is the result.

I have some kind of grid artefact in the sky, likely caused by numerical accuracy, but I will be checking with the people at Lux to see if anything obvious can cause this.


[WIP] Downed Starship, in helmet render

I wanted to do some kind of real view, as though looking through the eyes of someone standing at the scene of the downed starship. This presents a few issues: scale, focal length, and getting the materials and volumetrics to behave. All this within Blender 2.57!

I spent some time modelling a helmet; this was a challenge for me, but rewarding nonetheless.

One helmet. As you can see, I placed the camera inside it. The focal length of an eye is somewhere between 17 and 24mm; however, for a camera this doesn't appear to give the same sense of peripheral vision, so I made it shorter to get more of the scene in.

The next part is working out the reflections for the inside. I did a number of test renders with glass, and the number of reflections was minimal. This is good, you might say, as a helmet should reduce reflections so as not to confuse the wearer, and you would be right… however, having zero reflections is also, in my opinion, unrealistic, and takes away from the fact that it is an inside-helmet view.

I thus added a mix material of glass2 and 1% mirror and tested that. I also placed a mesh face inside the helmet. As you can see, the face is placed far, far behind where the real viewing position would be; again, this will need some moving and adjusting.

The last thing I tried was to make a HUD, to give some kind of computer overlay to the scene. I'm not sure I like it, but will at least keep it there for the time being.

The last point to make is the volumetrics,

I initially had problems where the volumetrics would be reset when light passed through the glass of the helmet… that is, the scattering simply would not be there and it would be perfect daylight outside. It appears that if a volume is planar and thin, this can happen for a scene like this because of rounding errors and the step size of the volume. In the end I had to use a thickness of 0.04 using the blender solidify tool, which is somewhat unphysical, but I will look at that later.

Here is the resulting 10hr render

As you can see, I didn't play around with all the materials this time; because I'm in Blender 2.57 I have to rework them all. I think I need to raise the scattering a little bit, but overall it isn't too bad a progression. The soil needs better detailing, I think; probably a displacement map or a higher bump map.
