Archive for October, 2006

Xbox…360?!

Tuesday, October 31st, 2006

A few weeks ago Tony and I discovered the co-op mode of Splinter Cell: Chaos Theory. It contained four delicious missions of shadow-cloaked tension. Then it was over, all too soon.

I quickly entertained the thought of getting an Xbox 360 and the new Splinter Cell simply to have more co-op missions. My hopes were dashed when I learned that the Xbox 360 version has a paltry three co-op missions, which are actually dressed-up versus missions.

Yesterday the news hit me that Splinter Cell: Double Agent for Xbox is completely different from the Xbox 360 version. Hoping against hope, I discovered that the Xbox version contained what I needed, in abundance: 12 co-op missions.

The new co-op missions tie directly into the single-player story, giving the objectives much more weight and intensity. The game even costs a mere $40. Compare that to the $60 Xbox 360 version, which is missing everything I want.

I have postponed my purchase of an Xbox 360 once again. It won’t be long now, though; I pre-ordered Gears of War. Mind you, it’s the Limited Edition.

I can’t wait to play it in 480i.

Progress: Unabated

Sunday, October 29th, 2006

I’ve been extremely pleased with Collada and the ColladaMaya plugin. It has allowed me to get at all the data I want with much less overhead than using the Maya API to get the same data. I spent a few weeks switching the asset pipeline over to Collada, hoping the time spent would eventually be made up. This last week I was able to add awesome new functionality to the asset pipeline; effectively, I have already made back my minor investment in time.


Materials
The focus of the past week has been supporting material creation directly in Maya. I wanted to see what ColladaFX was all about. Long story short, ColladaFX is an invaluable tool for working with hardware shaders in Maya.

Using ColladaFX is very simple. Briefly it goes something like this:

  1. Create a ColladaFX material in Maya
  2. Select the Cg vertex and fragment shader files
  3. Assign and manipulate shader parameters

I have been using Cg for hardware shaders in the Engine, so it was especially easy to migrate the shaders into a ColladaFX material. I created a simple tool that finds all the materials in a Collada file and writes Catharsis Engine format materials. It has drastically simplified creating and testing materials. If it works in Maya as a ColladaFX material, it is almost guaranteed to work identically in Catharsis. It sure beats the hell out of editing an XML material definition or spending months creating an interactive shader editor.
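
The tool itself is little more than a loop over the material library in the exported document. Here’s a rough sketch, assuming the Collada file has already been loaded into an FCDocument; I’m writing the FCollada names from memory, and WriteEngineMaterial is a made-up stand-in for the Catharsis side of things:

    // Sketch only: walk the materials in a loaded Collada document and emit an
    // engine-format material for each one. FCollada headers are omitted and the
    // class/method names below are approximate; WriteEngineMaterial is a
    // hypothetical Catharsis pipeline helper.
    void WriteEngineMaterial(FCDMaterial* material);   // hypothetical

    void ExportMaterials(FCDocument* document)
    {
        FCDMaterialLibrary* materials = document->GetMaterialLibrary();
        for (size_t i = 0; i < materials->GetEntityCount(); ++i)
        {
            FCDMaterial* material = materials->GetEntity(i);

            // Copy the shader file references and parameter values that
            // ColladaFX stored with the material, then write them out in
            // the Catharsis Engine material format.
            WriteEngineMaterial(material);
        }
    }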

Below is an example of a simple distortion shader in Maya using ColladaFX. Between the two images, the range parameter has been increased, increasing the distortion.

ColladaFX Distortion Material

A really cool feature of ColladaFX materials is that arbitrary vertex data can be bound to shader parameters. In the above example, map1 (Maya’s default texture coordinate set) is bound to TEXCOORD0 in the shader. In the shader I then specify that TEXCOORD0 binds to the variable TexCoord0, which is used in the shader program. I could create a new vertex data set in Maya called ‘distort_intensity’ that modifies how distorted the texture is over the surface, paint its values directly on the mesh in Maya, associate the per-vertex data set with TEXCOORD3, and add the corresponding functionality in the Cg shaders.
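
On the engine side this just means the mesh carries an extra per-vertex channel and the exporter records which shader semantic it binds to. A minimal sketch with made-up types (this is not ColladaFX’s or Catharsis’s actual layout):

    #include <string>

    // Hypothetical vertex layout once the painted 'distort_intensity' set is
    // bound to TEXCOORD3.
    struct Vertex
    {
        float position[3];        // POSITION
        float normal[3];          // NORMAL
        float uv[2];              // map1                -> TEXCOORD0 -> TexCoord0 in Cg
        float distortIntensity;   // 'distort_intensity' -> TEXCOORD3 -> read by the Cg shader
    };

    // The binding itself is just a (vertex data set, shader semantic) pair that
    // the exporter records and the engine re-applies when building the vertex
    // declaration.
    struct VertexBinding
    {
        std::string sourceSet;    // e.g. "map1" or "distort_intensity"
        std::string semantic;     // e.g. "TEXCOORD0" or "TEXCOORD3"
    };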

This is excellent, but it would have been really cool for the per-vertex data names to be parsed out of the shader file, as is done with material parameters. The shaders would then be self-documenting, making it immediately apparent what each per-vertex data set is used for in the shader.


Collision
A couple of weeks ago I began looking at adding collision volumes to the Collada asset pipeline. I thought that the physics functionality was part of the standard ColladaMaya plugin. I quickly learned that Feeling Software had created a completely separate (and MUCH more powerful) Maya plugin for physics called Nima. The Nima plugin offers ways to create and simulate many standard physics objects. It currently supports rigid bodies, cloth, and rag dolls. The physics objects can even be manipulated interactively during simulation (i.e., you can grab a skeleton and it will flop around as you move it in Maya).

All of it exports to Collada, which covers my collision-volume export needs. It will be interesting to see what other physics objects I can support in the future.


Boo!
Last night, Halloween festivities were held on State Street. This year tickets were required for State Street and the entire event was shut down promptly at 1:30am.

Though I was wholly uninterested in the celebrations, someone threw a costume at me, and within moments I looked as you see below. I did some programming so attired, delighting Gavin in the process. The costume lasted about two hours, until I nearly passed out because the hat breathed like a plastic freezer bag.

me.jpg

In case you’re wondering, I am supposed to be The Man with No Name. Turns out I inadvertently dressed up as a jackass.

noname.jpg
The Man with No Name

Collada Exporter

Monday, October 16th, 2006

Over the weekend I was able to get the Collada exporter to the same state that the old Maya exporter was in. It can export scenes and geometry for use in the engine.

Getting the Collada exporter up to spec took a little more work than the Maya exporter did. With the Maya exporter I was able to do very high-level operations on geometry; for instance, I used the obj exporter to compute the world-space transforms of a set of meshes in one step. There is no such functionality in Collada, but it was simple enough to implement.
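
Baking the world-space transform just means concatenating the node transforms from each mesh’s node up to the scene root and multiplying every position by the result (normals get the same rotation, minus the translation). A sketch with assumed minimal math types, not the engine’s actual classes:

    #include <cstddef>
    #include <vector>

    // Hypothetical minimal math types; the real exporter uses the engine's library.
    struct Vec3 { float x, y, z; };

    struct Matrix4
    {
        float m[4][4];   // row-major, points treated as column vectors

        Matrix4 operator*(const Matrix4& b) const
        {
            Matrix4 r = {};
            for (int i = 0; i < 4; ++i)
                for (int j = 0; j < 4; ++j)
                    for (int k = 0; k < 4; ++k)
                        r.m[i][j] += m[i][k] * b.m[k][j];
            return r;
        }

        Vec3 TransformPoint(const Vec3& p) const   // assumes an affine matrix
        {
            Vec3 r;
            r.x = m[0][0]*p.x + m[0][1]*p.y + m[0][2]*p.z + m[0][3];
            r.y = m[1][0]*p.x + m[1][1]*p.y + m[1][2]*p.z + m[1][3];
            r.z = m[2][0]*p.x + m[2][1]*p.y + m[2][2]*p.z + m[2][3];
            return r;
        }
    };

    struct Node { Matrix4 localTransform; Node* parent; };

    // Concatenate local transforms from the mesh's node up to the scene root,
    // then bake the result into the positions so the mesh is in world space.
    void BakeWorldSpace(std::vector<Vec3>& positions, const Node* node)
    {
        Matrix4 world = node->localTransform;
        for (const Node* p = node->parent; p != NULL; p = p->parent)
            world = p->localTransform * world;

        for (std::size_t i = 0; i < positions.size(); ++i)
            positions[i] = world.TransformPoint(positions[i]);
    }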


Efficient Geometries

While writing the Collada exporter I took the opportunity to modify the mesh architecture in the engine. Previously the exporter collapsed geometry by material. All meshes with the same material would be collapsed into one mesh object. Each mesh stored its own vertex data and index buffer. This is a straightforward approach but is a bit naive. It results in many small vertex buffers and a lot of vertex buffer switching.

An observation I made is that vertex data and material are not explicitly linked. In other words, there is no reason to split up the vertex data by material. On the other hand, the index buffer must be split up by material so that the renderer can stop and bind a new material before drawing the next set of polygons.

The modified architecture splits vertex and index buffers into Models and Meshes. A Model contains all the vertex data and a list of Meshes. A Mesh contains a material and an index buffer that indexes its parent Model’s vertex data. With this architecture all vertex data, regardless of material, is collapsed into one vertex buffer. The system favors larger vertex buffers and requires much less vertex buffer switching.
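
In code the split looks roughly like this; the names are illustrative, not the actual Catharsis Engine classes:

    #include <string>
    #include <vector>

    // Sketch of the Model/Mesh split described above.
    struct Vertex { float position[3], normal[3], uv[2]; };

    struct Mesh
    {
        std::string           material;   // material to bind before drawing
        std::vector<unsigned> indices;    // indexes into the parent Model's vertices
    };

    struct Model
    {
        std::vector<Vertex> vertices;     // one shared vertex buffer for the whole Model
        std::vector<Mesh>   meshes;       // one index buffer (and material) per Mesh
    };

    // Rendering binds the shared vertex buffer once, then walks the meshes,
    // switching materials but never the vertex buffer:
    //
    //   BindVertexBuffer(model.vertices);
    //   for each Mesh m in model.meshes:
    //       BindMaterial(m.material);
    //       DrawIndexed(m.indices);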


Back to Collada
In the end, Collada exporting is much faster. The command-line Maya exporter had to dynamically link with Maya’s HUGE DLLs, which caused the exporter to stall for 3-5 seconds at startup. That lag time quickly added up when exporting multiple scenes. With the Collada exporter, exporting is effectively instantaneous.

Things I’m looking forward to:

Collada Refinery
There is a collection of utilities for “conditioning” the data in Collada files. Examples of conditioning include tri-stripping to increase rendering performance. These utilities are collected under the name Collada Refinery.

Collada Physics
Collada handles physics attributes and many collision volumes. Building a physics system into the engine that uses this data would be very interesting.

ColladaFX
Shader parameters and shader code integrated directly into the Collada files. ColladaMaya even supports rendering with these shaders in the Maya views.

Software Craftsman

Friday, October 6th, 2006

I found a great blog post about what motivates software developers and differentiates them from “paycheck programmers”. In summary:

Building software is a very creative and constructive process but the intangible nature of software makes the parallels to traditional engineering difficult… Yet we do still share many of the same feelings and priorities as conventional craftsmen.

  • A tendency towards perfectionism
  • Pride for the end product
  • Strong sense of ownership
  • Criticism of other work
  • Responsibility for flaws
  • Strong affection for tools of the trade (editors, IDEs, utilities, home grown tools)
  • Strong need to use new tools and processes

I couldn’t agree more! Especially with the need to use new tools.

I’d be much less interested in game engine development if the tools and hardware never changed. With this new generation of game consoles, people tend to do one of two things: complain about them or quietly learn to use them. If it distresses you that everything is always changing in the game industry, why do you want to make games?

I have no complaints, and a lot to look forward to.

IGDA

Thursday, October 5th, 2006

Denrei’s recent post reminded me that I should reflect on attending the IGDA…

This was the first IGDA Madison meeting, and it was necessarily geared towards organizing and getting acquainted. It was interesting to hear about people’s experiences. We talked with a guy at Human Head about their build system and how they are re-evaluating it for their next project. They had tried Subversion for asset management and came to the same conclusion we had: it blows. He eventually mentioned that they were now using Perforce, which is basically like Subversion but much faster. This guy was a rendering programmer and I would have liked to talk to him longer, but didn’t get the chance.

A startup company called Frozen Codebase was there. They recently set up shop in Green Bay and received some venture capital. Currently they have a team of seven people working on their first game, plus a few interns from the local ITT Tech Institute. I found this very interesting because it’s exactly how my current place of employment operates, except there are only three people.

Someone asked why I didn’t just use the Torque Engine. I explained that I wanted to try developing an engine with the lightweight shader system that I explained in a previous post. I was interested in making engines and tools with new technology, not making games with the same tools.

Later someone said that he had heard the PS3 was hard to develop for. Compared to the Xbox 360’s development system, it must seem unbelievably difficult: the PS3 doesn’t tie directly into an IDE like Visual Studio, and you can’t hit F5 to compile and run your game. I told him I find it an interesting platform to develop for and plan on looking at the PS3 Linux development kit. He seemed shocked…

All in all it was a good meeting. Everyone there was making games in some form, so there was common ground. (At the UPL’s game SIG there was often no common ground, which was frustrating.) I’m looking forward to the next meeting even more. SIGs are going to be formed to focus on specific aspects of game development; I will be attending the programming SIG. I can’t wait.

Provocative thought: Raven Software seemed to be the only non-independent developer in attendance.

Maya Tangents

Thursday, October 5th, 2006

Did you know that Maya 6.5 and up fully supports tangents?  They are as easy to use as normals!  Everyone upgrade.

Collada

Sunday, October 1st, 2006

A long time ago, when I was reading about the Khronos Group and enjoying their wonderful open standard specifications, I happened upon Collada, an open standard asset exchange specification. The idea behind Collada is to provide a standard intermediate format between content creation tools (Maya, Max, etc.) and the content processing tools that create engine content.

I use Feeling Software’s ColladaMaya and FCollada. Both are open source, allowing me to add extra features when necessary. Instead of dealing with each content creation tool’s file formats and APIs, I can now just work with Collada. The process goes something like this: 1) source assets are exported from the content creation tool to a Collada file; 2) lots of complicated tools process the Collada files into engine assets.

Though it does add an additional step at the very top of the asset pipeline, it has some major advantages:
1) The tools programmer only has to learn Collada’s API
2) Any content creation tool that can export to Collada can be used to create engine assets
3) The FCollada API is very easy to use

ColladaMaya is very straightforward to use: simply load the plugin in Maya and select Collada when exporting. ColladaMaya has a number of export options that are easily configurable during export.

FCollada can be used to both create and read Collada files, but most people will use it to read them. It differs from the standard Collada DOM in that it loads the entire file into a tree that can be traversed and queried for data, which is a much easier way to deal with this kind of data. Once I learned the basics of the FCollada API, I was able to create a basic exporter in about a day.
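
To give a flavor of what “traversed and queried” means in practice, here is roughly what loading a document and walking its geometry looks like. I’m writing the FCollada calls from memory, so treat the exact class and method names as assumptions rather than a reference:

    // FCollada headers omitted; the names below are approximate.
    bool ReadGeometry()
    {
        FCDocument* document = FCollada::NewTopDocument();
        if (!document->LoadFromFile(FC("scene.dae")))
        {
            document->Release();
            return false;
        }

        FCDGeometryLibrary* geometries = document->GetGeometryLibrary();
        for (size_t i = 0; i < geometries->GetEntityCount(); ++i)
        {
            FCDGeometry* geometry = geometries->GetEntity(i);
            if (!geometry->IsMesh())
                continue;                     // skip splines and other non-mesh geometry

            FCDGeometryMesh* mesh = geometry->GetMesh();
            // ... walk the mesh's sources and polygon sets here to pull out
            //     positions, normals, texture coordinates, and indices ...
        }

        document->Release();
        return true;
    }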

All in all, Collada adds a great deal of flexibility to the asset pipeline without tying your asset-processing tools to any one content creation tool.