MAR 20

Know Your Spell

On Saturday, February 25, a few weeks ago, we were part of WandWorks, a fantastic event at the Cotsen Children’s Library. Months earlier, we had accepted our first commission: a Harry Potter spell-casting simulation game for kids. It was amazing to see so many positive reactions!

Dana Sheridan, the Education and Outreach Coordinator at Cotsen Children’s Library, and one of the main organizers of the event, wrote a blog entry featuring the game. We're very grateful to her for all the energy and passion she put into this event. Please check out the blog post here!

The game is free to download, though it might melt your computer. Sorry not sorry; ain’t nobody got time to optimize stuff. Check out the games section of the website for the download link.


OCT 19

Storm O.S.

Ludum Dare 36 is over, and our entry is finished!

In Storm O.S., you play as an android that has been mysteriously freed from its Master and attempts to solve the mysteries of the ancient Storm Operating System. This short game is the product of three days of work.

Visit our Ludum Dare entry page to play the game! Available for both Windows and Mac.

Many thanks to Massimo Pericolo for the incredible soundtrack.


AUG 26

Ludum Dare 36

Kapricorn Media will participate in Ludum Dare 36!

For those of you who don't know, Ludum Dare is an accelerated game development event (a "game jam"). For teams, the event consists of creating a game entirely within 72 hours. It starts today, Friday, August 26, at 9:00 PM Eastern Time. We're going to be working all weekend on a new, simpler game for the competition, and we'll have to finish it by Monday, August 29, at 9:00 PM.

The Ludum Dare organizers have yet to announce the theme, so we don't yet know what the new game will be about. We might not be able to update the blog during the weekend, but we'll at least announce our finished product on Monday. The image we posted shows an interesting 2D game idea that we implemented as a warm-up for the competition. It turned out similar in essence to our big project, and we might develop it further in the future.

We'll be streaming the development process on my Twitch channel, starting today at 9:00 PM. Feel free to join in and watch the game grow to completion over the weekend!


JUL 27

Deferred Shading

As we continue our effort to materialize the huge list of game ideas, it’s becoming clearer to me that the game’s lighting engine needs to be very robust and must handle many light sources efficiently. The traditional forward rendering pipeline isn't great in this regard: the environment is also likely to have many overlapping objects on screen, which wastes per-fragment lighting calculations on fragments that end up hidden.

Given these observations, I've decided to implement a deferred shading pipeline. This pipeline works by first rendering data from the scene into several textures, collectively called the geometry buffer, and deferring the actual lighting calculations until the geometry buffer has been filled. There are many resources available online that explain this subject in much greater detail. There are so many helpful tutorials out there in general; a big shoutout to anyone who writes them... I owe you at least 70% of my coding knowledge!

Anyway, back to deferred shading: so far, I'm storing the geometry buffer data in three separate textures, which I've shown in the image above. (There's a rough setup sketch after the list below.)

Position and depth (top left): The position of the game object at the given pixel and its depth as stored in the depth (z) buffer.

Normal (top right): The surface normal of the game object at the given pixel.

Material properties (bottom left): The properties of the material (color, diffuse and specular coefficients) of the game object at the given pixel.
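For anyone curious, here's roughly what setting up a geometry buffer like this can look like in raw OpenGL. This is a minimal sketch rather than our exact engine code; the texture formats, the loader header, and all the names are just illustrative choices.

```cpp
// Minimal G-buffer setup sketch (raw OpenGL). Formats and names are
// illustrative; the engine's actual choices may differ.
#include <glad/glad.h>  // or any other OpenGL function loader

struct GBuffer {
    GLuint fbo = 0;
    GLuint position = 0;  // xyz = world position, w = depth
    GLuint normal = 0;    // surface normal
    GLuint material = 0;  // color + diffuse/specular coefficients
    GLuint depthRbo = 0;  // depth buffer used during the geometry pass
};

static GLuint MakeTexture(int width, int height, GLint internalFormat)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, internalFormat, width, height, 0,
                 GL_RGBA, GL_FLOAT, nullptr);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    return tex;
}

GBuffer CreateGBuffer(int width, int height)
{
    GBuffer gb;
    glGenFramebuffers(1, &gb.fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, gb.fbo);

    gb.position = MakeTexture(width, height, GL_RGBA16F);
    gb.normal   = MakeTexture(width, height, GL_RGBA16F);
    gb.material = MakeTexture(width, height, GL_RGBA8);

    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, gb.position, 0);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1, GL_TEXTURE_2D, gb.normal, 0);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT2, GL_TEXTURE_2D, gb.material, 0);

    // The geometry pass writes to all three attachments at once.
    const GLenum drawBuffers[] = {
        GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1, GL_COLOR_ATTACHMENT2
    };
    glDrawBuffers(3, drawBuffers);

    // Regular depth buffer so the geometry pass can depth-test as usual.
    glGenRenderbuffers(1, &gb.depthRbo);
    glBindRenderbuffer(GL_RENDERBUFFER, gb.depthRbo);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, gb.depthRbo);

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    return gb;
}
```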

Once I have these textures, I can run a fragment shader over every pixel on the screen and compute its shading from all the scene lights. This reduces the complexity of the lighting calculations from O(L * N) to O(L * P), for L lights, N game objects, and P total pixels on the screen (in short, adding objects to the scene no longer affects shading performance significantly). The cost is that some material properties, such as transparency, become harder to represent, but that can be fixed with some work.
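Here's a correspondingly rough sketch of that lighting pass, assuming the GBuffer struct from the sketch above plus an already-compiled lighting shader and a fullscreen-quad VAO (all placeholder names). The point is simply that the G-buffer is sampled once per screen pixel, so the cost scales with lights times pixels rather than lights times objects.

```cpp
// Lighting pass sketch: sample the G-buffer once per screen pixel and let the
// fragment shader accumulate every light there. Shader and uniform names are
// placeholders, not the engine's actual ones.
void LightingPass(const GBuffer& gb, GLuint lightingShader,
                  GLuint fullscreenQuadVao, int numLights)
{
    glBindFramebuffer(GL_FRAMEBUFFER, 0);  // render to the screen
    glClear(GL_COLOR_BUFFER_BIT);
    glUseProgram(lightingShader);

    // Expose the three G-buffer textures to the lighting shader.
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, gb.position);
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, gb.normal);
    glActiveTexture(GL_TEXTURE2);
    glBindTexture(GL_TEXTURE_2D, gb.material);
    glUniform1i(glGetUniformLocation(lightingShader, "gPosition"), 0);
    glUniform1i(glGetUniformLocation(lightingShader, "gNormal"), 1);
    glUniform1i(glGetUniformLocation(lightingShader, "gMaterial"), 2);
    glUniform1i(glGetUniformLocation(lightingShader, "numLights"), numLights);
    // (light positions and colors would be uploaded here as well)

    // One fullscreen quad; the fragment shader runs once per pixel and
    // loops over the lights there, hence the O(L * P) cost.
    glBindVertexArray(fullscreenQuadVao);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    glBindVertexArray(0);
}
```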

Having said all this, I'll leave you with the mystery of who the creature in the screenshots might be. It has something to do with trees.


JUL 6

Alien Landscapes

Working on the look of the world. Blue suits it perfectly. Very, very alpha-stage renders. Absurd amounts of grain, as always.


JUL 6

Terrain Gen Update

If you had a computer with infinite memory, you could walk forever in this stage of the game. I programmed a chunk-based terrain generation system that generates simple, wavy plains stretching out, in principle, as far as the eye can see (a little closer at the moment, for testing purposes). Chunks can be generated from scratch, loaded into memory, and displayed as needed.

Each chunk is a square heightmap grid. It is given a random, noisy, wavy shape using the diamond-square algorithm. Chunks generated beside existing chunks have their edges "stitched" together so there are no seams. The props modeled by Luciano are scattered randomly throughout.
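For reference, here's a stripped-down sketch of diamond-square on a single square chunk. It's the textbook version of the algorithm rather than our actual chunk code, and it assumes the grid is (2^n + 1) points on a side; edge stitching would then overwrite the shared border row/column with the neighboring chunk's values.

```cpp
// Stripped-down diamond-square heightmap for one square chunk.
// Grid must be (2^n + 1) points on a side. Textbook version, not our exact code.
#include <vector>
#include <random>

std::vector<float> DiamondSquare(int size /* e.g. 65 */, float roughness, unsigned seed)
{
    std::vector<float> h(size * size, 0.0f);
    std::mt19937 rng(seed);
    auto rnd = [&](float amp) {
        return std::uniform_real_distribution<float>(-amp, amp)(rng);
    };
    auto at = [&](int x, int y) -> float& { return h[y * size + x]; };

    // Seed the four corners.
    at(0, 0) = rnd(1.0f);
    at(size - 1, 0) = rnd(1.0f);
    at(0, size - 1) = rnd(1.0f);
    at(size - 1, size - 1) = rnd(1.0f);

    float amp = roughness;
    for (int step = size - 1; step > 1; step /= 2, amp *= 0.5f) {
        int half = step / 2;

        // Diamond step: center of each square = average of its corners + noise.
        for (int y = half; y < size; y += step)
            for (int x = half; x < size; x += step)
                at(x, y) = 0.25f * (at(x - half, y - half) + at(x + half, y - half) +
                                    at(x - half, y + half) + at(x + half, y + half)) + rnd(amp);

        // Square step: midpoint of each edge = average of its neighbors + noise.
        for (int y = 0; y < size; y += half)
            for (int x = (y + half) % step; x < size; x += step) {
                float sum = 0.0f;
                int count = 0;
                if (x - half >= 0)   { sum += at(x - half, y); count++; }
                if (x + half < size) { sum += at(x + half, y); count++; }
                if (y - half >= 0)   { sum += at(x, y - half); count++; }
                if (y + half < size) { sum += at(x, y + half); count++; }
                at(x, y) = sum / count + rnd(amp);
            }
    }
    return h;  // row-major heights; stitching would overwrite the shared edges
}
```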

We aren't very happy with the grid-like look of the terrain. In the future, we want to replace it with a more triangular look. I plan on slicing up the heightmap surface with a not-so-random triangulated mesh. We also plan on designing and adding more interesting landscape features, like mountain ridges and depressions.


JUL 5

Rocks

I just realized we are not making a "low poly" looking game anymore. If you take a look at these rocks from a distance, you'll notice they're blooming with detail.

Falling in love with the (now very popular) excessively low-poly retro look is very easy. But if you're not careful, those huge, simple triangles can work against the intentions of your project! If you want richer content, add a bit of detail.

To bake the features into such simple geometry, I worked the initial meshes in ZBrush. The bluish background in the first pic is borrowed from Cinema 4D for testing purposes.


JUN 27

PhysX Demo

Project Morph now officially supports NVIDIA PhysX! After struggling with annoyances such as C runtime library linkage options on Windows, I finally managed to build the PhysX library from source and integrate it properly into the game engine.

This lovely showcase video was captured and edited by Luciano. Props to him for the amazing montage.


JUN 23

Basic lighting engine

I've had this done for about a week now, but we've both been busy shifting our focus back to the game itself rather than our website. In any case, I've built a very basic lighting system, which has, so far, four very standard components:

1. Ambient light: A flat, uniform light that reaches all objects in the scene. In the future, this will contain an ambient occlusion component, which we believe will be one of the most important components of our lighting engine.

2. Diffuse reflection: This is a light component that depends only on the orientation of the surface being lit with respect to the light. This supports both directional lights and point (omni-directional) lights.

3. Specular reflection: This is a more "metallic" light component. It occurs when light from a source reflects off a surface directly into the viewer's eye. Thus, it depends on the position of the light, the orientation of the surface, and the position of the viewer. This supports both directional lights and point lights. (There's a small sketch of the diffuse and specular math after this list.)

4. Hard-edged shadows: What do you know? Lights cast shadows. Computing these is tricky, especially getting the edges to soften in a nice-looking way. This is still very much a work in progress. So far, only directional lights cast shadows. Point lights require an omnidirectional shadow map, represented by a cube map texture. Haven't quite gotten there yet.
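For the curious, the diffuse and specular terms above boil down to a couple of dot products per light. Here's a small sketch of a single point light evaluated at one surface point, written on the CPU with GLM just to show the math; in the engine this kind of thing runs per fragment, and the exact shading model and all the names here are only illustrative.

```cpp
// Rough Phong-style point-light evaluation at one surface point.
// CPU-side with GLM purely for illustration; names and model are not
// necessarily what the engine's shaders use.
#include <glm/glm.hpp>
#include <algorithm>
#include <cmath>

struct PointLight {
    glm::vec3 position;
    glm::vec3 color;
};

glm::vec3 ShadePoint(const glm::vec3& surfacePos, const glm::vec3& surfaceNormal,
                     const glm::vec3& eyePos, const PointLight& light,
                     const glm::vec3& albedo, float kDiffuse, float kSpecular,
                     float shininess, const glm::vec3& ambient)
{
    glm::vec3 n = glm::normalize(surfaceNormal);
    glm::vec3 toLight = glm::normalize(light.position - surfacePos);
    glm::vec3 toEye = glm::normalize(eyePos - surfacePos);

    // 1. Ambient: flat, uniform term.
    glm::vec3 result = ambient * albedo;

    // 2. Diffuse: depends only on the angle between the surface and the light.
    float diff = std::max(glm::dot(n, toLight), 0.0f);
    result += kDiffuse * diff * albedo * light.color;

    // 3. Specular: light bouncing off the surface straight into the viewer's eye.
    glm::vec3 reflected = glm::reflect(-toLight, n);
    float spec = std::pow(std::max(glm::dot(reflected, toEye), 0.0f), shininess);
    result += kSpecular * spec * light.color;

    return result;  // 4. shadows would scale the diffuse/specular terms by visibility
}
```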


JUN 13

Landing page animation

Glitch art looks amazing, and so does film grain. A combination of both sounds perfect. This is what Luciano came up with for our logo on the new website's landing page. I really liked it. However, we soon realized it wasn't really practical: it was an MP4 file of about 50 MB, and with more glitch transitions, it would have been even larger (disclaimer: there were probably better codecs and/or export options, but I believe my technique ends up beating most video formats in file size, simply because it's tailored very specifically to our needs).

The solution: I programmed what you could consider a very specific codec for our logo animation. It has two main features:

1. A custom film grain overlay. This is calculated client-side (in JavaScript), and it's what "compresses" the animation the most. Instead of sending a 10-second MP4 of a still image with film grain, I simply send the image, and the client's computer is kind enough to compute the film grain effect for me. Thanks, JavaScript!

2. A frame-by-frame animation system for each glitch transition. The most important part is that the frames for the transition animation are downloaded in the background, while a still image with film grain is being displayed. I can delay the transition until all the frames are loaded. This is usually not necessary, because the image files by themselves aren't that big.

This "codec" (sounds like a bigger deal when I call it a codec) did involve a lot of nasty hard-coding, but it's readable and easily changeable. It reduced the amount of data that you need to download in the landing page from about 50MB to about 8MB. I'm also quite certain that it's one of the most efficient ways to transfer this information.


JUN 5

Things got too crazy

Okay, backtrack…

I might have gotten a little too crazy by starting with the aesthetics of the ant tunnels. In the last log, I coded some procedural generation for the way the tunnels look (entangled cylinders that move randomly around a cylindrical shell to form spaghetti-textured walls). However, this has many possible complications and, in the end, is almost purely aesthetic. I've decided to take a step back and think about more practical issues, such as coding the procedural generation of the overall tunnel structure. This should be simpler and much more practical, and it will let us implement other important things, such as the actual ant NPCs.

I also really need to make some more progress on the game engine itself.


JUN 2

Two more major updates

1. Game engine from scratch. Due to some of the game’s main characteristics – most importantly, the low-poly graphical style and the need for different world scales (microbiomes) – Unity is looking less and less promising as a main game engine. I’ll be working on our own C++ / OpenGL game engine, built from scratch. Wish me luck!

2. The ant ecosystem. Ant tunnels will be long, cylindrical, procedurally generated structures beneath the surface world. They will be made up of narrow, intertwined branches, forming a texture similar to a block of instant ramen. Both the overall structure of the tunnels and the small cylindrical tubes that make them up will be procedurally generated. I’ve written a simple MATLAB script that generates a cylindrical shell of N tubes that rotate randomly but remain within a ring of a specified width. These N branches form a dense layer, a cylindrical shell, which will serve as the ant tunnels.
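Purely as an illustration of the idea (the real thing is a MATLAB script and differs in the details), here's a rough C++ sketch that walks N tube centerlines along the tunnel axis: each tube's angle drifts randomly around the shell while its radius stays clamped inside a ring of the given width.

```cpp
// Illustration only: generate N tube centerlines that wander around a
// cylindrical shell, staying within a ring of the given width.
// The actual generator is a MATLAB script; names and constants are made up.
#include <vector>
#include <random>
#include <cmath>

struct Vec3 { float x, y, z; };

std::vector<std::vector<Vec3>> GenerateTubeShell(int numTubes, int steps /* >= 2 */,
                                                 float shellRadius, float ringWidth,
                                                 float tunnelLength, unsigned seed)
{
    std::mt19937 rng(seed);
    std::uniform_real_distribution<float> angleJitter(-0.15f, 0.15f);
    std::uniform_real_distribution<float> radiusJitter(-0.05f, 0.05f);
    std::uniform_real_distribution<float> startAngle(0.0f, 6.2831853f);

    std::vector<std::vector<Vec3>> tubes(numTubes);
    for (int t = 0; t < numTubes; t++) {
        float theta = startAngle(rng);
        float r = shellRadius;
        tubes[t].reserve(steps);
        for (int i = 0; i < steps; i++) {
            float z = tunnelLength * i / (steps - 1);  // advance along the tunnel axis

            // Random walk around the shell, clamped to the ring.
            theta += angleJitter(rng);
            r += radiusJitter(rng);
            float rMin = shellRadius - 0.5f * ringWidth;
            float rMax = shellRadius + 0.5f * ringWidth;
            if (r < rMin) r = rMin;
            if (r > rMax) r = rMax;

            tubes[t].push_back({ r * std::cos(theta), r * std::sin(theta), z });
        }
    }
    return tubes;  // each centerline would then be swept into a thin cylinder mesh
}
```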


JUN 1

Headmaster v2.

Slowly approaching the final sculpt.


MAY 31

Ascended Being

The only character you're going to be able to have a conversation with in this game is a huge floating head.

THE HEADMASTER is a dark, timeless voice of the occult and the pilot of the main character's ship. More to come...


MAY 31

Two new features

1. First-person “soul” controller. Three creatures in the game support this. When one of them is clicked, the player's first-person perspective and controls transfer to the clicked entity. Each creature has a unique controller that reacts differently to player input.

2. Crawling. The ant creature is tiny and crawls through the low-poly landscape. It can even climb up trees! It uses an internal gravity vector that adjusts smoothly so that it points opposite to the ground’s surface normal; the surface is detected using a simple downward ray cast (the vector math is sketched below). Collision is handled through Unity’s built-in colliders. It’s a little buggy on very sharp edges, which are very common in low-poly models. It’s also tricky to detect when the ant is standing stably on a surface.
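Since the actual implementation lives inside Unity, here's only a small C++/GLM sketch of the gravity-vector idea: blend the current gravity direction a little each frame toward the negated surface normal returned by the downward ray cast. The names, the blend rate, and the function shape are all made up for illustration.

```cpp
// Illustration only: smoothly re-aim the ant's internal gravity vector so it
// points opposite the surface normal under it. The game does this in Unity;
// names and the blend rate here are invented.
#include <glm/glm.hpp>

glm::vec3 UpdateGravity(const glm::vec3& currentGravity,
                        const glm::vec3& groundNormal,  // from the downward ray cast
                        float gravityStrength, float blendRate, float dt)
{
    // Target: gravity pointing straight into the surface the ant is crawling on.
    glm::vec3 targetDir = -glm::normalize(groundNormal);
    glm::vec3 currentDir = glm::normalize(currentGravity);

    // Blend the direction a little each frame so crossing a sharp edge
    // doesn't make gravity snap instantly.
    float t = glm::clamp(blendRate * dt, 0.0f, 1.0f);
    glm::vec3 newDir = glm::normalize(glm::mix(currentDir, targetDir, t));

    return newDir * gravityStrength;
}
```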


MAY 30

Test #1: Morph Prototype

Figuring out low poly. Very alpha-stage development. Walkable environment made in Unity.