<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:media="http://search.yahoo.com/mrss/"
	
	>

<channel>
	<title>Matt Ebb</title>
	<link>https://mattebb.cargo.site</link>
	<description>Matt Ebb</description>
	<pubDate>Tue, 15 Aug 2023 12:16:01 +0000</pubDate>
	<generator>https://mattebb.cargo.site</generator>
	<language>en</language>
	
		
	<item>
		<title>Slices</title>
				
		<link>https://mattebb.cargo.site/Slices</link>

		<pubDate>Tue, 15 Aug 2023 12:16:01 +0000</pubDate>

		<dc:creator>Matt Ebb</dc:creator>

		<guid isPermaLink="true">https://mattebb.cargo.site/Slices</guid>

		<description>2023
commission





&#38;gt;&#38;nbsp;Project breakdown, presented at SIGGRAPH 2023
Slices

	Slices is a short animation exploring the relationships of geometry with the dimensional space it can exist in. The project juxtaposes physical simulation manipulated through physically impossible spaces, visually deconstructing 3D computer graphics.
	The project was commissioned by SideFX Software with an open brief, the main goal being to stress-test the in-development version of Houdini for Apple Silicon CPUs in a practical scenario, from start to finish.



Four dimensional geometry
	The piece takes lower dimensional forms, pushing and distorting them through four spatial dimensions, before slicing the result back into three dimensional space.

	Each shot contains different themes, manipulations, and dimensional objects being transitioned through different types of dimensional space, ranging from simple rotations, through to a 3D analogue of slit-scan video, where the one piece of geometry exists at multiple points in time simultaneously, in different areas in the frame.

&#60;img width="1920" height="960" width_o="1920" height_o="960" data-src="https://freight.cargo.site/t/original/i/8cdf8b26a1d21d03e95c7939c7ebdb62105b4944f3785090e784f52fd73593d8/slices_2.3.1.jpg" data-mid="188354674" border="0"  src="https://freight.cargo.site/w/1000/i/8cdf8b26a1d21d03e95c7939c7ebdb62105b4944f3785090e784f52fd73593d8/slices_2.3.1.jpg" /&#62;
&#60;img width="1920" height="960" width_o="1920" height_o="960" data-src="https://freight.cargo.site/t/original/i/12a6623ff63c028b1b9576d5f6034c019914473755740f54d1fe911887d85f4d/slices_2.3.2.jpg" data-mid="188354675" border="0"  src="https://freight.cargo.site/w/1000/i/12a6623ff63c028b1b9576d5f6034c019914473755740f54d1fe911887d85f4d/slices_2.3.2.jpg" /&#62;
&#60;img width="1920" height="960" width_o="1920" height_o="960" data-src="https://freight.cargo.site/t/original/i/3925b880039abaca63c741f9ad4ec9d1bc25b20dccb4a49519e848f6ec7a35f0/slices_2.3.3.jpg" data-mid="188354676" border="0"  src="https://freight.cargo.site/w/1000/i/3925b880039abaca63c741f9ad4ec9d1bc25b20dccb4a49519e848f6ec7a35f0/slices_2.3.3.jpg" /&#62;


&#60;img width="1920" height="960" width_o="1920" height_o="960" data-src="https://freight.cargo.site/t/original/i/d40bfb3a02f9c3a6033aa37fc903dbaf3cedd693cd702f3a1dd7c4225a7c5efa/slices_1.5.1.jpg" data-mid="188354667" border="0"  src="https://freight.cargo.site/w/1000/i/d40bfb3a02f9c3a6033aa37fc903dbaf3cedd693cd702f3a1dd7c4225a7c5efa/slices_1.5.1.jpg" /&#62;
&#60;img width="1920" height="960" width_o="1920" height_o="960" data-src="https://freight.cargo.site/t/original/i/3820d2b778da12633a8ac2c5e3d8770164423f63c77985e24109397e9d89e83d/slices_1.5.2.jpg" data-mid="188354668" border="0"  src="https://freight.cargo.site/w/1000/i/3820d2b778da12633a8ac2c5e3d8770164423f63c77985e24109397e9d89e83d/slices_1.5.2.jpg" /&#62;


&#60;img width="1920" height="960" width_o="1920" height_o="960" data-src="https://freight.cargo.site/t/original/i/bb025359b502588dd66a3c0861cf09fbf088e7e0cfdbf64df8fa66df86ca67ef/slices_1.3.1.jpg" data-mid="188354666" border="0"  src="https://freight.cargo.site/w/1000/i/bb025359b502588dd66a3c0861cf09fbf088e7e0cfdbf64df8fa66df86ca67ef/slices_1.3.1.jpg" /&#62;
&#60;img width="1920" height="960" width_o="1920" height_o="960" data-src="https://freight.cargo.site/t/original/i/eec1268f77bf8b001b8805a9cd45b6acfa706f367f139ce720960cb31ad9767d/slices_1.2.1.jpg" data-mid="188354665" border="0"  src="https://freight.cargo.site/w/1000/i/eec1268f77bf8b001b8805a9cd45b6acfa706f367f139ce720960cb31ad9767d/slices_1.2.1.jpg" /&#62;
&#60;img width="1920" height="960" width_o="1920" height_o="960" data-src="https://freight.cargo.site/t/original/i/a94a72a9e3c1879400e5629e251171cf6eae7fcc356a0ae7ec9a6a2fa65c133b/slices_2.1.1.jpg" data-mid="188354672" border="0"  src="https://freight.cargo.site/w/1000/i/a94a72a9e3c1879400e5629e251171cf6eae7fcc356a0ae7ec9a6a2fa65c133b/slices_2.1.1.jpg" /&#62;

&#60;img width="1920" height="960" width_o="1920" height_o="960" data-src="https://freight.cargo.site/t/original/i/84b72fa7e6317c4c65fd71446ebb07e748ec077627e5a099a5196effabad71f5/slices_2.2.1.jpg" data-mid="188354673" border="0"  src="https://freight.cargo.site/w/1000/i/84b72fa7e6317c4c65fd71446ebb07e748ec077627e5a099a5196effabad71f5/slices_2.2.1.jpg" /&#62;
&#60;img width="1920" height="960" width_o="1920" height_o="960" data-src="https://freight.cargo.site/t/original/i/3ac16f05de33123c5c7118431dfe83967cf0f6600299ae5f183c6aa95bf1c109/slices_1.9.1.jpg" data-mid="188354671" border="0"  src="https://freight.cargo.site/w/1000/i/3ac16f05de33123c5c7118431dfe83967cf0f6600299ae5f183c6aa95bf1c109/slices_1.9.1.jpg" /&#62;
&#60;img width="1920" height="960" width_o="1920" height_o="960" data-src="https://freight.cargo.site/t/original/i/22eb26d5ed4ff2711ac3542cbc6a30c918678ae7a55d539cc11a2ce2e7e35086/slices_2.4.3.jpg" data-mid="188354678" border="0"  src="https://freight.cargo.site/w/1000/i/22eb26d5ed4ff2711ac3542cbc6a30c918678ae7a55d539cc11a2ce2e7e35086/slices_2.4.3.jpg" /&#62;</description>
		
	</item>
		
		
	<item>
		<title>Wandering Nets</title>
				
		<link>https://mattebb.cargo.site/Wandering-Nets</link>

		<pubDate>Sat, 05 Mar 2022 00:45:20 +0000</pubDate>

		<dc:creator>Matt Ebb</dc:creator>

		<guid isPermaLink="true">https://mattebb.cargo.site/Wandering-Nets</guid>

		<description>2022

personal









An iteration of the realtime generative system - mouse drag to explore
	




Wandering Nets


	Wandering Nets is a generative art piece resulting from personal research and exploration in realtime volume rendering. Networks of light are created and connected via random walks in space and colour. Emergent structures are revealed by the ambient interaction, tension and collision of these networks within themselves and orbiting shadowed spherical bodies.





	The piece runs in realtime in the browser using WebGL. Scene data is generated and animated with JavaScript and sent through to a single GLSL fragment shader, which renders the final pixels to a fullscreen quad. No external libraries are used; it’s a bespoke creation tuned for purpose.




	
I’ve dabbled in OpenGL over the years for various purposes, but this is the first fully realtime, interactive creative project I’ve executed. While my work often involves creative coding with procedural and generative techniques, it’s usually complex, heavy, and rendered with offline renderers, so it was both challenging and exciting to work in this medium.

The piece was released on fxhash, an online platform for generative art which allows users to generate and collect individual and unique iterations of code-based artwork - the edition contains over a hundred different outputs.





&#60;img width="960" height="800" width_o="960" height_o="800" data-src="https://freight.cargo.site/t/original/i/77d4319b8f653a67d80aef9aa0ad46255708246158dc216aa0f4c16aef9209e4/wn03s.jpg" data-mid="135532043" border="0"  src="https://freight.cargo.site/w/960/i/77d4319b8f653a67d80aef9aa0ad46255708246158dc216aa0f4c16aef9209e4/wn03s.jpg" /&#62;
&#60;img width="960" height="800" width_o="960" height_o="800" data-src="https://freight.cargo.site/t/original/i/72b1e612944c532920ad70dde6ac2f7a1330fe528754801dbfd5e707b008f8bf/wn05s.jpg" data-mid="135532041" border="0"  src="https://freight.cargo.site/w/960/i/72b1e612944c532920ad70dde6ac2f7a1330fe528754801dbfd5e707b008f8bf/wn05s.jpg" /&#62;



Inspiration



	My original idea was to create an atmospheric light sculpture, with colour radiating and mixing through fog. I wanted to take advantage of the realtime medium and create something dynamic that would randomly, slowly unfold into different shapes and compositions over time. I’ve always been fascinated by the idea of fluorescent lights glowing on their own underneath high voltage power lines, energised by the magnetic field - this provided some inspiration, but more interesting was the idea of a network of lights, with connectivity and interaction.
	On their own the lights were interesting, but they needed more of a sense of contrast, tension and rhythm. The circling orbs visually provide figure and ground, silhouette and shape, and heaviness in opposition to the lightness of the beams. This is true not just of the visual composition, but of their animated behaviour - the lights collide and deflect off the orbs, but the orbs are unaffected by the lights.

After a while, the scenes began to evoke an idea of constellations and planets, a kind of celestial diagram. This inspired the name Wandering Nets, referencing the original Greek planetes, meaning “wandering stars”.




&#60;img width="960" height="800" width_o="960" height_o="800" data-src="https://freight.cargo.site/t/original/i/7a007b3d6ab9ab03819b5bf397caa51db50524127bee4190fa2550203d997d55/wn01s.jpg" data-mid="135531837" border="0"  src="https://freight.cargo.site/w/960/i/7a007b3d6ab9ab03819b5bf397caa51db50524127bee4190fa2550203d997d55/wn01s.jpg" /&#62;
&#60;img width="960" height="800" width_o="960" height_o="800" data-src="https://freight.cargo.site/t/original/i/790e03bed42ab9355eaaa5173ff5d8b75109b77fa331d676fa8998b8769511c3/wn02s.jpg" data-mid="135531836" border="0"  src="https://freight.cargo.site/w/960/i/790e03bed42ab9355eaaa5173ff5d8b75109b77fa331d676fa8998b8769511c3/wn02s.jpg" /&#62;






Variation



	The light network structures are generated via a random walk process. Starting with a single point, edges are created and stored in a graph data structure. Each edge makes two different random decisions. For the first edge vertex, it chooses to either:
Create a new starting point in space

Share an existing vertex in the graph and branch off

... and then either:
Leave the endpoint floating freely, offset a random distance from the first vertex

Connect the endpoint back to an appropriate existing point in the graph

These rules create a range of different network topologies, from discrete disconnected segments, to chains of branching edges, and in the rarer cases, fully consolidated shapes such as a tetrahedron.
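As an illustrative sketch of those two random decisions (the names, structure and probabilities here are my own, not the project’s actual code):

```javascript
// Sketch of the random-walk network generation described above.
// Names (buildNetwork, randomPoint, etc.) are illustrative only.
function randomPoint() {
  return { x: Math.random(), y: Math.random(), z: Math.random(), edges: [] };
}

function offsetPoint(p, scale) {
  return {
    x: p.x + (Math.random() - 0.5) * scale,
    y: p.y + (Math.random() - 0.5) * scale,
    z: p.z + (Math.random() - 0.5) * scale,
    edges: [],
  };
}

function buildNetwork(numEdges) {
  const graph = { points: [], edges: [] };
  for (let i = 0; numEdges > i; i++) {
    // Decision 1: create a new starting point, or branch off an existing vertex.
    let a;
    if (graph.points.length === 0 || Math.random() > 0.5) {
      a = randomPoint();
      graph.points.push(a);
    } else {
      a = graph.points[Math.floor(Math.random() * graph.points.length)];
    }
    // Decision 2: leave the endpoint floating freely, or connect it back
    // to an existing point in the graph.
    let b;
    const existing = graph.points.filter((p) => p !== a);
    if (existing.length === 0 || Math.random() > 0.5) {
      b = offsetPoint(a, 0.3);
      graph.points.push(b);
    } else {
      b = existing[Math.floor(Math.random() * existing.length)];
    }
    const edge = { points: [a, b] };
    a.edges.push(edge);
    b.edges.push(edge);
    graph.edges.push(edge);
  }
  return graph;
}
```

Depending on the random choices, repeated runs produce anything from fully disconnected segments through to consolidated, looping shapes.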


Additional variation is revealed by emergent behaviour in the networks’ animation. A simple, loosely physically based simulation generates orbital motion. The connected networks are constrained by edge tensions and by collisions (between spheres and lights, and between spheres and spheres), which prevent interpenetration and contribute to unpredictability.

The parameters driving the animation and physics are randomised either per-element or per-iteration.






	







Colour schemes are also generated via a random walk process. For each edge in the graph, a random step is taken in the ab plane of the Oklab colour space, and the result is converted back to RGB. Randomly varying the size of the steps, or randomly constraining overall saturation, gives a range of colour schemes varying between desaturated, monochromatic, or highly varied colour selections.
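A minimal sketch of this colour walk (parameter names are illustrative; the Oklab-to-RGB conversion step is omitted here for brevity):

```javascript
// Sketch of the colour random walk in the Oklab ab plane.
// stepSize and maxChroma are illustrative parameters; the conversion
// back to RGB is omitted.
function colourWalk(numEdges, stepSize, maxChroma) {
  const colours = [];
  let a = 0;
  let b = 0;
  for (let i = 0; numEdges > i; i++) {
    // Random step in the ab (chromaticity) plane.
    const angle = Math.random() * Math.PI * 2;
    a += Math.cos(angle) * stepSize;
    b += Math.sin(angle) * stepSize;
    // Constraining overall chroma keeps the scheme desaturated.
    const chroma = Math.hypot(a, b);
    if (chroma > maxChroma) {
      a *= maxChroma / chroma;
      b *= maxChroma / chroma;
    }
    colours.push({ L: 0.75, a: a, b: b }); // fixed lightness for the sketch
  }
  return colours;
}
```

Small steps with a tight chroma limit give near-monochromatic schemes; large steps with no limit give highly varied ones.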


&#60;img width="688" height="506" width_o="688" height_o="506" data-src="https://freight.cargo.site/t/original/i/1eff211aeda550cd494c3b019d84495c5b0bd03657e3939311b750dc8420ec87/abplane.png" data-mid="135727544" border="0" alt="Random walk in the ab plane" data-caption="Random walk in the ab plane" src="https://freight.cargo.site/w/688/i/1eff211aeda550cd494c3b019d84495c5b0bd03657e3939311b750dc8420ec87/abplane.png" /&#62;













Features

	

The fxhash platform has a concept of ‘features’: naming and grouping various traits to classify each individual unique iteration.

Rather than using features prescriptively, to set procedural parameters or presets used in the generative process, I found it more interesting to use them descriptively. When the scene data is first generated, with all the layers of randomisation involved, a secondary process then inspects the scene (i.e. analysing the generated network topology, or doing a simple statistical analysis on the colouring) to then describe what it found. I felt this allowed for a richer set of results than purely hard coding special cases.


	
    // ---- fxhash features

function f_topology(graph) {
    if (graph.points.length == 2*graph.edges.length)
        return "Discrete";

    if (graph.edges.some( ed =&#38;gt; ed.points[0].edges.length == 1 &#38;amp;&#38;amp; 
    	ed.points[1].edges.length == 1))
        return "Varied";

    if (graph.edges.length == 6 &#38;amp;&#38;amp; 
    	graph.points.every( pt =&#38;gt; pt.edges.length == 3 ))
        return "Tetra";

    if (graph.points.every( pt =&#38;gt; pt.edges.length &#38;gt;= 2 ))
    	return "Consolidated";
    else
    	return "Branched";
}

Analysing topology to classify networks



Rendering Scene Data



The generation and animation of the lights is handled in JavaScript using a simple adjacency list graph data structure. Since there are so few elements, speed is not a concern; the structure is mainly for convenience when manipulating and inspecting the network.

After each frame’s animation, the graph is converted to a simpler disconnected representation for rendering, with some common values pre-calculated. These are passed to the fragment shader as arrays of GLSL uniform structs.
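The flattening step can be sketched like this - a hypothetical flattenForRender (my naming, not the project’s) producing the fields of the GLSL Light struct shown below (pa, pb, the precomputed edge vector L and its normalised direction nL); the colour field is left out for brevity:

```javascript
// Sketch: convert the graph to a flat per-edge representation for the
// shader, precomputing the edge vector L and its direction nL.
// Names and the intensity parameter are illustrative.
function flattenForRender(graph, intensity) {
  return graph.edges.map((edge) => {
    const pa = edge.points[0];
    const pb = edge.points[1];
    const L = { x: pb.x - pa.x, y: pb.y - pa.y, z: pb.z - pa.z };
    const len = Math.hypot(L.x, L.y, L.z);
    return {
      intensity: intensity,
      pa: { x: pa.x, y: pa.y, z: pa.z },
      pb: { x: pb.x, y: pb.y, z: pb.z },
      L: L,
      nL: { x: L.x / len, y: L.y / len, z: L.z / len },
    };
  });
}
```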

Rays are then intersected from the camera position against all the objects in these arrays to find the nearest shading point and normal. Volume lighting is calculated along the ray segment in between. Much of the primitive intersection code is derived from Inigo Quilez’s comprehensive examples.


	
struct Light {
	float intensity;
	vec3 colour;
	vec3 pa;
	vec3 pb;
	float w;
	vec3 L;
	vec3 nL;
};

struct Sphere {
	float r;
	vec3 P;
};

#define MAX_LIGHTS 20
uniform int u_numLights;
uniform Lights
{
    Light lights[MAX_LIGHTS];
};

#define MAX_SPHERES 6
uniform int u_numSpheres;
uniform Spheres
{
    Sphere spheres[MAX_SPHERES];
};

GLSL Uniforms






Volumes

Rendering volumes in realtime is challenging, and it’s been a while since I’ve worked on developing a volume renderer, so I wanted to use up to date techniques. The biggest challenge was getting it to be performant without compromising too much on the level of visual quality I wanted. Optimising it involved two main strategies:
Being smart (using effective sampling strategies optimised for low sample counts)
And being dumb (hacking and cheating to avoid spending cycles where they won’t be noticed)

	The dumb part is mostly about avoiding work wherever I could get away with it. For example:
Limiting pixel samples in dark areas - after taking a few samples from each light, if a pixel is still dark it stops taking further samples. If you expose up you can see noise in the shadows, but it’s not noticeable at normal brightness.
Not bothering to send visibility rays if the remaining terms of the current sample’s rendering equation (eg. geometric term/inverse square falloff/pdf) will provide low radiance anyway
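The dark-pixel early-out can be sketched as follows (illustrative JavaScript - in the piece this logic lives in the GLSL shader, and sampleLight here is a stand-in for the real light sampling):

```javascript
// Sketch of the dark-pixel early-out: after a few samples of every
// light, stop sampling pixels that are still dark, since noise there
// is invisible at normal exposure. Thresholds are illustrative.
function shadePixel(lights, maxSamples, darkThreshold, sampleLight) {
  let radiance = 0;
  let taken = 0;
  for (let s = 0; maxSamples > s; s++) {
    for (let i = 0; lights.length > i; i++) {
      radiance += sampleLight(lights[i], s);
      taken++;
    }
    // Give up early on pixels that remain dark after a few rounds.
    if (s >= 2) {
      if (darkThreshold > radiance / taken) break;
    }
  }
  return radiance / taken;
}
```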
&#60;img width="958" height="519" width_o="958" height_o="519" data-src="https://freight.cargo.site/t/original/i/96168bab6494b915c3d460379e24466d3263681dd90025bb07dadec65afc7049/noisy_shadows.png" data-mid="135738877" border="0" alt="Undersampled shadow regions (exposed up x8)" data-caption="Undersampled shadow regions (exposed up x8)" src="https://freight.cargo.site/w/958/i/96168bab6494b915c3d460379e24466d3263681dd90025bb07dadec65afc7049/noisy_shadows.png" /&#62;



Light Sampling





Sampling lighting in thin volumes requires specific strategies. Sampling at every step as in a naive ray marcher is slow and wasteful, and importance sampling according to extinction is ineffective, since extinction due to density is not the dominant term. A highly effective strategy is equiangular sampling, presented by Christopher Kulla (Sony Imageworks), which places lighting samples in the volume at distances proportional to where the light contribution is strongest. It’s essentially sampling proportionally to the solid angle projection of the viewing ray on the sphere around the light source.

Although the technique was originally intended for point lights, it extends to other shapes - first find a sample location on the emitter, then use equiangular sampling to find a distance along the ray at which to sample lighting (with optional MIS for areas where the chosen location on the light is suboptimal).
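As a sketch of the idea (my reconstruction from Kulla’s paper, not the project’s shader code), equiangular sampling of a distance t along a view ray, given a point on the light:

```javascript
// Equiangular sampling (Kulla): choose a distance t along the ray,
// distributed proportionally to the solid angle around the light.
// Sketch in JavaScript; the project does this in GLSL.
function dot3(a, b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

function equiangularSample(rayOrigin, rayDir, lightPos, tMin, tMax, u) {
  const toLight = {
    x: lightPos.x - rayOrigin.x,
    y: lightPos.y - rayOrigin.y,
    z: lightPos.z - rayOrigin.z,
  };
  // Distance along the ray of the point closest to the light.
  const delta = dot3(toLight, rayDir);
  // Perpendicular distance from the light to the ray.
  const D = Math.sqrt(Math.max(1e-12, dot3(toLight, toLight) - delta * delta));
  // Angles subtended by the segment ends, as seen from the light.
  const thetaA = Math.atan2(tMin - delta, D);
  const thetaB = Math.atan2(tMax - delta, D);
  // Sample uniformly in angle, then map back to a distance.
  const theta = thetaA + u * (thetaB - thetaA);
  const t = delta + D * Math.tan(theta);
  const pdf = D / ((thetaB - thetaA) * (D * D + (t - delta) * (t - delta)));
  return { t: t, pdf: pdf };
}
```

Since the pdf integrates to one over [tMin, tMax], estimates weighted by 1/pdf stay unbiased while concentrating samples near the light’s closest approach to the ray.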




	

 


&#60;img width="959" height="799" width_o="959" height_o="799" data-src="https://freight.cargo.site/t/original/i/2503ed6ca9d451246b909b2bc047150cf9a59ee9a6be4bb56ce52516c39153a8/s_hash_extinction_64.jpg" data-mid="135937686" border="0" alt="Random hash, extinction distance sampling (64 spp)" data-caption="Random hash, extinction distance sampling (64 spp)" src="https://freight.cargo.site/w/959/i/2503ed6ca9d451246b909b2bc047150cf9a59ee9a6be4bb56ce52516c39153a8/s_hash_extinction_64.jpg" /&#62;
&#60;img width="959" height="799" width_o="959" height_o="799" data-src="https://freight.cargo.site/t/original/i/a5940832f1a4f12f7724857d14543f0b8f9840e6afb04fd3346f4fcc789ddac8/s_hash_uniform_64.jpg" data-mid="135937684" border="0" alt="Random hash, uniform distance sampling (64 spp)" data-caption="Random hash, uniform distance sampling (64 spp)" src="https://freight.cargo.site/w/959/i/a5940832f1a4f12f7724857d14543f0b8f9840e6afb04fd3346f4fcc789ddac8/s_hash_uniform_64.jpg" /&#62;
&#60;img width="959" height="799" width_o="959" height_o="799" data-src="https://freight.cargo.site/t/original/i/e60eb95305d78103ed18437704adf787d4fb841252de19df6a48b543db92e3a1/s_hash_64.jpg" data-mid="135837692" border="0" alt="Random hash, equiangular distance sampling (64 spp)" data-caption="Random hash, equiangular distance sampling (64 spp)" src="https://freight.cargo.site/w/959/i/e60eb95305d78103ed18437704adf787d4fb841252de19df6a48b543db92e3a1/s_hash_64.jpg" /&#62;

 





Rather than just specifically for distance sampling in a volume, equiangular sampling can be thought of as a generic method for importance sampling a line segment relative to the solid angle as seen from a point. I found it’s possible to reuse this technique in other situations that I haven’t yet seen documented - one example is flipping it around to importance sample line lights from single shading points on solid geometry. I’ve added a simple demonstration of this on shadertoy:

 

	

Importance sampling line by solid angle from surface shading point



Another potential use case might be in initially sampling a position on a line light relative to the view ray, before then sampling a distance along the ray. By projecting the line light against the plane defined by the view ray, we can collapse the ray into a single point, giving us a point-line segment relationship where we can sample by solid angle using the same techniques. After finding a position along the light line segment in projected space, we can then map that back to the original light to find a sample position in 3D.

The end result is that areas on the light source close to the ray are sampled more densely than areas further from the ray, concentrating work where it will have more of an effect on the final radiance.

This works well for infinite rays, but for clipped view rays (eg. if there is an occluding object) it reveals artifacts. This makes sense: if a light lies beyond the end of the clipped ray, its solid angle projection is different from what it would be if the light were more ‘perpendicular’ to a long/infinite ray, as this technique assumes. In the end, while promising, I didn’t find a solution for these artifacts, so I didn’t use it for this project. I think there’s potential though, so if anyone smarter than me has ideas, please take a look at the shadertoy and share your thoughts.
	
&#60;img width="766" height="639" width_o="766" height_o="639" data-src="https://freight.cargo.site/t/original/i/a1625c6a939c7cc1e68d96d5ca234233ccda0d8e81d0e499085b5b146b41ef2a/solidangle.png" data-mid="135726366" border="0"  src="https://freight.cargo.site/w/766/i/a1625c6a939c7cc1e68d96d5ca234233ccda0d8e81d0e499085b5b146b41ef2a/solidangle.png" /&#62;


Importance sampling line by solid angle from collapsed view ray






(Quasi) Random Sequences



Raytracing/pathtracing requires liberal use of random numbers. In this case, for each pixel sample choosing:
A ray offset for antialiasing
A light to use
A position on the light
A distance along the view ray to sample radiance
Often better than pure random numbers are low-discrepancy (quasi-random) sequences. These exhibit similar properties to random numbers, but with a more even distribution that reduces noise. The last time I looked deeply into low-discrepancy sequences was about a decade ago; since then Martin Roberts has presented the quite brilliant R-sequence, which is simple and has many desirable qualities.
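For reference, the R-sequences are trivial to implement - a sketch, with constants from Roberts’ article (R1 uses the golden ratio, R2 the plastic number):

```javascript
// Martin Roberts' R-sequences: additive recurrences based on powers of
// the generalised golden ratio. R1 is 1D, R2 is 2D.
const PHI1 = 1.618033988749895;   // golden ratio, for R1
const PHI2 = 1.3247179572447458;  // "plastic number", for R2

function r1(n) {
  // n-th element of the 1D low-discrepancy sequence, in [0, 1).
  return (0.5 + n / PHI1) % 1;
}

function r2(n) {
  // n-th element of the 2D sequence, each component in [0, 1).
  const a1 = 1 / PHI2;
  const a2 = 1 / (PHI2 * PHI2);
  return [(0.5 + n * a1) % 1, (0.5 + n * a2) % 1];
}
```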


I initially substituted the R-sequence in place of a random hash function, resetting/offsetting into the R-sequence for each pixel and stepping through the sequence for each pixel sample. Because the sequence was randomly disconnected between pixels, this was not giving much better results than a pure random hash.



Due to the way new samples cover the domain by filling in the areas between existing samples, randomly indexing the sequence can leave adjacent pixels using parts of it with similar patterns. This introduces noise through ‘clumping’ of results - for example, a few adjacent pixels all sampling bright areas on a light.


I’d seen examples of people using the R1 sequence for dithering. Rather than a random offset into the sequence per pixel, one can use the index of a space-filling curve like the Hilbert curve as it covers the pixel grid. This takes advantage of spatial coherence - nearby pixels get adjacent indices in the R1 sequence, which are evenly distributed and reduce noisy clumping.




	All the instances of this technique that I’d seen though, only dealt with a single sample per pixel, so I sketched out some tests on shadertoy with various ways of combining/averaging results from multiple samples per pixel.







The best results came from continuing the sequence ‘through’ the pixel, incrementing the sequence offset for each pixel sample before continuing on to the next Hilbert index.
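A sketch of this indexing scheme - the Hilbert-curve mapping follows the well-known iterative xy-to-index algorithm (function names are mine), and the sequence simply continues through each pixel’s samples:

```javascript
// Map pixel (x, y) to its index along a Hilbert curve covering an
// n-by-n grid (n a power of two). Standard iterative algorithm,
// written without bitwise operators.
function hilbertIndex(n, x, y) {
  let d = 0;
  for (let s = Math.floor(n / 2); s > 0; s = Math.floor(s / 2)) {
    const rx = Math.floor(x / s) % 2;
    const ry = Math.floor(y / s) % 2;
    d += s * s * [[0, 1], [3, 2]][rx][ry];
    // Rotate/reflect the quadrant so the curve stays continuous.
    if (ry === 0) {
      if (rx === 1) {
        x = n - 1 - x;
        y = n - 1 - y;
      }
      const tmp = x; x = y; y = tmp;
    }
  }
  return d;
}

// Continue the R1 sequence "through" the pixel: one sequence step per
// pixel sample, then on to the next Hilbert index.
function pixelSampleValue(n, x, y, sampleIndex, samplesPerPixel) {
  const seqIndex = hilbertIndex(n, x, y) * samplesPerPixel + sampleIndex;
  const alpha = 1 / 1.618033988749895; // R1 step (inverse golden ratio)
  return (0.5 + seqIndex * alpha) % 1;
}
```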


&#60;img width="960" height="600" width_o="960" height_o="600" data-src="https://freight.cargo.site/t/original/i/a5f5395d74ba773a355e884fd66321748eb37ed84e262cf65b8719850d37a3c0/hilbert_index.png" data-mid="135840663" border="0" alt="Example R1 sequence indices through Hilbert curve, 3 samples per pixel" data-caption="Example R1 sequence indices through Hilbert curve, 3 samples per pixel" src="https://freight.cargo.site/w/960/i/a5f5395d74ba773a355e884fd66321748eb37ed84e262cf65b8719850d37a3c0/hilbert_index.png" /&#62;

In the case of line lights, I found more improvements in not continuing the sequence through all pixel samples, but separately for each light. This ensured a well distributed set of sampled positions along each emitter, rather than the sequence being scattered across multiple lines. This is implemented with an array of sample counters (one for each light), incremented each time a light is sampled, and then used to calculate the sequence index.



	
&#60;img width="959" height="799" width_o="959" height_o="799" data-src="https://freight.cargo.site/t/original/i/a2e090d6b017f056e84a55887035e25aad7e9980b788718544b4a69365f94bc7/s_final_64.jpg" data-mid="135837694" border="0" alt="Final (64 samples per pixel)" data-caption="Final (64 samples per pixel)" src="https://freight.cargo.site/w/959/i/a2e090d6b017f056e84a55887035e25aad7e9980b788718544b4a69365f94bc7/s_final_64.jpg" /&#62;


	
&#60;img width="959" height="799" width_o="959" height_o="799" data-src="https://freight.cargo.site/t/original/i/60618641a60f1c42b71dde64a079b9f3ab9638d64e20a15c5046e3310e3c987a/s_r1perpixel_64.jpg" data-mid="135837691" border="0" alt="R-sequence, randomly offset per pixel (64spp)" data-caption="R-sequence, randomly offset per pixel (64spp)" src="https://freight.cargo.site/w/959/i/60618641a60f1c42b71dde64a079b9f3ab9638d64e20a15c5046e3310e3c987a/s_r1perpixel_64.jpg" /&#62;
&#60;img width="959" height="799" width_o="959" height_o="799" data-src="https://freight.cargo.site/t/original/i/e60eb95305d78103ed18437704adf787d4fb841252de19df6a48b543db92e3a1/s_hash_64.jpg" data-mid="135837692" border="0" alt="Random hash (64spp)" data-caption="Random hash (64spp)" src="https://freight.cargo.site/w/959/i/e60eb95305d78103ed18437704adf787d4fb841252de19df6a48b543db92e3a1/s_hash_64.jpg" /&#62;
&#60;img width="959" height="799" width_o="959" height_o="799" data-src="https://freight.cargo.site/t/original/i/aa505331a9b71710cf6e8dd6ebeaedaeac3d9fe5bb4336a9089b01b2a4bd73b4/s_hilbert_lightsel_64.jpg" data-mid="135837690" border="0" alt="R-sequence, offset by pixel Hilbert index and samples per pixel (64spp)" data-caption="R-sequence, offset by pixel Hilbert index and samples per pixel (64spp)" src="https://freight.cargo.site/w/959/i/aa505331a9b71710cf6e8dd6ebeaedaeac3d9fe5bb4336a9089b01b2a4bd73b4/s_hilbert_lightsel_64.jpg" /&#62;
&#60;img width="959" height="799" width_o="959" height_o="799" data-src="https://freight.cargo.site/t/original/i/a2e090d6b017f056e84a55887035e25aad7e9980b788718544b4a69365f94bc7/s_final_64.jpg" data-mid="135837694" border="0" alt="R-sequence, offset by pixel Hilbert index and samples per light (64spp)" data-caption="R-sequence, offset by pixel Hilbert index and samples per light (64spp)" src="https://freight.cargo.site/w/959/i/a2e090d6b017f056e84a55887035e25aad7e9980b788718544b4a69365f94bc7/s_final_64.jpg" /&#62;





Performance

Being presented in a realtime, web-based medium, there’s little control over the viewing environment. I wanted to ensure the piece would remain as fluid as possible, and true to intention, on a range of different devices. By default the sampling rate is automatically adjusted to keep playback as close as possible to 30 frames per second. A running average of fps is taken - if it is too low, the number of samples per pixel is reduced; otherwise sampling is increased to maximise visual quality. This allows the piece to remain functional, at a lower level of visual quality, even on devices like phones.
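A sketch of such a feedback loop (the thresholds, averaging weight and step sizes are my own illustrative choices, not the project’s exact values):

```javascript
// Adaptive sampling-rate controller: keep a running average of the
// frame rate and nudge samples per pixel up or down toward the target.
function makeSppController(targetFps, minSpp, maxSpp) {
  let spp = minSpp;
  let avgFps = targetFps;
  return function onFrame(frameSeconds) {
    const fps = 1 / frameSeconds;
    avgFps = avgFps * 0.9 + fps * 0.1; // exponential running average
    if (targetFps * 0.95 > avgFps) {
      spp = Math.max(minSpp, spp - 1); // too slow: drop quality
    } else if (avgFps > targetFps * 1.1) {
      spp = Math.min(maxSpp, spp + 1); // headroom: raise quality
    }
    return spp;
  };
}
```

Called once per frame with the last frame’s duration, it settles on the highest sample count the device can sustain near the target frame rate.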


	&#60;img width="305" height="234" width_o="305" height_o="234" data-src="https://freight.cargo.site/t/original/i/830aeeabdf18af509e2b8932a1538455c49fe7253623ceb8b4f71bb5cf161b62/adaptive_spp.png" data-mid="135938498" border="0"  src="https://freight.cargo.site/w/305/i/830aeeabdf18af509e2b8932a1538455c49fe7253623ceb8b4f71bb5cf161b62/adaptive_spp.png" /&#62;



Colour



I wanted the piece to support very intense levels of saturation with a robust colour process. The image is rendered in a linear wide gamut ACEScg AP1 working space, and is converted to sRGB after merging pixel samples. I’m used to working in this colour space anyway, so it was a natural fit.


It was important for the intensity and gradation of light to be faithfully reproduced without clipping, so a simple gamma 2.2 sRGB approximation was out of the question. Initially I used a polynomial fit approximation of the ACES RRT tone mapping curve, but it suffered from colour skew and gamut clipping as the RGB ratios were scaled non-linearly, especially with bright and highly saturated lights.




	
An experiment from Björn Ottosson provided a solution, which I borrowed with permission - it converts the final colour to his Oklab space and performs the tone mapping there, weighted by the derivative of the tonemap curve. It does a great job of preserving the correct hue and desaturates out of gamut colours, giving a more filmic colour response.
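A loose sketch of the idea (not Ottosson’s actual experiment - the matrices are from his published Oklab definition, but the simple Reinhard curve here is a stand-in for the real tone curve):

```javascript
// Linear sRGB to Oklab (matrices from Björn Ottosson's Oklab post).
function linearSrgbToOklab(r, g, b) {
  const l = Math.cbrt(0.4122214708 * r + 0.5363325363 * g + 0.0514459929 * b);
  const m = Math.cbrt(0.2119034982 * r + 0.6806995451 * g + 0.1073969566 * b);
  const s = Math.cbrt(0.0883024619 * r + 0.2817188376 * g + 0.6299787005 * b);
  return {
    L: 0.2104542553 * l + 0.7936177850 * m - 0.0040720468 * s,
    a: 1.9779984951 * l - 2.4285922050 * m + 0.4505937099 * s,
    b: 0.0259040371 * l + 0.7827717662 * m - 0.8086757660 * s,
  };
}

// Hue-preserving tone map sketch: compress the lightness L, and scale
// the chroma (a, b) by the tone curve's derivative, so bright saturated
// colours desaturate smoothly instead of skewing in hue.
function toneMapOklab(c) {
  const tone = (x) => x / (1 + x);              // stand-in curve (Reinhard)
  const dtone = (x) => 1 / ((1 + x) * (1 + x)); // its derivative
  const w = dtone(c.L);
  return { L: tone(c.L), a: c.a * w, b: c.b * w };
}
```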

The images below demonstrate the effectiveness - even though the per-channel tone map looks a lot better than the horribly clipped gamma 2.2, the colours skew, causing the redder lights to appear overly yellow.
&#60;img width="960" height="800" width_o="960" height_o="800" data-src="https://freight.cargo.site/t/original/i/801a919d4853b0275b48ebfb128bb4d7d441021534ef0978b25f047e3a6a9496/b_gamma2.2.jpg" data-mid="135643617" border="0" alt="Gamma 2.2" data-caption="Gamma 2.2" src="https://freight.cargo.site/w/960/i/801a919d4853b0275b48ebfb128bb4d7d441021534ef0978b25f047e3a6a9496/b_gamma2.2.jpg" /&#62;
&#60;img width="960" height="800" width_o="960" height_o="800" data-src="https://freight.cargo.site/t/original/i/138cddc54898ec126f4a4fa21df085ace82f97cc2c5a5da7d6d4f670c4e93218/b_perchannel.jpg" data-mid="135643616" border="0" alt="RGB per-channel tone map" data-caption="RGB per-channel tone map" src="https://freight.cargo.site/w/960/i/138cddc54898ec126f4a4fa21df085ace82f97cc2c5a5da7d6d4f670c4e93218/b_perchannel.jpg" /&#62;
&#60;img width="960" height="800" width_o="960" height_o="800" data-src="https://freight.cargo.site/t/original/i/8401f087f6926d84b03b5066f68c886cc14d51eccce80c46827be5115c81c2e0/b_hue_preserving.jpg" data-mid="135643615" border="0" alt="Hue preserving tone map" data-caption="Hue preserving tone map" src="https://freight.cargo.site/w/960/i/8401f087f6926d84b03b5066f68c886cc14d51eccce80c46827be5115c81c2e0/b_hue_preserving.jpg" /&#62;
</description>
		
	</item>
		
		
	<item>
		<title>Black Hole</title>
				
		<link>https://mattebb.cargo.site/Black-Hole</link>

		<pubDate>Fri, 22 May 2020 02:31:05 +0000</pubDate>

		<dc:creator>Matt Ebb</dc:creator>

		<guid isPermaLink="true">https://mattebb.cargo.site/Black-Hole</guid>

		<description>2020
personal research

&#38;gt;&#38;nbsp;.hip file available here

&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/1da10fdbe8f98a68c1a8a1dc56bc59238c9006eb7e253adc5ace070510d6bebe/Untitled00086402.jpg" data-mid="72954676" border="0"  src="https://freight.cargo.site/w/1000/i/1da10fdbe8f98a68c1a8a1dc56bc59238c9006eb7e253adc5ace070510d6bebe/Untitled00086402.jpg" /&#62;&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/6ce02e4cfba62dcd874aa1ca6f92393af550a5caddeb0ea27231607c8c09fa1d/Untitled00086401.jpg" data-mid="72954435" border="0"  src="https://freight.cargo.site/w/1000/i/6ce02e4cfba62dcd874aa1ca6f92393af550a5caddeb0ea27231607c8c09fa1d/Untitled00086401.jpg" /&#62;&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/a149f0d9f45564d73ff8db38d5a6ae8fa9651424c4ef3bfa6e8f2b307f42d026/Untitled00086404.jpg" data-mid="72954455" border="0"  src="https://freight.cargo.site/w/1000/i/a149f0d9f45564d73ff8db38d5a6ae8fa9651424c4ef3bfa6e8f2b307f42d026/Untitled00086404.jpg" /&#62;

&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/d6c0f19156e6f3c2af93813bf60128bea10c690b2f5df75061e81b8ef484d854/Untitled00086403.jpg" data-mid="72954449" border="0"  src="https://freight.cargo.site/w/1000/i/d6c0f19156e6f3c2af93813bf60128bea10c690b2f5df75061e81b8ef484d854/Untitled00086403.jpg" /&#62;

Breakdown

	

During pandemic isolation I’ve had time to do some research and experimentation into rendering black holes, implemented in Houdini and Mantra.
One of the best references for this is DNEG’s paper ‘Gravitational Lensing by Spinning Black Holes in Astrophysics, and in the Movie Interstellar’. Their approach is far more sophisticated, with a lot of clever techniques (well over my head) to make it scientifically accurate and performant, but there’s still plenty of practical information to be gleaned. I’m not a mathematician or physicist, and what I’ve done here is all a big hack in the service of making passable-looking visual effects. Please let me know if I’ve got something horribly wrong!


	

This project was made much easier by the flexibility and tight integration of Mantra in Houdini. While Mantra may not be as tightly optimised as other renderers, the level of flexibility it offers, especially with Houdini’s VEX libraries, is hard to match. Most of the techniques used in the render process were first prototyped in 3D geometry form. The VEX code for calculating ray paths is in the form of a Houdini Digital Asset that can be re-used both in SOPs and at render time, and is how the visualisations below were made.






Rendering with curved rays


	This all came about after experimenting with ‘taking over’ a Mantra render: rendering a dummy surface in front of the camera with a custom shader, and using a VEX snippet to re-spawn shifted rays into the scene. A curved ray path can be approximated by taking many small steps through the scene (like ray marching) and modifying the ray direction at each step. To mimic gravitational lensing, I’m integrating the curved ray paths from the camera out into the scene, as if each ray sample were a particle going through a physics simulation. The gravity of the black hole acts as a force, bending the ray paths towards it.


	





&#60;img width="1260" height="351" width_o="1260" height_o="351" data-src="https://freight.cargo.site/t/original/i/44908765b08ba573e6ab9e6af85a4147ac19e0060a440224a2f3f36e0f4abafe/out.gif" data-mid="72421603" border="0" alt="Ray paths visualised in SOPs" data-caption="Ray paths visualised in SOPs" src="https://freight.cargo.site/w/1000/i/44908765b08ba573e6ab9e6af85a4147ac19e0060a440224a2f3f36e0f4abafe/out.gif" /&#62;



	

At each ray segment, the shader calls Mantra’s VEX trace() function to shade and return luminance for any objects encountered along that segment - in this case, an accretion disk. The disk is a thin, circular volume orbiting the singularity. In the final image, the forward-facing ring around the black hole appears because camera rays bend around the top and bottom of the singularity and intersect the flat accretion disk behind it.
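
In spirit, the per-segment accumulation looks like this (a Python sketch, not the actual shader; `trace_segment` is a hypothetical stand-in for the Mantra trace() call, returning the luminance and transmittance of whatever the segment hits):

```python
def shade_curved_ray(path, trace_segment):
    """Accumulate luminance along a curved ray made of straight segments.

    `path` is the list of points produced by the ray integrator;
    `trace_segment(a, b)` shades the straight segment from a to b
    (e.g. through the accretion disk volume) and returns that
    segment's (luminance, transmittance) pair.
    """
    luminance = 0.0
    transmittance = 1.0
    for a, b in zip(path, path[1:]):
        seg_lum, seg_trans = trace_segment(a, b)
        luminance += transmittance * seg_lum
        transmittance *= seg_trans
        if 1e-4 > transmittance:
            break  # effectively opaque: later segments contribute nothing
    return luminance
```

The early-out once the accumulated transmittance is negligible is a standard optimisation for marching through volumes.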


	



&#60;img width="1260" height="351" width_o="1260" height_o="351" data-src="https://freight.cargo.site/t/original/i/4bdcc62cc135e8265a4d51316956a4fb34f04dc03928efdc7463077b198ead81/curved_rays_disk0000.jpg" data-mid="72423205" border="0" alt="Accretion disk visualised in SOPs" data-caption="Accretion disk visualised in SOPs" src="https://freight.cargo.site/w/1000/i/4bdcc62cc135e8265a4d51316956a4fb34f04dc03928efdc7463077b198ead81/curved_rays_disk0000.jpg" /&#62;
&#60;img width="1260" height="351" width_o="1260" height_o="351" data-src="https://freight.cargo.site/t/original/i/a20993e3c65d5bc1e6d74822919cf01d78660f815a1b65a56546098d0ee96872/curved_rays_disk_render0000.jpg" data-mid="72423808" border="0" alt="Rendered accretion disk" data-caption="Rendered accretion disk" src="https://freight.cargo.site/w/1000/i/a20993e3c65d5bc1e6d74822919cf01d78660f815a1b65a56546098d0ee96872/curved_rays_disk_render0000.jpg" /&#62;


	

Stars

Correctly rendering stars turned out to be much more complicated than rendering the black hole and accretion disk themselves. A simple approach would be to do it as you’d render any other CG sky, using the final ray directions to look up an environment map. This would probably be fine in a normal sky render, but under gravitational lensing it reveals unrealistic artifacts.


	



&#60;img width="1440" height="600" width_o="1440" height_o="600" data-src="https://freight.cargo.site/t/original/i/0252e36bf08f1b8403b2ad28d0e836894e0601555b333efbe02dd9cf1c91deae/rendered_starmap0000.jpg" data-mid="72306587" border="0" alt="A sky map showing radially smeared distortion (rendered with a static camera)" data-caption="A sky map showing radially smeared distortion (rendered with a static camera)" src="https://freight.cargo.site/w/1000/i/0252e36bf08f1b8403b2ad28d0e836894e0601555b333efbe02dd9cf1c91deae/rendered_starmap0000.jpg" /&#62;


	

The problem is related to the concept of ‘solid angle’: the angle that something covers in a field of view, relative to the sphere of all possible directions around the viewpoint. For example, if you have an image taken with a 50° field of view that’s 100 pixels wide, then each pixel covers roughly 0.5° in one dimension.
In reality, stars are extremely far away, so the solid angle that a star takes up in our field of view is absolutely tiny - much, much smaller than a pixel. Stars are still visible, because when you take a photograph of the night sky, even though the star itself doesn’t cover the area of a pixel, the thin beam of light that’s emitted is strong enough that the photosite in a camera’s sensor will still collect it and light up. 
Sometimes in a photograph of the sky, stars will appear larger than a pixel, but that’s due to other factors such as diffusion of light within a lens or sensor.
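
That arithmetic can be checked directly. This little sketch uses a simple linear division of the field of view (real camera projections differ slightly towards the frame edges) and the spherical-cap formula for solid angle:

```python
import math

def pixel_angle_deg(fov_deg, width_px):
    """Approximate angular width of one pixel, in degrees
    (linear division of the field of view)."""
    return fov_deg / width_px

def cap_fraction(angular_radius_rad):
    """Fraction of the full sphere (4*pi steradians) covered by a
    spherical cap: Omega = 2*pi*(1 - cos(theta)).
    For star-sized angles the small-angle form theta*theta/4 is
    numerically safer, since 1 - cos(theta) underflows."""
    omega = 2.0 * math.pi * (1.0 - math.cos(angular_radius_rad))
    return omega / (4.0 * math.pi)
```

For the 50°/100 px example, each pixel is 0.5° across, while a star of angular radius around a microradian covers a fraction of the sky many orders of magnitude smaller than that pixel does.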


	

The result of this is that when you render the sky using a starry environment map, the pixels in the map are much larger than the actual solid angles of the real stars in the sky. In a usual render of a sky, where there’s a close to 1:1 relationship between the sky map pixels and the final render frame pixels, you can get away with it, but when light rays are bent by gravitational lensing, multiple camera rays can converge on the same pixels in the sky map, causing the star to smear across the rendered image.



	
&#60;img width="800" height="494" width_o="800" height_o="494" data-src="https://freight.cargo.site/t/original/i/f96a6222869be5ba4d9517311b69bd900a471e53eaf653674dfbc0cdc43b53fc/pixelgrid_sky_col0000.png" data-mid="72302940" border="0" alt="Sky map pixels are averaged to form final pixel colours" data-caption="Sky map pixels are averaged to form final pixel colours" src="https://freight.cargo.site/w/800/i/f96a6222869be5ba4d9517311b69bd900a471e53eaf653674dfbc0cdc43b53fc/pixelgrid_sky_col0000.png" /&#62;

	
&#60;img width="800" height="494" width_o="800" height_o="494" data-src="https://freight.cargo.site/t/original/i/8d7c4ae56e4e8f6d4c542555eb9f344eb7ae0022fb4355764ba1fcae0cda8c1c/pixelgrid_curved_compress0000.png" data-mid="72303059" border="0" alt="Bent rays can converge on sky map pixels, causing smearing" data-caption="Bent rays can converge on sky map pixels, causing smearing" src="https://freight.cargo.site/w/800/i/8d7c4ae56e4e8f6d4c542555eb9f344eb7ae0022fb4355764ba1fcae0cda8c1c/pixelgrid_curved_compress0000.png" /&#62;


	

Rather than collecting pixels from the environment map to determine the final rendered pixel’s colour, we can do a better job if we treat things more physically. Instead, we want a pixel to receive luminance only if a star, treated as an infinitesimally small point source, falls within its footprint. To figure this out, we need to know how the rays from that pixel warp through the scene and finally end up projected on the celestial sphere.

Most renderers have a way of measuring the dimensions of a pixel, known as the derivatives or differentials. 




	



In Mantra these are the vector variables dPds and dPdt: vectors representing the change in position with respect to the s and t coordinates of the frame. You can think of them as tracing out a pixel’s edges in space.


If we trace two additional rays through the gravitational lensing setup, offset by the dPds and dPdt vectors, then we know the shape the pixel represents when it ends up projected against the celestial sphere, and we can determine whether a star is inside.
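
The idea can be sketched like this (illustrative Python, not the project’s VEX; `trace` is a placeholder for whatever maps a camera ray to its final direction, such as the curved-ray integrator):

```python
import math

def dir_to_polar(d):
    """Unwrap a unit direction into (azimuth, elevation) polar
    coordinates on the celestial sphere."""
    x, y, z = d
    azimuth = math.atan2(x, z)
    elevation = math.asin(max(-1.0, min(1.0, y)))
    return (azimuth, elevation)

def pixel_footprint(trace, ray_p, ray_d, dPds, dPdt):
    """Trace the main ray plus two rays offset by the pixel
    derivatives dPds and dPdt, returning the pixel's projected
    footprint as a parallelogram in polar coordinates. `trace` is
    any callable mapping (origin, direction) to a final direction."""
    offset = lambda p, dv: [pc + dc for pc, dc in zip(p, dv)]
    c00 = dir_to_polar(trace(ray_p, ray_d))
    c10 = dir_to_polar(trace(offset(ray_p, dPds), ray_d))
    c01 = dir_to_polar(trace(offset(ray_p, dPdt), ray_d))
    # Complete the parallelogram from the two edge vectors.
    c11 = (c10[0] + c01[0] - c00[0], c10[1] + c01[1] - c00[1])
    return [c00, c10, c11, c01]
```

Three traced rays give three corners; the fourth is completed under the assumption that the footprint is locally a parallelogram, which holds when the pixel is small relative to the distortion.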






	
&#60;img width="800" height="494" width_o="800" height_o="494" data-src="https://freight.cargo.site/t/original/i/72e910a67a4846d20f6236b3fe69a4e93d48a7d16dd35d157e4d6f4f9a36f0e3/trace_derivatives0000.png" data-mid="72303258" border="0" alt="Explicitly trace pixel derivatives and project on sky sphere" data-caption="Explicitly trace pixel derivatives and project on sky sphere" src="https://freight.cargo.site/w/800/i/72e910a67a4846d20f6236b3fe69a4e93d48a7d16dd35d157e4d6f4f9a36f0e3/trace_derivatives0000.png" /&#62;

	
&#60;img width="800" height="494" width_o="800" height_o="494" data-src="https://freight.cargo.site/t/original/i/e8bd5f08359c38a14cce1f7ef9b37605ed50fe14223e8c2cd7b1eacd93b9c0ee/points_curved0000.png" data-mid="72303256" border="0" alt="Star is within only a single pixel's footprint - no smearing" data-caption="Star is within only a single pixel's footprint - no smearing" src="https://freight.cargo.site/w/800/i/e8bd5f08359c38a14cce1f7ef9b37605ed50fe14223e8c2cd7b1eacd93b9c0ee/points_curved0000.png" /&#62;
&#60;img width="800" height="494" width_o="800" height_o="494" data-src="https://freight.cargo.site/t/original/i/8ada42587b74750422c947b27363415466b28d4660032e5f0d570b2872b13632/points_sky_col0000.png" data-mid="72303257" border="0" alt="Return luminance if star is inside projected pixel footprint" data-caption="Return luminance if star is inside projected pixel footprint" src="https://freight.cargo.site/w/800/i/8ada42587b74750422c947b27363415466b28d4660032e5f0d570b2872b13632/points_sky_col0000.png" /&#62;




	



When we’re rendering stars, it’s assumed that the rays have done all the warping they’re going to do, and are now just extending off into infinity. At infinite length, we don’t care about the rays’ positions, just their directions, so things become much simpler if we convert to polar coordinates and work in two dimensions rather than three.



The stars are stored as a point cloud inside Houdini, unwrapped into their 2D polar coordinates. In this coordinate space, the projected pixel and its derivatives become a parallelogram, and we can use a point-in-polygon test to determine whether there’s a star inside.


Using pcfind(), limited to the bounding radius of the pixel footprint, we can accelerate the point-in-polygon test by narrowing down the number of stars to query in advance.
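
In plain Python, the test over the unwrapped star point cloud might look like this: an even-odd crossing test, with a cheap bounding-radius pre-filter standing in for the pcfind() lookup (illustrative sketch, not the project’s VEX):

```python
def point_in_polygon(pt, poly):
    """Even-odd ray-crossing test: is the 2D point inside the polygon?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x coordinate where this edge crosses the horizontal at y.
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

def stars_in_footprint(stars, footprint):
    """Return the stars (2D polar points) inside the projected pixel
    footprint. A bounding-radius pre-filter narrows the candidates
    first, standing in for pcfind() in the VEX version."""
    cx = sum(p[0] for p in footprint) / len(footprint)
    cy = sum(p[1] for p in footprint) / len(footprint)
    r2 = max((p[0] - cx) ** 2 + (p[1] - cy) ** 2 for p in footprint)
    near = [s for s in stars
            if r2 >= (s[0] - cx) ** 2 + (s[1] - cy) ** 2]
    return [s for s in near if point_in_polygon(s, footprint)]
```

Only the handful of stars inside the footprint’s bounding radius ever reach the polygon test, which matters when the point cloud holds many thousands of stars.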



	
&#60;img width="520" height="268" width_o="520" height_o="268" data-src="https://freight.cargo.site/t/original/i/804ddb74dab3d8b6382427bbab5010bd714a36bd37f68f41cbdfed6fba384547/star_inside_pixel0000.png" data-mid="72318045" border="0" alt="Projected pixel and stars, transformed into polar coordinates" data-caption="Projected pixel and stars, transformed into polar coordinates" src="https://freight.cargo.site/w/520/i/804ddb74dab3d8b6382427bbab5010bd714a36bd37f68f41cbdfed6fba384547/star_inside_pixel0000.png" /&#62;



&#60;img width="1440" height="600" width_o="1440" height_o="600" data-src="https://freight.cargo.site/t/original/i/13e4c538def46c3aa948c40a366bc9e93d98b36cce6b917b09f99e494bc8acd4/stars_static0000.jpg" data-mid="72333760" border="0" alt="Rendered stars (glare added in post)" data-caption="Rendered stars (glare added in post)" src="https://freight.cargo.site/w/1000/i/13e4c538def46c3aa948c40a366bc9e93d98b36cce6b917b09f99e494bc8acd4/stars_static0000.jpg" /&#62;


Motion blur




	

This technique can be extended to render motion-blurred stars correctly. The usual way of thinking about motion blur is ‘does this object’s path cross this pixel during the shutter interval?’, but we can flip this around and test whether the current pixel’s path, projected on the celestial sphere, crosses a star.



	

With star rendering already working, this is relatively simple. Rather than testing the projection of a single pixel and its derivatives on the sphere, we want to trace a ‘swept pixel’: the shape the pixel covers as it moves across the starfield. We can do this by tracing two sets of rays, representing the state of the pixel at the shutter open and shutter close times.





	












&#60;img width="520" height="268" width_o="520" height_o="268" data-src="https://freight.cargo.site/t/original/i/da1fd69be1225c7d7ff07ee846a343b08476178be2094b9516735670d5d49257/star_shutter_openclose0000.png" data-mid="72336919" border="0" alt="Pixels projected at shutter open and shutter close" data-caption="Pixels projected at shutter open and shutter close" src="https://freight.cargo.site/w/520/i/da1fd69be1225c7d7ff07ee846a343b08476178be2094b9516735670d5d49257/star_shutter_openclose0000.png" /&#62;

	
&#60;img width="520" height="268" width_o="520" height_o="268" data-src="https://freight.cargo.site/t/original/i/e45f33fe0499740f715c3a8f7bfce6c878cb3856f937f6af730ceec3be0d62c3/star_convexhull0000.png" data-mid="72336928" border="0" alt="A convex hull representing the pixel's path" data-caption="A convex hull representing the pixel's path" src="https://freight.cargo.site/w/520/i/e45f33fe0499740f715c3a8f7bfce6c878cb3856f937f6af730ceec3be0d62c3/star_convexhull0000.png" /&#62;


	

In polar coordinates we can calculate a convex hull of those two parallelograms, approximating the pixel’s motion across the shutter, and re-use the point-in-polygon test to determine if a star falls inside.
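
A sketch of that hull step, using a small monotone-chain convex hull over the eight corners (again illustrative Python rather than the project’s VEX):

```python
def cross(o, a, b):
    """Z component of the 2D cross product (a - o) x (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone-chain convex hull of 2D points (CCW order)."""
    pts = sorted(set(points))
    if 3 > len(pts):
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) > 1 and 0 >= cross(lower[-2], lower[-1], p):
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) > 1 and 0 >= cross(upper[-2], upper[-1], p):
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def swept_pixel(open_corners, close_corners):
    """Hull of the pixel footprints at shutter open and shutter close,
    approximating the region the pixel sweeps across the starfield.
    A star inside this hull contributes to the motion-blurred pixel."""
    return convex_hull(list(open_corners) + list(close_corners))
```

The hull is only an approximation of the true swept shape, since the pixel’s path between shutter open and close may curve, but over a short shutter it is close enough to be invisible.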


	

For a small increase in render time (much less than sending more primary rays), this generates perfectly smooth and accurate motion-blurred stars. The streaks visible below are the paths of the stars as they are warped in screen space by the gravitational lensing over time, not distortion artifacts.



&#60;img width="1440" height="600" width_o="1440" height_o="600" data-src="https://freight.cargo.site/t/original/i/b66d82c01b223f6f894da5982b060ad9de69810b766fd7890fee025f83cbcf3d/stars_motion0000.jpg" data-mid="72333758" border="0" alt="Rendered stars with analytic motion blur (camera is orbiting the black hole)" data-caption="Rendered stars with analytic motion blur (camera is orbiting the black hole)" src="https://freight.cargo.site/w/1000/i/b66d82c01b223f6f894da5982b060ad9de69810b766fd7890fee025f83cbcf3d/stars_motion0000.jpg" /&#62;


</description>
		
	</item>
		
		
	<item>
		<title>The Invisible Man</title>
				
		<link>https://mattebb.cargo.site/The-Invisible-Man</link>

		<pubDate>Mon, 17 Feb 2020 10:19:50 +0000</pubDate>

		<dc:creator>Matt Ebb</dc:creator>

		<guid isPermaLink="true">https://mattebb.cargo.site/The-Invisible-Man</guid>

		<description>2020
CG Supervisor, Cutting Edge

Cutting Edge was the sole VFX vendor on The Invisible Man, with a small team of skilled artists. I had many areas of responsibility including artist support, managing the colour pipeline, pipeline development and supervision, along with creative work building procedural setups and assisting in look development.

The bulk of our CG work was in creating a dynamic, digital version of the antagonist’s invisible suit; multiple variations included a startup/shutdown scene where the suit is revealed and a glitchy, malfunctioning version.







&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/207c9b92952beaadc4099855a0ff23e7abb74e0c04e7d702415d34c17f06e099/suitroom_ots1.jpg" data-mid="72455069" border="0"  src="https://freight.cargo.site/w/1000/i/207c9b92952beaadc4099855a0ff23e7abb74e0c04e7d702415d34c17f06e099/suitroom_ots1.jpg" /&#62;
&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/b2194dade0ffc95a9e9b4a79e85a478df90ced3334ae4690779651449411813b/suitroom_transparent.jpg" data-mid="72455070" border="0"  src="https://freight.cargo.site/w/1000/i/b2194dade0ffc95a9e9b4a79e85a478df90ced3334ae4690779651449411813b/suitroom_transparent.jpg" /&#62;




	The CG suit was a digital replica of a practical suit that was worn in some shots on set, enhanced with intricate detailing to bring the optical projection mechanisms to life. In its malfunctioning state, the suit stutters as it attempts to repair its projected image, showing subtly imperfect mirages of its environment alongside the inactive physical cells.
	



&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/ef565f72e720593edebb73cbe0d2e68959be5c7be96df38606305030ed96f17b/taser.jpg" data-mid="72455071" border="0"  src="https://freight.cargo.site/w/1000/i/ef565f72e720593edebb73cbe0d2e68959be5c7be96df38606305030ed96f17b/taser.jpg" /&#62;
&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/2162c28a1180e691113d144c1f0a9e90cdaee5845937db950feab20225105629/stab.jpg" data-mid="72455067" border="0"  src="https://freight.cargo.site/w/1000/i/2162c28a1180e691113d144c1f0a9e90cdaee5845937db950feab20225105629/stab.jpg" /&#62;
&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/b858a63cd4ea464ec562aa73ae87df5776e0ef85cce1da17461dd9ab67468b59/stab2.jpg" data-mid="72455068" border="0"  src="https://freight.cargo.site/w/1000/i/b858a63cd4ea464ec562aa73ae87df5776e0ef85cce1da17461dd9ab67468b59/stab2.jpg" /&#62;

&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/68ecb3cab4a00de637354fab9adab41f5e3f452db6655cdaa64c971016d0b96d/carboot.jpg" data-mid="72455057" border="0"  src="https://freight.cargo.site/w/1000/i/68ecb3cab4a00de637354fab9adab41f5e3f452db6655cdaa64c971016d0b96d/carboot.jpg" /&#62;
&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/540c6886f778cac6fe2fc11ffbe104c0531d3cc24d272ae3ac199bd440778013/rain_side.jpg" data-mid="72455066" border="0"  src="https://freight.cargo.site/w/1000/i/540c6886f778cac6fe2fc11ffbe104c0531d3cc24d272ae3ac199bd440778013/rain_side.jpg" /&#62;
&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/c5dd3398ecdb3a2ad5028ab89d01df9f331f4c7631cc8ebe005a0e0f4c928e85/rain_ots2.jpg" data-mid="72455064" border="0"  src="https://freight.cargo.site/w/1000/i/c5dd3398ecdb3a2ad5028ab89d01df9f331f4c7631cc8ebe005a0e0f4c928e85/rain_ots2.jpg" /&#62;



	

Other significant CG work included a digi-double with clothing simulation, which aided some very challenging cleanup and compositing during a motion-controlled long take.


	




&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/2b4fa0d3af22e05958005fa454869d4031a0567593315b8b0e35a366fcb4d197/jumper_lift.jpg" data-mid="72455062" border="0"  src="https://freight.cargo.site/w/1000/i/2b4fa0d3af22e05958005fa454869d4031a0567593315b8b0e35a366fcb4d197/jumper_lift.jpg" /&#62;
&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/a43f4365d999f964835be585225f8afe5ca6a9d14d7799a52b29cd3e857ee78a/jumper_ground.jpg" data-mid="72455061" border="0"  src="https://freight.cargo.site/w/1000/i/a43f4365d999f964835be585225f8afe5ca6a9d14d7799a52b29cd3e857ee78a/jumper_ground.jpg" /&#62;
</description>
		
	</item>
		
		
	<item>
		<title>Captain Marvel</title>
				
		<link>https://mattebb.cargo.site/Captain-Marvel</link>

		<pubDate>Mon, 17 Feb 2020 10:18:23 +0000</pubDate>

		<dc:creator>Matt Ebb</dc:creator>

		<guid isPermaLink="true">https://mattebb.cargo.site/Captain-Marvel</guid>

		<description>2019
Effects Lead, Animal Logic


	

From the early days of the project I worked closely with our VFX supervisor on design work for the ceiling, water and islands of the Supreme Intelligence chamber environment, the distorted melting footage in the mind-fracking memory sequence, and the Supreme Intelligence energy blast effect. Most of the work was in exploring the vast space of possibilities with the client, narrowing in on a favoured design to refine through to final.
As the effects lead I was responsible for creative and technical direction for the effects team, under the guidance of the VFX and CG supervisors. As the crew grew throughout the project, my responsibilities shifted to a leadership and support role and much of this work was handed over to be beautifully developed and executed by other members of the team. One exception was the energy blast effect, which I brought through to final shots with artist Jacob Santamaria.


	



&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/26672314b008ea8933e5a622a82226c1ec6fa9f428caa83a5538f2d64f8d42e8/siblast1049.jpg" data-mid="71847190" border="0"  src="https://freight.cargo.site/w/1000/i/26672314b008ea8933e5a622a82226c1ec6fa9f428caa83a5538f2d64f8d42e8/siblast1049.jpg" /&#62;
&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/2b784633dce3d935a7c394dbf7a9f4ff398fa901fd7d466239b7be3ed643afd6/siblast2085.jpg" data-mid="71847192" border="0"  src="https://freight.cargo.site/w/1000/i/2b784633dce3d935a7c394dbf7a9f4ff398fa901fd7d466239b7be3ed643afd6/siblast2085.jpg" /&#62;
&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/a8ff390077ff4274797ec0f181a78a8ac8d3e7903ee1dd5b2e45e3fa16fbe299/siblast1137.jpg" data-mid="71847191" border="0"  src="https://freight.cargo.site/w/1000/i/a8ff390077ff4274797ec0f181a78a8ac8d3e7903ee1dd5b2e45e3fa16fbe299/siblast1137.jpg" /&#62;
&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/17bd7345de142583365f9354f8efe56d07d3bf051f6bb20b0ae025f6bf0c0f8d/siblast3103.jpg" data-mid="71847193" border="0"  src="https://freight.cargo.site/w/1000/i/17bd7345de142583365f9354f8efe56d07d3bf051f6bb20b0ae025f6bf0c0f8d/siblast3103.jpg" /&#62;



	
	The highly detailed ceiling of the chamber went through several rounds of iteration, from pure white to dark and various other lighting treatments in between, as we worked collaboratively with the lighting and comp departments. The final version, procedurally modelled with detailed patterning, was finished by artist Jonathan Ravagnani.
	



&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/a40cfb22021d426d6276fed55aa514ef8fe24ed18afcb3ec283f1b304faab0c6/env107.jpg" data-mid="71847194" border="0"  src="https://freight.cargo.site/w/1000/i/a40cfb22021d426d6276fed55aa514ef8fe24ed18afcb3ec283f1b304faab0c6/env107.jpg" /&#62;

Our sequences had other one-off effects including digi-double simulation, walls made of non-Newtonian fluid, additional energy effects, and destruction of the chamber environment itself. I contributed shot work on the SI character's introductory transition from ferrofluid to a glass-like humanoid figure, while the ferrofluid lake surface was simulated by artist Heather Cardew.

&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/ad32755a23958bcb0e80c417210018f24531a3f515a1b612ae1ce2c2f52ebcfe/croq075.jpg" data-mid="71847198" border="0"  src="https://freight.cargo.site/w/1000/i/ad32755a23958bcb0e80c417210018f24531a3f515a1b612ae1ce2c2f52ebcfe/croq075.jpg" /&#62;
&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/6b576b167292d21aaa211872bf90b901ea41a24e4eaf03d97e2a1bfc3e3b3403/croq024.jpg" data-mid="71847195" border="0"  src="https://freight.cargo.site/w/1000/i/6b576b167292d21aaa211872bf90b901ea41a24e4eaf03d97e2a1bfc3e3b3403/croq024.jpg" /&#62;
&#60;img width="1920" height="800" width_o="1920" height_o="800" data-src="https://freight.cargo.site/t/original/i/858eab7c5a72155e7f1dc26fbf01131bf98085bb81e9bbd64344156cdd01269f/croq045.jpg" data-mid="71847196" border="0"  src="https://freight.cargo.site/w/1000/i/858eab7c5a72155e7f1dc26fbf01131bf98085bb81e9bbd64344156cdd01269f/croq045.jpg" /&#62;
</description>
		
	</item>
		
		
	<item>
		<title>Upgrade</title>
				
		<link>https://mattebb.cargo.site/Upgrade-1</link>

		<pubDate>Mon, 17 Feb 2020 09:41:06 +0000</pubDate>

		<dc:creator>Matt Ebb</dc:creator>

		<guid isPermaLink="true">https://mattebb.cargo.site/Upgrade-1</guid>

		<description>2018
Freelance

	
Working with Substance, I produced the animated opening title sequence for Leigh Whannell's sci-fi horror film, Upgrade.

The concept as briefed was to make a ‘type-less title sequence’, with the credits audible as spoken words from the voice of the artificial intelligence that is the central character of the film. The voice is visually represented by audio-reactive waveform filaments that ripple over a landscape, turning it from natural curves to increasingly complex, artificial structures and ending in a dark singularity which foreshadows events in the film.





	

After preliminary design work, I created the opening sequence largely from start to finish, handling further design development, procedural modelling and animation, lighting and rendering.




&#60;img width="2578" height="1080" width_o="2578" height_o="1080" data-src="https://freight.cargo.site/t/original/i/efd2f15ea4c6a4256ac509cc5e4e7c77d7ee99d92cf08499aa29d7bb50d1b92a/upgrade_openingtitle_final_2560_00409.jpg" data-mid="71847199" border="0"  src="https://freight.cargo.site/w/1000/i/efd2f15ea4c6a4256ac509cc5e4e7c77d7ee99d92cf08499aa29d7bb50d1b92a/upgrade_openingtitle_final_2560_00409.jpg" /&#62;
&#60;img width="2578" height="1080" width_o="2578" height_o="1080" data-src="https://freight.cargo.site/t/original/i/4acd80dd66a3d6c507a4a508e515049a30b260cb4b0a759caa08d4b6acaaf708/upgrade_openingtitle_final_2560_00431.jpg" data-mid="71847203" border="0"  src="https://freight.cargo.site/w/1000/i/4acd80dd66a3d6c507a4a508e515049a30b260cb4b0a759caa08d4b6acaaf708/upgrade_openingtitle_final_2560_00431.jpg" /&#62;
&#60;img width="2578" height="1080" width_o="2578" height_o="1080" data-src="https://freight.cargo.site/t/original/i/97685fad16040406d2e59533bf2cc77de72dcf1598d4918bcf12554b849e1799/upgrade_openingtitle_final_2560_00831.jpg" data-mid="71847200" border="0"  src="https://freight.cargo.site/w/1000/i/97685fad16040406d2e59533bf2cc77de72dcf1598d4918bcf12554b849e1799/upgrade_openingtitle_final_2560_00831.jpg" /&#62;
&#60;img width="2578" height="1080" width_o="2578" height_o="1080" data-src="https://freight.cargo.site/t/original/i/a676dc4fe0d17e547930b2343f48a33f157b80ad8402f387f94a306d255a5e3c/upgrade_openingtitle_final_2560_00898.jpg" data-mid="71847201" border="0"  src="https://freight.cargo.site/w/1000/i/a676dc4fe0d17e547930b2343f48a33f157b80ad8402f387f94a306d255a5e3c/upgrade_openingtitle_final_2560_00898.jpg" /&#62;
&#60;img width="2578" height="1080" width_o="2578" height_o="1080" data-src="https://freight.cargo.site/t/original/i/8aa451cc700d68e349d308f3834348e0d110954373af52072bd66812cbbb3ddf/upgrade_openingtitle_final_2560_00924.jpg" data-mid="71847202" border="0"  src="https://freight.cargo.site/w/1000/i/8aa451cc700d68e349d308f3834348e0d110954373af52072bd66812cbbb3ddf/upgrade_openingtitle_final_2560_00924.jpg" /&#62;</description>
		
	</item>
		
		
	<item>
		<title>Guardians of the Galaxy 2</title>
				
		<link>https://mattebb.cargo.site/Guardians-of-the-Galaxy-2</link>

		<pubDate>Sun, 07 May 2017 05:21:00 +0000</pubDate>

		<dc:creator>Matt Ebb</dc:creator>

		<guid isPermaLink="true">https://mattebb.cargo.site/Guardians-of-the-Galaxy-2</guid>

		<description>2017
Effects Lead, Animal Logic





	

Our work on Guardians of the Galaxy Vol. 2&#38;nbsp;involved creatively communicating some difficult yet important plot points, in a manner worthy of Kurt Russell’s larger-than-life Ego character. As well as providing technical and creative leadership for the effects team, I contributed to research, effects look development, and tools.


	


&#60;img width="1200" height="633" width_o="1200" height_o="633" data-src="https://freight.cargo.site/t/original/i/dd46a02df5b4c26f094e17b87c0b533a9ef96a936fc47247f25ceca518349a20/gotg_palace4.jpg" data-mid="71847210" border="0"  src="https://freight.cargo.site/w/1000/i/dd46a02df5b4c26f094e17b87c0b533a9ef96a936fc47247f25ceca518349a20/gotg_palace4.jpg" /&#62;
&#60;img width="1200" height="633" width_o="1200" height_o="633" data-src="https://freight.cargo.site/t/original/i/f89cfd39e4e9c9e6d33724f4412b10055a0d9003eb74ff949d3cc213ff8198a1/gotg_palace1.jpg" data-mid="71847207" border="0"  src="https://freight.cargo.site/w/1000/i/f89cfd39e4e9c9e6d33724f4412b10055a0d9003eb74ff949d3cc213ff8198a1/gotg_palace1.jpg" /&#62;


	

Fractals were a central component of our environments. Our brief was to create a palace that Ego had built on his home planet, with intricate complexity and a mathematical styling. I spent several weeks investigating fractals and developing tools to visualise and edit the complex generated forms: a fractal renderer built within Houdini that generated extremely detailed 3D point clouds. This work evolved into a full pipeline for creating usable set-pieces, which our lead modeller used to curate areas of intricate fractal geometry, extracting and manipulating selected pieces to construct the palace architecture. Our brief later expanded into creating an alien outdoor garden, for which we re-used the fractal tools to create and lay out surreal plant life.


	

I presented a talk about this work at SIGGRAPH 2017 in Los Angeles:
Building Detailed Fractal Sets for “Guardians of the Galaxy Vol. 2”





&#60;img width="1200" height="633" width_o="1200" height_o="633" data-src="https://freight.cargo.site/t/original/i/2260f21f9e329590f4dfe856fe02e53beeba13cda58c827fdccc87aaad08e6fc/gotg_palace3.jpg" data-mid="71847209" border="0"  src="https://freight.cargo.site/w/1000/i/2260f21f9e329590f4dfe856fe02e53beeba13cda58c827fdccc87aaad08e6fc/gotg_palace3.jpg" /&#62;
&#60;img width="1200" height="633" width_o="1200" height_o="633" data-src="https://freight.cargo.site/t/original/i/6687065d9fbd477b793e0f5f3b41a9a05266108428bab53ce2a5e1642f237887/gotg_garden1.jpg" data-mid="71847206" border="0"  src="https://freight.cargo.site/w/1000/i/6687065d9fbd477b793e0f5f3b41a9a05266108428bab53ce2a5e1642f237887/gotg_garden1.jpg" /&#62;

	

Another significant effect under my responsibility was Quill's celestial light, which he compresses into a hilarious cosmic baseball to play catch with his dad. I developed the effect setup along with the compositing team, combining various energy bursts, electric arcs, vapour trails, and sub-dermal tissue lighting for the final result.


	



&#60;img width="1200" height="633" width_o="1200" height_o="633" data-src="https://freight.cargo.site/t/original/i/10d69fe0b3fca2fbdb7419de01e7ab30f5dc5626b18f6c982ed546dbabfbffac/gotg_celestial1.jpg" data-mid="71847205" border="0"  src="https://freight.cargo.site/w/1000/i/10d69fe0b3fca2fbdb7419de01e7ab30f5dc5626b18f6c982ed546dbabfbffac/gotg_celestial1.jpg" /&#62;
&#60;img width="1200" height="633" width_o="1200" height_o="633" data-src="https://freight.cargo.site/t/original/i/4d0a096265ebb17cd97f31226188157d5022c72f5ab8e397b954cf2698e55ecf/gotg_energy2.jpg" data-mid="71847212" border="0"  src="https://freight.cargo.site/w/1000/i/4d0a096265ebb17cd97f31226188157d5022c72f5ab8e397b954cf2698e55ecf/gotg_energy2.jpg" /&#62;
&#60;img width="1200" height="633" width_o="1200" height_o="633" data-src="https://freight.cargo.site/t/original/i/0d824ffd5eb5a5f8da9b8e1f2b7f3cec705e840c82181cfbf09e09a4be507c66/gotg_energy1.jpg" data-mid="71847211" border="0"  src="https://freight.cargo.site/w/1000/i/0d824ffd5eb5a5f8da9b8e1f2b7f3cec705e840c82181cfbf09e09a4be507c66/gotg_energy1.jpg" /&#62;



	

Additional publicity:

Art of VFX
FX Guide



	

I described some of the technical details behind this work at SideFX software's 'Houdini Hive' sessions at SIGGRAPH 2017: 
https://vimeo.com/228370204



</description>
		
	</item>
		
		
	<item>
		<title>Avengers: Age of Ultron</title>
				
		<link>https://mattebb.cargo.site/Avengers-Age-of-Ultron</link>

		<pubDate>Wed, 22 Jul 2015 13:17:57 +0000</pubDate>

		<dc:creator>Matt Ebb</dc:creator>

		<guid isPermaLink="true">https://mattebb.cargo.site/Avengers-Age-of-Ultron</guid>

		<description>2015
Effects Lead, Animal Logic


	

Our major sequence in Avengers: Age of Ultron was the birth of Ultron, a pivotal piece of exposition early in the film that shows the transition of the Ultron character from his beginnings as an intelligent program visualised in hologram form, to his self-discovery in an infinite cyberspace environment.
	My main focus was the design, look development, generative animation and render setup of the holograms, concentrating mainly on the representation of Ultron. Ultron's brief was to look like a highly sophisticated, ‘unfathomably complex’ cyber-intelligence that seemed alive, constantly evolving and organic, in contrast to J.A.R.V.I.S., a more modernist, man-made construction.


	

Ultron made heavy use of generative and procedural design techniques, creating constant motion and re-assembly, and a visual 'cause and effect' transfer of information throughout its form. Some of my work on the J.A.R.V.I.S hologram included the audio-reactive exterior rings of data that circumscribed the hologram, evoking spinning hard drive platters and reel-to-reel data tape.





&#60;img width="1200" height="499" width_o="1200" height_o="499" data-src="https://freight.cargo.site/t/original/i/afa9f15390044d16b7465c58bdf639ee20ee384c44e3ce22eb519e0526cbe504/1024516-avl1310v041.1100-1copy-1200.jpg" data-mid="71847214" border="0"  src="https://freight.cargo.site/w/1000/i/afa9f15390044d16b7465c58bdf639ee20ee384c44e3ce22eb519e0526cbe504/1024516-avl1310v041.1100-1copy-1200.jpg" /&#62;
&#60;img width="1200" height="503" width_o="1200" height_o="503" data-src="https://freight.cargo.site/t/original/i/b02af7dd5a56d754fbf81d719e2b2705e38790750782f2174b190bcf469a4ed9/1024516-avl1850srgbv001.1244-1200.jpg" data-mid="71847223" border="0"  src="https://freight.cargo.site/w/1000/i/b02af7dd5a56d754fbf81d719e2b2705e38790750782f2174b190bcf469a4ed9/1024516-avl1850srgbv001.1244-1200.jpg" /&#62;&#60;img width="1200" height="503" width_o="1200" height_o="503" data-src="https://freight.cargo.site/t/original/i/084f74b64af06132d39ac18faa68f417c3c5e47cdd602ac853b2b7330596613c/1024516-avl1845srgbv001.1506-1200.jpg" data-mid="71847222" border="0"  src="https://freight.cargo.site/w/1000/i/084f74b64af06132d39ac18faa68f417c3c5e47cdd602ac853b2b7330596613c/1024516-avl1845srgbv001.1506-1200.jpg" /&#62;
&#60;img width="1200" height="503" width_o="1200" height_o="503" data-src="https://freight.cargo.site/t/original/i/66ac58371a03129733d232bb71be6a3a751d2c030313b6341beb76257533183b/1024516-avl1845srgbv001.1183-1200.jpg" data-mid="71847221" border="0"  src="https://freight.cargo.site/w/1000/i/66ac58371a03129733d232bb71be6a3a751d2c030313b6341beb76257533183b/1024516-avl1845srgbv001.1183-1200.jpg" /&#62;
&#60;img width="1200" height="503" width_o="1200" height_o="503" data-src="https://freight.cargo.site/t/original/i/935205bd8a9e1d85f4a19526e2244a10bc235ebb4b14ae7d577def25eb012f63/1024516-avl4245srgbv001.1030-1200.jpg" data-mid="71847224" border="0"  src="https://freight.cargo.site/w/1000/i/935205bd8a9e1d85f4a19526e2244a10bc235ebb4b14ae7d577def25eb012f63/1024516-avl4245srgbv001.1030-1200.jpg" /&#62;


	

The companion to this work was a visual representation of a vast cyberspace landscape, shot from Ultron's point of view as he retrieves data and gains self-awareness. The structure was designed using fractal formulas to generate a complex cavern of millions of 'blocks of data'. I was involved in the setup, shot design and set dressing of these data-block structures, along with procedural animation visualising information being copied and re-arranged in the background.


	


&#60;img width="1200" height="503" width_o="1200" height_o="503" data-src="https://freight.cargo.site/t/original/i/e42a756631fec197aa05daa205335ed5204e41d7810f89d4c1728971ada40c0e/1024516-avl1793srgbv001.1408-1200.jpg" data-mid="71847215" border="0"  src="https://freight.cargo.site/w/1000/i/e42a756631fec197aa05daa205335ed5204e41d7810f89d4c1728971ada40c0e/1024516-avl1793srgbv001.1408-1200.jpg" /&#62;&#60;img width="1200" height="503" width_o="1200" height_o="503" data-src="https://freight.cargo.site/t/original/i/b6d363a90bba9809bbff46f5cb73593efde37e0044250713c117f6f8e3714c22/1024516-avl1803srgbv001.1264-1200.jpg" data-mid="71847216" border="0"  src="https://freight.cargo.site/w/1000/i/b6d363a90bba9809bbff46f5cb73593efde37e0044250713c117f6f8e3714c22/1024516-avl1803srgbv001.1264-1200.jpg" /&#62;&#60;img width="1200" height="503" width_o="1200" height_o="503" data-src="https://freight.cargo.site/t/original/i/bdfafa245d567734a3f7e11ef42adce5d3b27ad3f47b4573b8cc2534df66bce5/1024516-avl1815srgbv001.1156-1200.jpg" data-mid="71847217" border="0"  src="https://freight.cargo.site/w/1000/i/bdfafa245d567734a3f7e11ef42adce5d3b27ad3f47b4573b8cc2534df66bce5/1024516-avl1815srgbv001.1156-1200.jpg" /&#62;&#60;img width="1200" height="503" width_o="1200" height_o="503" data-src="https://freight.cargo.site/t/original/i/df626b1633e6dfa0c323d4866d1696ef51fef0155a195988ca4f680679106ab7/1024516-avl1830srgbv001.1074-1200.jpg" data-mid="71847218" border="0"  src="https://freight.cargo.site/w/1000/i/df626b1633e6dfa0c323d4866d1696ef51fef0155a195988ca4f680679106ab7/1024516-avl1830srgbv001.1074-1200.jpg" /&#62;</description>
		
	</item>
		
		
	<item>
		<title>Allegiant</title>
				
		<link>https://mattebb.cargo.site/Allegiant</link>

		<pubDate>Thu, 01 Dec 2016 12:33:06 +0000</pubDate>

		<dc:creator>Matt Ebb</dc:creator>

		<guid isPermaLink="true">https://mattebb.cargo.site/Allegiant</guid>

		<description>2016
Effects Lead, Animal Logic

	

Our effects team’s work on the young adult sci-fi Allegiant was centred on a chase sequence through a post-apocalyptic desert. As the first project completed at Animal Logic with a radically new renderer and shading/lighting pipeline, my role encompassed both creative and technical leadership within the effects team, iterating and interfacing with R&#38;amp;D departments to ensure a smooth and functional technical transition.
My own creative work included design and setup of the camouflage wall force-field effect, and rigid destruction effects as part of an aircraft crash.


	



&#60;img width="1200" height="499" width_o="1200" height_o="499" data-src="https://freight.cargo.site/t/original/i/ebc7d0ccef7e4ec9cdcb0670ef42eedce78534296ae9e49a0362c93e7d76d8ff/allegiant_cw1.jpg" data-mid="71847228" border="0"  src="https://freight.cargo.site/w/1000/i/ebc7d0ccef7e4ec9cdcb0670ef42eedce78534296ae9e49a0362c93e7d76d8ff/allegiant_cw1.jpg" /&#62;
&#60;img width="1200" height="499" width_o="1200" height_o="499" data-src="https://freight.cargo.site/t/original/i/1726bed35f76ffc3b02116ad90112e5377e7a70b39b05bda1c246c70a6e5af09/allegiant_cw2.jpg" data-mid="71847229" border="0"  src="https://freight.cargo.site/w/1000/i/1726bed35f76ffc3b02116ad90112e5377e7a70b39b05bda1c246c70a6e5af09/allegiant_cw2.jpg" /&#62;


&#60;img width="1200" height="499" width_o="1200" height_o="499" data-src="https://freight.cargo.site/t/original/i/71bc48b9dd9df6bf1fdbadde0533c7c535d0046dda2ad9d70be78933fadf55ea/allegiant_bf2.jpg" data-mid="71847226" border="0"  src="https://freight.cargo.site/w/1000/i/71bc48b9dd9df6bf1fdbadde0533c7c535d0046dda2ad9d70be78933fadf55ea/allegiant_bf2.jpg" /&#62;
&#60;img width="1200" height="499" width_o="1200" height_o="499" data-src="https://freight.cargo.site/t/original/i/518785bc60ce29b6d4b21868d3556358294603b750f256a4dcc2424ac00249e3/allegiant_bf1.jpg" data-mid="71847225" border="0"  src="https://freight.cargo.site/w/1000/i/518785bc60ce29b6d4b21868d3556358294603b750f256a4dcc2424ac00249e3/allegiant_bf1.jpg" /&#62;
&#60;img width="1200" height="499" width_o="1200" height_o="499" data-src="https://freight.cargo.site/t/original/i/2770ca6af2641953c11e12fb32e2360e1b7d7c78c14f2e98fe2c4a44520b06b7/allegiant_bf3.jpg" data-mid="71847227" border="0"  src="https://freight.cargo.site/w/1000/i/2770ca6af2641953c11e12fb32e2360e1b7d7c78c14f2e98fe2c4a44520b06b7/allegiant_bf3.jpg" /&#62;</description>
		
	</item>
		
		
	<item>
		<title>Jupiter Ascending</title>
				
		<link>https://mattebb.cargo.site/Jupiter-Ascending</link>

		<pubDate>Sat, 25 Jul 2015 11:20:04 +0000</pubDate>

		<dc:creator>Matt Ebb</dc:creator>

		<guid isPermaLink="true">https://mattebb.cargo.site/Jupiter-Ascending</guid>

		<description>2014
Effects Artist, Double Negative

	

Amongst one-off effects tasks, my work on Jupiter Ascending focused on developing and executing the glowing re-entry burn-up effects as our lead character's spacecraft enters the harsh atmosphere of the planet Jupiter.


	



&#60;img width="1200" height="500" width_o="1200" height_o="500" data-src="https://freight.cargo.site/t/original/i/c50648ab617e3677ebaf96f76ebdf34f34a273cec05cbf0451983cd89337246e/ja_reentry1.jpg" data-mid="71847240" border="0"  src="https://freight.cargo.site/w/1000/i/c50648ab617e3677ebaf96f76ebdf34f34a273cec05cbf0451983cd89337246e/ja_reentry1.jpg" /&#62;
&#60;img width="1200" height="500" width_o="1200" height_o="500" data-src="https://freight.cargo.site/t/original/i/e03d03669432106e7fef5bd5c3e96bc32112815e07b2745606e76d8a31579220/ja_reentry2.jpg" data-mid="71847241" border="0"  src="https://freight.cargo.site/w/1000/i/e03d03669432106e7fef5bd5c3e96bc32112815e07b2745606e76d8a31579220/ja_reentry2.jpg" /&#62;

&#60;img width="1200" height="500" width_o="1200" height_o="500" data-src="https://freight.cargo.site/t/original/i/e366116ef1fdad3b2ef17edc3af7d7ef07992dacfef416b38f22267b2f303c89/ja_reentry4.jpg" data-mid="71847243" border="0"  src="https://freight.cargo.site/w/1000/i/e366116ef1fdad3b2ef17edc3af7d7ef07992dacfef416b38f22267b2f303c89/ja_reentry4.jpg" /&#62;
&#60;img width="1200" height="500" width_o="1200" height_o="500" data-src="https://freight.cargo.site/t/original/i/8756f165a0d23a2156d8382c697058f0688819af4e689151fc5537866bca6106/ja_reentry3.jpg" data-mid="71847242" border="0"  src="https://freight.cargo.site/w/1000/i/8756f165a0d23a2156d8382c697058f0688819af4e689151fc5537866bca6106/ja_reentry3.jpg" /&#62;
&#60;img width="1200" height="500" width_o="1200" height_o="500" data-src="https://freight.cargo.site/t/original/i/146ac3512532bc0a2b2f3a4fa03bd3f106791981ef9fe700fd045db2a32c9fff/ja_reentry5.jpg" data-mid="71847244" border="0"  src="https://freight.cargo.site/w/1000/i/146ac3512532bc0a2b2f3a4fa03bd3f106791981ef9fe700fd045db2a32c9fff/ja_reentry5.jpg" /&#62;
&#60;img width="1200" height="500" width_o="1200" height_o="500" data-src="https://freight.cargo.site/t/original/i/36a00a7ff5f3dcf3f28af893e9148b27fc8e7070ab9f40765299677719fcf338/ja_debris1.jpg" data-mid="71847239" border="0"  src="https://freight.cargo.site/w/1000/i/36a00a7ff5f3dcf3f28af893e9148b27fc8e7070ab9f40765299677719fcf338/ja_debris1.jpg" /&#62;</description>
		
	</item>
		
	</channel>
</rss>