Content-Aware Fill is a powerful tool. I look forward to the day Blender has something like it built in. Until then, we can use a small trick with projection mapping to bring Photoshop's Content-Aware Fill functionality into Blender. Here is a short breakdown of this technique.
A very common problem in photogrammetry is missing certain areas during your capture, which leaves some surfaces with missing or bad textures. An example is the following dataset:
A short walkthrough of a little hack for converting a portrait painting or drawing into a 3D model with minimal texturing and none of the usual hassles, such as a proper unwrap. For this I am using Blender, although you could use any software. As the base I am using this painting by Ninoriel. You can view the model on my Sketchfab.
Yesterday, I watched a fantastic talk by Ian Hubert, titled World Building in Blender, in which he explains how he uses projection mapping in VFX to quickly create very acceptable environments and props…
A detailed breakdown of creating an animated aura effect for a sword or another object using shaders in Unity 3D. The technique, however, is just as usable in Unreal or any other environment.
There are some alternatives to the…
A breakdown of the Matrix shader effect written by Will Kirby, and an implementation of a real-time Matrix shader in Unity 3D with triplanar mapping. This means the shader can be dropped on any mesh and will work without any specific asset preparation.
Original Shader: https://www.shadertoy.com/view/ldccW4
My GitHub repository containing the code: https://github.com/IRCSS/MatrixVFX
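The core idea of triplanar mapping is simple: sample the texture along the X, Y, and Z axes and blend the three samples by how much the surface normal faces each axis. Here is a minimal sketch of just the blend-weight part in Python (purely illustrative, not the actual HLSL from the repository; the `sharpness` parameter is my own naming):

```python
# Illustrative sketch of triplanar blending weights: a texture is
# projected along the X, Y, and Z axes, and the three samples are
# blended according to the surface normal.
def triplanar_weights(normal, sharpness=4.0):
    """Return blend weights for the X, Y, Z projections, summing to 1."""
    wx, wy, wz = (abs(c) ** sharpness for c in normal)
    total = wx + wy + wz
    return (wx / total, wy / total, wz / total)

# A face pointing straight up is sampled entirely from the Y projection:
print(triplanar_weights((0.0, 1.0, 0.0)))  # (0.0, 1.0, 0.0)
```

Raising the components to a power before normalizing makes the transitions between projections tighter, which hides the blurry blend zones on angled surfaces.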
In this post I will first look at the Shadertoy implementation and explain it line by line, in the hope of demystifying some of the standard techniques, and then very briefly explain the Unity implementation.
When you want to understand a shader, one…
How to add a skybox / panorama / equirectangular sphere mesh to your Sketchfab scene.
I recently uploaded some scenes with skyboxes to Sketchfab. Since some people asked how to do the same, here is a quick guide.
You can see an example of this on one of my Sketchfab meshes, a peak in South Tyrol: https://skfb.ly/6TUZv
There is an official Sketchfab blog post on how to add a skybox; however, I find the shape of that skybox a bit weird, and it doesn't look good with panos: https://sketchfab.com/blogs/community/how-to-add-a-skybox-to-your-model/
Here are the general steps to get your pano into your Sketchfab scene:
I was recently reading the art book of Spider-Man: Into the Spider-Verse and was inspired by the art style choices for the portal scenes and the influence of Cubism on them. So I decided to create a Cubism shader in Unity 3D.
As usual, you can find the code on my GitHub: https://github.com/IRCSS/Cubism-Shader
You can also try it out on a normal PC or in VR by downloading the build: https://github.com/IRCSS/Cubism-Shader/releases/tag/v1.0
I frequently receive messages from people interested in working in the game industry as programmers. Typical questions are what jobs exist in the game industry, what skill sets they require, what one should do to prepare for applying, whether it is very challenging, etc.
Since I always end up giving the same answer, I thought I might as well write a post about it. The game industry is huge, and I am only experiencing a tiny section of it. All I can base my answer on is my very limited perspective, so my first suggestion…
Using BlueprintNativeEvent and inheritance to create functions that can be partially defined in C++ and partially in Blueprint. With this setup, you can create a fairly simple component system for sharing the same behavior between several Actors.
This post is one of my documentation posts: it was written quickly and is not structured like my other posts.
The Blueprint/C++ system in Unreal Engine is a very powerful tool. For our latest project, I set out to come up with a system that fulfills two criteria. Firstly, we wanted behaviours that are…
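Unreal specifics aside, the underlying pattern is easy to sketch in plain Python: a component declares an event with a native default implementation (the `_Implementation` body on the C++ side), a subclass (the Blueprint in Unreal) overrides only that event, and the shared native logic keeps running either way. All class and method names below are hypothetical, purely for illustration:

```python
# Language-agnostic sketch of the BlueprintNativeEvent pattern.
class InteractionComponent:
    def interact(self, actor):
        # Shared "native C++" logic that always runs.
        log = [f"{actor}: shared setup"]
        log += self.on_interact(actor)  # the overridable "event"
        return log

    def on_interact(self, actor):
        # Native default implementation (the _Implementation body in C++).
        return [f"{actor}: default behaviour"]

class DoorComponent(InteractionComponent):
    def on_interact(self, actor):
        # The "Blueprint" override: only the event part is specialised.
        return [f"{actor}: open door"]

print(InteractionComponent().interact("Player"))  # shared setup + default
print(DoorComponent().interact("Player"))         # shared setup + door logic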
After importing a 3D mesh from software like Blender, Maya, or Max into Unity or Unreal Engine, you might notice that the number of vertices in the engine is higher than the number of vertices you see in the 3D software.
I was recently writing a demo where I ran a compute shader per vertex. I noticed that the number of vertices I had in Blender was far lower than the number Unity showed me. I thought exporting the mesh with smooth shading would solve that, but it turned out there were things here I had never considered. Researching…
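The short version: a GPU vertex is the unique combination of all its attributes (position, normal, UV, and so on), so whenever the same position is used with different normals or UVs, as happens at hard edges and UV seams, the engine has to duplicate it. A minimal sketch of that counting, with made-up data purely for illustration:

```python
# Why engines report more vertices than your modelling software:
# a GPU vertex is the unique (position, normal, uv, ...) combination,
# so positions reused with different normals or UVs get duplicated.
def engine_vertex_count(corners):
    """corners: per-face-corner (position, normal, uv) tuples."""
    return len(set(corners))

# Two triangles sharing an edge, with a hard edge between them: the
# two shared positions each appear with two different normals.
corners = [
    ((0, 0, 0), (0, 0, 1), (0, 0)),
    ((1, 0, 0), (0, 0, 1), (1, 0)),
    ((0, 1, 0), (0, 0, 1), (0, 1)),
    ((1, 0, 0), (0, 1, 0), (1, 0)),  # same position, different normal
    ((0, 1, 0), (0, 1, 0), (0, 1)),  # same position, different normal
    ((1, 1, 0), (0, 1, 0), (1, 1)),
]
positions = len({c[0] for c in corners})
print(positions, engine_vertex_count(corners))  # 4 positions, 6 GPU vertices
```

With fully smooth shading and no UV seams the two counts match, which is why smooth export helps but does not eliminate the gap on real meshes.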