Realistic real-time simulation of water caustics


In this post, I want to use WebGL and ThreeJS to generalise the caustics computation in real time. It is important to emphasise that this is only an attempt: finding a solution that works in every scenario and runs at 60 frames per second is difficult, if not impossible. However, as you will see, this approach can achieve some very good results.

What exactly are caustics?

Caustics are light patterns formed when light is refracted and reflected from a surface, such as an air/water interface.

The light patterns you see on water waves are produced by the water acting as a dynamic magnifying lens, focusing and defocusing the light as it is reflected and refracted.

This post focuses on the caustics created by light refraction, i.e. what happens underwater.

To keep a constant 60 fps, we need to compute them on the graphics card (GPU), so we will use GLSL shaders.

To compute them, we must perform the following steps:

Compute the refracted rays at the water's surface (straightforward in GLSL, which provides a built-in refract function for this; see the sketch after this list)

Using an intersection technique, determine where those rays intersect the environment.

Calculate the intensity of caustics by examining where the rays converge.
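For the first step, GLSL's built-in refract function does most of the work. Below is a minimal fragment-shader sketch of that step; the uniform and varying names (uLightDir, vWaterNormal) are illustrative assumptions, not taken from any existing demo.

```glsl
// Fragment shader sketch: compute the refracted light direction at the
// water surface. The uniform/varying names are illustrative.
precision highp float;

const float ETA = 1.0 / 1.33;   // air-to-water ratio of refractive indices

uniform vec3 uLightDir;         // normalized direction the light travels in
varying vec3 vWaterNormal;      // water surface normal from the vertex shader

void main() {
  // refract(I, N, eta) expects a normalized incident vector and normal.
  vec3 refracted = refract(normalize(uLightDir), normalize(vWaterNormal), ETA);

  // The refracted direction is what steps 2 and 3 work with; here we simply
  // visualize it as a color for debugging.
  gl_FragColor = vec4(refracted * 0.5 + 0.5, 1.0);
}
```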


The well-known WebGL water demonstration

Evan Wallace's WebGL demo of visually convincing water caustics has fascinated me for a long time: madebyevan.com/webgl-water

I strongly recommend reading his Medium post on how to compute them in real time using a light-front mesh and GLSL partial-derivative functions (a sketch of the idea follows below). His approach is fast and visually appealing, but it has some limitations: it only works with a cubic pool and a sphere in the pool. You cannot put a shark underwater and expect the demo to work, because the shaders are hard-coded to assume the immersed object is a sphere.
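To give an idea of the partial-derivative trick, here is a sketch of the principle only, not Wallace's actual shader, and the varying names are assumptions: the caustics intensity can be estimated by comparing the area of a cell of the light grid before and after refraction, using the screen-space derivative functions dFdx and dFdy.

```glsl
// Sketch of the partial-derivative idea (not Wallace's actual shader):
// estimate how much neighbouring rays converge by comparing the area of a
// light-grid cell before and after refraction.
#extension GL_OES_standard_derivatives : enable
precision highp float;

varying vec3 vOldPosition;  // where the ray would land without refraction
varying vec3 vNewPosition;  // where the refracted ray actually lands

void main() {
  // Approximate cell areas from the screen-space derivatives of the two
  // positions across neighbouring fragments.
  float oldArea = length(dFdx(vOldPosition)) * length(dFdy(vOldPosition));
  float newArea = length(dFdx(vNewPosition)) * length(dFdy(vNewPosition));

  // Converging rays (newArea smaller than oldArea) give brighter caustics.
  float intensity = oldArea / max(newArea, 1e-6);
  gl_FragColor = vec4(vec3(intensity), 1.0);
}
```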

The reason for immersing a sphere was that determining the intersection of a refracted light ray and a sphere was straightforward, needing just very basic math.

All of this is fine for a demo, but I wanted a more general solution for the caustics computation, so that any kind of unstructured mesh, like a shark, could float around in the pool.

Let us now turn to the approach itself. I will assume you are already familiar with the basics of 3D rendering via rasterization, and with how the vertex shader and fragment shader work together to draw primitives (triangles) on the screen.

Working within the constraints of GLSL

GLSL (OpenGL Shading Language) shaders can only access a subset of scene data, such as:

Attributes of the vertex currently being drawn (position: 3D vector, normal: 3D vector, etc.). You can send your own attributes to the GPU, but they must have a GLSL built-in type.

Uniforms, which are constant over the entire mesh you are currently drawing. A uniform can be a texture, the camera projection matrix, a light direction, or anything else, but it must have one of the GLSL built-in types: int, float, sampler2D (for textures), vec2, vec3, vec4, mat3, mat4. A minimal example follows this list.
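As a concrete picture of those two kinds of inputs, here is a minimal vertex shader. The custom uniform uLightDir is just an example; Three.js automatically declares position, normal, projectionMatrix and modelViewMatrix when you use ShaderMaterial, so they are written out explicitly here only for clarity (as you would with RawShaderMaterial).

```glsl
// Vertex shader sketch: the only scene data it sees are per-vertex attributes
// and per-draw-call uniforms.
precision highp float;

attribute vec3 position;        // per-vertex attribute
attribute vec3 normal;          // per-vertex attribute

uniform mat4 projectionMatrix;  // constant for the whole draw call
uniform mat4 modelViewMatrix;   // constant for the whole draw call
uniform vec3 uLightDir;         // custom uniform, e.g. the light direction

varying vec3 vNormal;           // passed on to the fragment shader

void main() {
  vNormal = normal;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
```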

However, there is no way to access other meshes that are present in the scene.

As a result, the webgl-water demo was made with a very simple 3D scene in mind. It was easy to compute the intersection of the refracted ray with a very simple shape described by uniforms: a sphere can be defined by a position (3D vector) and a radius (float), both of which can be passed to the shaders as uniforms, and the intersection calculation boils down to fairly simple arithmetic that runs quickly in a shader, as sketched below.
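To illustrate how little arithmetic that takes, here is a sketch of a ray/sphere intersection in GLSL, with the sphere described entirely by two uniforms. The names are illustrative and not taken from the webgl-water source.

```glsl
// Fragment shader sketch: ray/sphere intersection with the sphere described
// entirely by two uniforms. Names are illustrative.
precision highp float;

uniform vec3 uSphereCenter;
uniform float uSphereRadius;

varying vec3 vOrigin;   // ray origin (e.g. a point on the water surface)
varying vec3 vRay;      // refracted ray direction

// Returns the distance along the (normalized) ray to the nearest hit,
// or a negative value if the ray misses the sphere.
float intersectSphere(vec3 origin, vec3 dir) {
  // Solve |origin + t * dir - center|^2 = radius^2 for t.
  vec3 oc = origin - uSphereCenter;
  float b = dot(oc, dir);
  float c = dot(oc, oc) - uSphereRadius * uSphereRadius;
  float disc = b * b - c;
  if (disc < 0.0) return -1.0;
  return -b - sqrt(disc);
}

void main() {
  float t = intersectSphere(vOrigin, normalize(vRay));
  // White where the refracted ray hits the sphere, black otherwise.
  gl_FragColor = vec4(vec3(t > 0.0 ? 1.0 : 0.0), 1.0);
}
```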


Some ray-tracing shader techniques encode meshes in textures, but this is not feasible for real-time WebGL rendering in 2020. We must keep in mind that we need to compute 60 images per second, using enough rays to obtain a satisfactory output. If we compute the caustics with 256x256 = 65,536 rays, that is already about 4 million ray/environment intersection tests per second, and the cost also grows with the number of meshes in the scene.

We need to find a way to represent the underwater environment as uniforms and perform the intersection computation while maintaining a reasonable frame rate.
