Texture-Accurate Scanning: A Novel Approach

June 17, 2023

Texture-Accurate Scanning: Bringing Objects to Life

When developing our scanner effect, we wanted to go beyond simple color representation. Our goal was to accurately represent the actual textures of scanned objects in our particle system. This presented a unique challenge, but we developed a novel solution that we're excited to share.

The Challenge

Many particle systems use a single color or a simple gradient, sometimes based on a particle's distance from the camera. We wanted our scanner to go further and emit particles that accurately reflect the texture of the object at the exact point where the scan ray hits. This required us to:

  1. Access the object's texture data efficiently
  2. Determine the exact texel (texture pixel) hit by each ray
  3. Use that texel's color for the emitted particle

Our Solution: MeshScannerSurface

We developed a MeshScannerSurface component that handles this texture-accurate particle emission. Here's how it works:

1. Texture Caching

First, we cache the texture data to avoid repeated texture reads:

MeshScannerSurface.cs
 
private static Dictionary<int, Color[]> textureColorCache = new Dictionary<int, Color[]>();
private Color[] textureColors;
private int textureWidth;
private int textureHeight;
 
private void Start()
{
    // ... other initialization code ...
    int textureID = texture2D.GetInstanceID();
    if (!textureColorCache.TryGetValue(textureID, out textureColors))
    {
        // First time we see this texture: read its pixels once and cache them.
        textureColors = texture2D.GetPixels();
        textureColorCache[textureID] = textureColors;
    }
    textureWidth = texture2D.width;
    textureHeight = texture2D.height;
}

This caching mechanism allows us to reuse texture data across multiple instances of the same texture, significantly improving performance. (Note that Texture2D.GetPixels only works on textures imported with Read/Write enabled.)

2. Texture Coordinate to Color Mapping

When a scan ray hits the object, we use the hit's texture coordinates to find the corresponding texel color. (RaycastHit.textureCoord is only populated when the ray hits a MeshCollider; for other collider types it is zero.)

MeshScannerSurface.cs
 
void IScannable.EmitParticle(RaycastHit hit, VFXEmitArgs overrideArgs)
{
    Color color = defaultColor;
    // ... other code ...
    if (textureColors != null)
    {
        Vector2 pCoord = hit.textureCoord;
        // Clamp so a UV of exactly 1.0 doesn't map past the last texel
        // (or wrap onto the first texel of the next row).
        int x = Mathf.Clamp(Mathf.FloorToInt(pCoord.x * textureWidth), 0, textureWidth - 1);
        int y = Mathf.Clamp(Mathf.FloorToInt(pCoord.y * textureHeight), 0, textureHeight - 1);
        // Convert 2D coordinates to a 1D index into the cached pixel array
        color = textureColors[y * textureWidth + x];
    }
    // ... other code ...
}

This method takes the hit's texture coordinates (which are in UV space, ranging from 0 to 1) and maps them to our cached texture data array.
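The same mapping can be sketched language-agnostically. Here is a minimal Python version (uv_to_index is our name for illustration, not a Unity API); note that Unity's GetPixels returns a row-major array starting at the bottom-left texel:

```python
import math

def uv_to_index(u, v, width, height):
    """Map UV coordinates in [0, 1] to a 1D index into a row-major
    pixel array, clamping so u == 1.0 or v == 1.0 stays in range."""
    x = min(math.floor(u * width), width - 1)
    y = min(math.floor(v * height), height - 1)
    return y * width + x

# Examples on a 4x4 texture:
print(uv_to_index(0.0, 0.0, 4, 4))  # 0  (bottom-left texel)
print(uv_to_index(1.0, 1.0, 4, 4))  # 15 (top-right texel, thanks to clamping)
```

The clamp is the important detail: without it, a hit exactly on the texture edge produces an index one past the last texel of its row.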

3. Particle Emission

Finally, we emit a particle with the exact color from the texture:

MeshScannerSurface.cs
 
ParticleCollector.Instance.CacheParticle(hit.point, color, size);

This particle is then rendered by our VFX system, resulting in a scan effect that accurately represents the object's texture.

The Result

This approach allows our scanner to create a particle-based representation of objects that is remarkably true to their original appearance. As the scan progresses, it's almost like seeing the object materialize out of particles, each one perfectly color-matched to the original texture.

Performance Considerations

While this method provides high visual fidelity, it does come with some performance overhead:

  1. Memory Usage: We're caching entire textures in memory. For projects with many unique textures, this could lead to high memory usage.
  2. Initial Load Time: There might be a slight delay when an object is first scanned as its texture is loaded into the cache.
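To make the memory cost concrete, a quick back-of-the-envelope sketch (Python, since it is just arithmetic): GetPixels returns a Color[] with four 32-bit floats per texel, so each cached texture costs 16 bytes per texel.

```python
def cached_texture_bytes(width, height, bytes_per_texel=16):
    """Memory cost of caching a texture as Unity Color[] —
    four 32-bit floats (16 bytes) per texel."""
    return width * height * bytes_per_texel

# A single 2048x2048 texture cached this way occupies 64 MiB:
print(cached_texture_bytes(2048, 2048) / (1024 * 1024))  # 64.0
```

This is why the cache is keyed by texture instance ID: ten objects sharing one 2048×2048 texture cost 64 MiB total, not 640 MiB.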

However, the caching system ensures that subsequent scans of the same object (or objects sharing the same texture) are very fast.

Future Improvements

We're considering several enhancements to this system:

  1. Mipmap Support: For distant objects, we could sample from lower-resolution mipmaps to improve performance.
  2. Texture Streaming: For very large environments, we could implement a texture streaming system to load and unload texture data as needed.
  3. Compression: Implementing texture compression could reduce memory usage while maintaining visual quality.
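As a rough illustration of the first idea, a distance-based mip selection could look like the following sketch (mip_level is a hypothetical helper, not part of our current code): level 0 is full resolution, and each doubling of distance beyond a full-detail threshold drops one level.

```python
import math

def mip_level(distance, full_detail_distance, max_level):
    """Pick a mip level from scan distance: level 0 (full resolution)
    up to full_detail_distance, then one level per doubling of distance."""
    if distance <= full_detail_distance:
        return 0
    level = int(math.floor(math.log2(distance / full_detail_distance)))
    return min(level, max_level)

print(mip_level(5.0, 10.0, 4))   # 0 (close: sample full resolution)
print(mip_level(40.0, 10.0, 4))  # 2 (far: quarter-resolution mip)
```

Since each mip level quarters the texel count, sampling distant objects this way would also shrink the cached arrays considerably.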

Conclusion

Our texture-accurate scanning system demonstrates how thinking outside the box can lead to novel solutions in game development. By directly accessing and utilizing texture data, we've created a scanning effect that's not just a visual approximation, but a faithful particle-based recreation of the scanned objects.

This technique has significantly enhanced the immersion and visual quality of our game, and we're excited to see how we can further refine and expand upon this concept in the future.