Fix False Transparency by Noise Guided Splatting

¹OpsiClear   ²Case Western Reserve University
*Denotes equal contribution.

False Transparency

False transparency is a visual artifact in 3D Gaussian Splatting (3DGS) where solid, opaque objects are incorrectly rendered with a see-through surface. When viewed interactively, this causes distracting, constantly shifting patterns from the object's interior or background to become visible on its surface. This happens because of an ill-posed optimization; the training process can't distinguish between the object's front and back surfaces and incorrectly blends them together.

Abstract

Opaque objects reconstructed by 3D Gaussian Splatting (3DGS) often exhibit a falsely transparent surface, leading to inconsistent background and internal patterns under camera motion in interactive viewing. This issue stems from the ill-posed optimization in 3DGS. During training, background and foreground Gaussians are blended via α-compositing and optimized solely against the input RGB images using a photometric loss. Because this process lacks an explicit constraint on surface opacity, the optimization may incorrectly assign transparency to opaque regions, resulting in view-inconsistent, falsely transparent output. The issue is difficult to detect in standard evaluation settings (i.e., rendering static images) but becomes particularly evident in object-centric reconstructions under interactive viewing. Although other causes of view inconsistency (e.g., popping artifacts) have been explored recently, false transparency has not been explicitly identified. To the best of our knowledge, we are the first to quantify, characterize, and develop solutions for this underreported "false transparency" artifact in 3DGS. Our strategy, Noise Guided Splatting (NGS), encourages surface Gaussians to adopt higher opacity by injecting opaque noise Gaussians into the object volume during training, requiring only minimal modifications to the existing splatting process. To quantitatively evaluate false transparency in static renderings, we propose a transmittance-based metric that measures the severity of this artifact. In addition, we introduce a customized, high-quality object-centric scan dataset exhibiting pronounced transparency issues, and we augment popular existing datasets (e.g., DTU) with complementary infill noise specifically designed to assess the robustness of 3D reconstruction methods to false transparency. Experiments across multiple datasets show that NGS substantially reduces false transparency while maintaining competitive performance on standard rendering metrics (e.g., PSNR), demonstrating its overall effectiveness.
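To make the ambiguity concrete, here is a toy sketch (not from the paper; the colors and opacities are hypothetical) using the standard front-to-back α-compositing rule of 3DGS. A half-transparent front surface "completed" by the object's far side reproduces almost the same training pixel as a truly opaque surface, so a purely photometric loss cannot separate the two, even though half of the light in the second case originates behind the front surface and shifts with the viewpoint.

```python
# Toy illustration of the false-transparency ambiguity (hypothetical values).
import numpy as np

def composite(colors, alphas, background):
    """Front-to-back alpha compositing along one ray.

    Returns the final pixel color and the residual transmittance T = prod(1 - alpha_i).
    """
    pixel, T = np.zeros(3), 1.0
    for c, a in zip(colors, alphas):
        pixel += T * a * np.asarray(c, dtype=float)
        T *= 1.0 - a
    return pixel + T * np.asarray(background, dtype=float), T

red, green_bg = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]

# (a) A nearly opaque front surface of a red object.
pix_a, _ = composite([red], [0.99], green_bg)             # ~[0.99, 0.01, 0.00]

# (b) A half-transparent front surface backed by the object's back surface:
# the composited training pixel is almost identical, so the photometric loss
# cannot tell (a) and (b) apart, yet the front surface alone is only 50% opaque.
pix_b, _ = composite([red, red], [0.50, 0.99], green_bg)  # ~[0.995, 0.005, 0.00]

# Transmittance through the *front* surface alone quantifies the leak:
print(pix_a, pix_b, 1 - 0.99, 1 - 0.50)  # 0.01 (opaque) vs 0.50 (falsely transparent)
```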

Interactive 3D Comparison

Explore the visual impact of Noise Guided Splatting (NGS). The 2x2 grid below shows: Top-left: Standard 3DGS (no infill). Top-right: 3DGS with green infill to reveal false transparency. Bottom-left: Our NGS result (no infill). Bottom-right: Our NGS result with green infill, showing reduced transparency. Use the carousel to choose an item below. Note that false transparency is a distinct issue from other view-inconsistency problems like "popping artifacts," which are caused by sorting errors and are not the focus of this work.

3DGS (No Infill)

3DGS (With Green Infill)

Ours (NGS, No Infill)

Ours (NGS, With Green Infill)

Method: Noise Guided Splatting (NGS)

Overview of the NGS method (Figure 1 from paper)

Overview of NGS (Noise Guided Splatting). (a) Object-centric 3DGS render of a stone. (b) Noise Gaussians are introduced into the object's volume during the training process. These noise Gaussians have fixed size/location, random color, and trainable opacity for pruning. Surface Gaussians are fixed during this noise optimization phase. (c) Visible noise Gaussians that appear in front of surface Gaussians are removed during optimization, leaving subsurface noise filling the object. (d) This noise infill can be recolored (e.g., to green) and saved for evaluating transparency in any splatting-based method. (e) When the recolored infill is inserted into a vanilla 3DGS model, highly transparent regions on the surface become evident as the green infill "leaks" through. (f) In contrast, the recolored infill does not leak through the surface of an NGS-trained model, demonstrating its improved opacity.
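The pruning in panel (c) can plausibly be read as opacity-based culling: visible noise Gaussians have random colors and therefore raise the photometric loss, so their trainable opacities are driven toward zero during the frozen-surface phase, while occluded noise receives little gradient and stays opaque. Under that assumption, a minimal sketch of the culling step (the threshold and array names are hypothetical):

```python
# Minimal sketch of pruning visible noise Gaussians, assuming culling is done by
# thresholding the opacities learned during the frozen-surface noise phase.
import numpy as np

PRUNE_THRESHOLD = 0.1  # assumed cutoff, not the paper's exact value

# Opacities after the noise-only optimization: visible noise has been pushed
# toward zero by the photometric loss, occluded (subsurface) noise has not.
noise_opacity = np.array([0.02, 0.95, 0.88, 0.01, 0.97])

keep = noise_opacity > PRUNE_THRESHOLD   # drop formerly visible noise
subsurface_ids = np.nonzero(keep)[0]
print(subsurface_ids)                    # indices kept as interior infill -> [1 2 4]
```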

NGS addresses false transparency by strategically placing noise Gaussians within the object volume. These noise Gaussians obstruct direct lines of sight between the front and back surfaces, forcing the optimization to prioritize foreground surface reconstruction and assign it higher opacity. The method is designed as a plug-and-play add-on that requires minimal modifications to existing 3DGS frameworks: primarily the introduction of noise Gaussians and an alpha consistency loss (Lα). Noise initialization computes a convex hull from the existing Gaussian primitives to approximate the object's volume, converts it to a coarse occupancy grid, and then populates the grid with multi-scale noise Gaussians, as in the sketch below.
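A minimal sketch of this initialization, assuming a Delaunay-based inside-hull test, a 32³ occupancy grid, and three illustrative noise scales; the paper's exact resolution, scales, colors, and initial opacities may differ.

```python
# Hedged sketch of noise initialization: convex hull of the surface Gaussians ->
# coarse occupancy grid -> multi-scale noise Gaussians with random colors.
import numpy as np
from scipy.spatial import Delaunay

def init_noise_gaussians(surface_means, grid_res=32, scales=(0.04, 0.02, 0.01)):
    hull = Delaunay(surface_means)                     # convex hull as a simplicial complex
    lo, hi = surface_means.min(0), surface_means.max(0)
    axes = [np.linspace(l, h, grid_res) for l, h in zip(lo, hi)]
    cells = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1).reshape(-1, 3)
    inside = cells[hull.find_simplex(cells) >= 0]      # grid cells inside the hull

    layers = []
    for s in scales:                                   # one noise layer per scale (illustrative)
        layers.append({
            "xyz": inside,                                   # fixed positions
            "scale": np.full((len(inside), 3), s),           # fixed isotropic size
            "color": np.random.rand(len(inside), 3),         # random RGB
            "opacity": np.full(len(inside), 0.99),           # trainable in the noise phase
        })
    return layers

# Example with a dummy point cloud standing in for the surface Gaussian centers.
means = np.random.randn(5000, 3) * 0.3
noise = init_noise_gaussians(means)
print(sum(len(layer["xyz"]) for layer in noise), "noise Gaussians initialized")
```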

Quantitative & Qualitative Results

Average Peak Signal-to-Noise Ratio (PSNR), Structural Similarity Index Measure (SSIM), Learned Perceptual Image Patch Similarity (LPIPS), and Surface Opacity Score (SOS) on DTU, OmniObject3D, and our new Stone Dataset. Metrics marked with * were computed with the green infill inserted, and +α denotes use of the alpha consistency loss. NGS shows strong improvements in SOS and in the infill-conditioned metrics.
Dataset        Method        PSNR↑    PSNR*↑   SSIM↑   SSIM*↑   LPIPS↓   LPIPS*↓   SOS↑
DTU            3DGS          25.575   22.967   0.891   0.874    0.180    0.250     0.147
               GOF           25.648   21.109   0.880   0.816    0.209    0.273     0.179
               StopThePop    22.817   18.885   0.852   0.780    0.213    0.315     0.135
               GSplat+α      25.435   25.263   0.884   0.883    0.183    0.186     0.598
               NGS           25.428   25.427   0.881   0.881    0.192    0.192     0.749
Stone          3DGS          34.610   27.551   0.949   0.909    0.055    0.222     0.140
               GOF           31.469   21.998   0.893   0.780    0.186    0.324     0.126
               StopThePop    32.457   23.047   0.945   0.853    0.078    0.223     0.168
               GSplat+α      33.832   33.823   0.948   0.948    0.062    0.062     0.891
               NGS           34.148   34.148   0.951   0.951    0.053    0.053     0.922
OmniObject3D   3DGS          29.300   27.456   0.940   0.929    0.069    0.116     0.215
               GOF           32.259   24.931   0.970   0.898    0.062    0.122     0.208
               StopThePop    32.274   25.095   0.970   0.900    0.050    0.113     0.265
               GSplat+α      33.575   33.350   0.973   0.972    0.060    0.064     0.642
               NGS           33.619   33.578   0.972   0.972    0.060    0.060     0.736
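The paper gives the exact definition of SOS; as a rough, hedged sketch of what a transmittance-based opacity score can look like, the snippet below assumes the score averages the accumulated alpha (one minus residual transmittance) of the object's own Gaussians over foreground pixels, so 1.0 corresponds to a fully opaque surface.

```python
# Hedged sketch of a transmittance-based surface-opacity score (the paper's SOS
# definition may differ). 1.0 means no light passes through the masked surface.
import numpy as np

def surface_opacity_score(accum_alpha, foreground_mask):
    """accum_alpha: HxW accumulated-alpha map rendered from the object Gaussians
    only (i.e. 1 - residual transmittance); foreground_mask: HxW boolean mask."""
    return float(accum_alpha[foreground_mask].mean())

# Toy example: a 4x4 render where one foreground pixel is half transparent.
alpha_map = np.array([[1.0, 1.0, 1.0, 0.0],
                      [1.0, 0.5, 1.0, 0.0],
                      [1.0, 1.0, 1.0, 0.0],
                      [0.0, 0.0, 0.0, 0.0]])
mask = alpha_map > 0.0          # hypothetical foreground mask
print(surface_opacity_score(alpha_map, mask))  # ~0.944
```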

Visual comparisons highlight the effectiveness of NGS. Below, on our Stone Dataset (Figure 5 from the paper), green infill is used to reveal transparency. Baseline methods (a-d) show significant green leakage, indicating false transparency. Our NGS method (e) produces a much more opaque surface.

Qualitative Comparison on Stone Dataset (Figure 5 from paper)

Comparison based on Figure 5 from the paper: renders with green infill revealing transparency (top) and corresponding transmittance maps (bottom) for (a) 3DGS, (b) GOF, (c) StopThePop, (d) GSplat+α, and (e) NGS. NGS (e) shows minimal green infill leakage.

Noise Infill Visualization

The viewer below allows you to inspect the standalone noise infill generated by our NGS method for each model. This is the colored noise that helps enforce opacity and can be used as a diagnostic tool. Use the carousel to choose an item below.
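For readers who want to reproduce this diagnostic with their own checkpoints, here is a hedged sketch of the merge step: recolor the saved infill green and concatenate it with a trained model's Gaussians before rendering. The field names and the loader are illustrative placeholders, not a specific library's checkpoint format.

```python
# Hedged sketch of using the saved noise infill as a diagnostic: recolor it green
# and append it to any trained Gaussian model before rendering. The dictionary
# keys ("xyz", "scale", "rotation", "opacity", "rgb") are illustrative only.
import numpy as np

def merge_green_infill(model, infill, green=(0.0, 1.0, 0.0)):
    infill = dict(infill)
    infill["rgb"] = np.tile(np.asarray(green, dtype=float), (len(infill["xyz"]), 1))
    return {k: np.concatenate([model[k], infill[k]], axis=0) for k in model}

# merged = merge_green_infill(load_gaussians("model.npz"), load_gaussians("infill.npz"))
# (load_gaussians is a hypothetical loader.) Rendering `merged` with any splatting
# renderer makes false transparency visible as green leakage through the surface.
```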

Datasets

We evaluate NGS on publicly available object-centric datasets, DTU and OmniObject3D, and also introduce our new Stone Dataset to specifically test scenarios with complex geometry and textures prone to false transparency. Use the carousel to choose an item below. Click on the images in the carousel below to see a larger version.

This dataset comprises over 100 distinct stone samples. For each sample, we captured 240 images (3000 × 4000 pixels, 16-bit raw Bayer data) from 6 latitudinal and 40 longitudinal angles, covering the entire upper hemisphere. Camera poses were estimated with COLMAP, and high-quality foreground segmentations were produced with MVANet.

Noise Gaussian infill add-ons were also created for existing datasets like DTU to facilitate benchmarking of false transparency handling. The Stone Dataset and the noise infills will be made publicly available.

We additionally introduce an Objects Mix Dataset covering further objects with complex geometry and textures prone to false transparency.

BibTeX

@inproceedings{ElHakie2025NGS,
    author    = {El Hakie, Aly and Lu, Yiren and Yin, Yu and Jenkins, Michael and Liu, Yehe},
    title     = {Fix False Transparency by Noise Guided Splatting},
    booktitle = {The Thirty-ninth Annual Conference on Neural Information Processing Systems},
    year      = {2025}
}
