Augmented Reality (AR) has the potential to be an effective visualization tool for planning and operations design in construction, manufacturing, and other process-oriented engineering domains. One of the primary challenges in creating AR visualizations is to project graphical 3D objects onto a user's view of the real world and create a sustained illusion that the virtual and real objects co-exist across time in the same augmented space. However, regardless of the spatial relationship between the real and virtual objects, traditional AR scene-composition algorithms display the real world merely as a background and superimpose virtual objects in the foreground. This creates incorrect visual occlusion artifacts that break the illusion that real and virtual objects co-exist in AR. This research implements and demonstrates a robust depth-sensing and frame-buffer algorithm for resolving incorrect occlusion in outdoor AR applications. A high-accuracy Time-of-Flight (TOF) camera capable of suppressing background illumination (e.g., bright sunlight) in ubiquitous environments is used to capture a depth map of the real world in real time. The preprocessed distance information is rendered into the depth buffer, which allows visible and hidden portions of virtual objects to be correctly resolved in the OpenGL color buffer when generating the composite AR scene. An optimized approach that takes advantage of OpenGL textures and the GLSL fragment processor is also proposed to speed up sampling distance values and rendering them into the frame buffer. The designed algorithm is validated in several indoor and outdoor experiments using the SMART AR framework. The AR space with the occlusion effect enabled demonstrates convincing spatial cues and graphical realism.
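To make the depth-buffer step concrete, the sketch below shows one minimal way such a compositing pass could look in legacy OpenGL; it is illustrative, not the authors' implementation. It assumes the metric TOF distances have already been preprocessed into normalized window-space depth values in [0,1], and the helpers drawVideoBackground and drawVirtualObjects are hypothetical stand-ins for the AR framework's rendering calls.

```c
/*
 * Minimal sketch of depth-buffer-based occlusion handling (illustrative,
 * not the authors' code). Assumes depthMap holds width * height floats
 * of preprocessed, normalized window-space depth in [0,1].
 */
#include <GL/gl.h>

/* Hypothetical helpers supplied by the AR framework. */
void drawVideoBackground(void);   /* real-world camera image, no depth writes */
void drawVirtualObjects(void);    /* graphical 3D content */

void compositeFrame(const float *depthMap, int width, int height)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawVideoBackground();

    /* Write the sensed depth of the real scene into the depth buffer
     * only: disable color writes and force every fragment through. */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_ALWAYS);
    glDepthMask(GL_TRUE);
    glWindowPos2i(0, 0);
    glDrawPixels(width, height, GL_DEPTH_COMPONENT, GL_FLOAT, depthMap);

    /* Restore state; virtual fragments lying behind real surfaces now
     * fail the depth test, so the real-world pixels show through. */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthFunc(GL_LESS);
    drawVirtualObjects();
}
```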
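The optimized texture/GLSL path could take roughly the following form; this is an assumed sketch, not the authors' shader. The depth map is uploaded once per frame as a single-channel float texture (assuming the GL_ARB_texture_float extension), and a full-screen quad is rasterized with a fragment shader that copies the sampled value into gl_FragDepth, filling the depth buffer on the GPU instead of through the slower glDrawPixels path.

```c
/*
 * Sketch of the texture + GLSL fragment-processor variant (assumed form).
 * Requires GL_ARB_texture_float for the float internal format.
 */
#include <GL/gl.h>
#include <GL/glext.h>

static const char *kDepthFragSrc =
    "uniform sampler2D tofDepthMap;                                  \n"
    "void main(void)                                                 \n"
    "{                                                               \n"
    "    /* depth map already normalized to window space on the CPU */\n"
    "    gl_FragDepth = texture2D(tofDepthMap, gl_TexCoord[0].st).r; \n"
    "}                                                               \n";

void uploadDepthTexture(GLuint tex, const float *depthMap, int w, int h)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE32F_ARB, w, h, 0,
                 GL_LUMINANCE, GL_FLOAT, depthMap);
}
```

After binding a program built from kDepthFragSrc (with color writes masked off, as in the previous sketch), drawing a screen-aligned quad replaces the glDrawPixels call while producing the same depth-buffer contents.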