This paper presents a LiDAR-assisted panoramic visual simultaneous localization and mapping (SLAM) system for a mobile mapping system (MMS). Our work is motivated by the question of whether SLAM is feasible for an MMS equipped with a panoramic camera and a tilted LiDAR but no GPS/IMU. Owing to the significant disparity in spatial sensing coverage between the two sensors, we show that the panoramic camera is better suited than the tilted LiDAR to serve as the primary sensor for SLAM in this sensor combination. However, existing panoramic visual SLAM systems produce results that are only correct up to scale, making them unsuitable for many applications that require metrically scaled outputs. To address this limitation, we develop a panoramic visual SLAM system that exploits LiDAR points to produce metrically scaled results. First, the proposed system augments visual features with ranges derived from LiDAR points. The augmented visual features are then fed into the SLAM pipeline, which performs tracking, local mapping, and loop closing. Finally, the scale information carried by the augmented ranges is incorporated into the pipeline through the creation of metrically scaled map points, yielding metrically scaled SLAM results. Extensive experiments in challenging outdoor environments demonstrate the effectiveness and robustness of the proposed system.
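
To make the range-augmentation step concrete, the following minimal sketch illustrates one way visual features could be augmented with LiDAR-derived ranges. It is not the authors' implementation: it assumes an equirectangular panoramic projection, LiDAR points already transformed into the camera frame, and hypothetical helper names such as `augment_features_with_ranges` and a nearest-neighbour pixel-distance threshold `max_pixel_dist`.

```python
import numpy as np


def project_to_equirectangular(points_cam, width, height):
    """Project 3D points (camera frame, metres) onto an equirectangular
    panoramic image. Returns pixel coordinates and point ranges."""
    ranges = np.linalg.norm(points_cam, axis=1)
    x, y, z = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
    lon = np.arctan2(x, z)                         # azimuth in [-pi, pi]
    lat = np.arcsin(np.clip(y / ranges, -1.0, 1.0))  # elevation in [-pi/2, pi/2]
    u = (lon / (2.0 * np.pi) + 0.5) * width
    v = (lat / np.pi + 0.5) * height
    return np.stack([u, v], axis=1), ranges


def augment_features_with_ranges(features_uv, lidar_points_cam,
                                 width, height, max_pixel_dist=3.0):
    """Attach a metric range to each visual feature if a LiDAR point
    projects close enough to it; otherwise the range stays None and the
    feature behaves as in purely visual (up-to-scale) SLAM."""
    lidar_uv, lidar_ranges = project_to_equirectangular(
        lidar_points_cam, width, height)
    augmented = []
    for uv in np.asarray(features_uv, dtype=float):
        dists = np.linalg.norm(lidar_uv - uv, axis=1)
        idx = int(np.argmin(dists))
        rng = float(lidar_ranges[idx]) if dists[idx] <= max_pixel_dist else None
        augmented.append((uv, rng))
    return augmented


if __name__ == "__main__":
    # Toy usage: a few random LiDAR points and image features.
    rng_gen = np.random.default_rng(0)
    lidar_points = rng_gen.uniform(-10, 10, size=(500, 3))
    features = rng_gen.uniform(0, [2048, 1024], size=(20, 2))
    result = augment_features_with_ranges(features, lidar_points, 2048, 1024)
    print(sum(r is not None for _, r in result), "of", len(result),
          "features received a metric range")
```

Under these assumptions, features that fall outside the tilted LiDAR's limited field of view simply remain without a range and contribute only up-to-scale constraints, while the ranged features anchor the metric scale of the map points.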