Recently, it has been shown that the capacity of certain Gaussian networks can be approximated by the capacity of the corresponding network in the discrete superposition model (DSM). The gap between the two capacities is an additive constant that depends only on the number of nodes in the network; hence, the DSM capacity is a good approximation in the high-SNR regime. Computing this capacity requires optimizing over a finite set of coding strategies, but the size of this set grows with both the number of nodes and the SNR, rendering the optimization infeasible. In this paper we derive upper and lower bounds on the capacity in the DSM. We start with the point-to-point channel and then extend our strategy to the multiple-access channel. We show that the gap between our bounds is at most an additive constant independent of the channel gains. Hence, combining our bounds with the aforementioned approximation results, we obtain closed-form bounds on the Gaussian capacity to within an additive constant.
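The additive-constant relations described above can be summarized as follows; this is only an illustrative sketch, and the symbols $C_{\mathrm{G}}$, $C_{\mathrm{DSM}}$, $M$, $L$, $U$, and $\kappa$ are notation introduced here, not the paper's own.

% Sketch of the approximation chain, under assumed notation:
% C_G  : capacity of the Gaussian network
% C_DSM: capacity of the corresponding network in the DSM
% M    : number of nodes; kappa(M): constant depending only on M
\[
  \bigl| C_{\mathrm{G}} - C_{\mathrm{DSM}} \bigr| \;\le\; \kappa(M).
\]
% If the paper's bounds give L <= C_DSM <= U with U - L bounded by a
% constant independent of the channel gains, then the Gaussian capacity
% is bracketed to within an additive constant:
\[
  L - \kappa(M) \;\le\; C_{\mathrm{G}} \;\le\; U + \kappa(M).
\]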