Despite numerous bounds and partial results, the feedback capacity of the stationary nonwhite additive Gaussian noise channel has remained open even for the simplest cases, such as the first-order autoregressive Gaussian channel studied by Butman, Tiernan and Schalkwijk, Wolfowitz, Ozarow, and more recently, Yang, Kavčić, and Tatikonda. Here we consider another simple special case, the stationary first-order moving average additive Gaussian noise channel, and find its feedback capacity in closed form. Specifically, the channel is given by $Y_i = X_i + Z_i$, $i = 1, 2, \ldots$, where the input $\{X_i\}$ satisfies a power constraint and the noise $\{Z_i\}$ is a first-order moving average Gaussian process defined by $Z_i = \alpha U_{i-1} + U_i$, $|\alpha| \le 1$, with white Gaussian innovations $U_i$, $i = 0, 1, \ldots$. We show that the feedback capacity of this channel is $C_{\mathrm{FB}} = -\log x_0$, where $x_0$ is the unique positive root of the equation $\rho x^2 = (1 - x^2)(1 - |\alpha| x)^2$ and $\rho$ is the ratio of the average input power per transmission to the variance of the noise innovation $U_i$. The optimal coding scheme parallels the simple linear signaling scheme by Schalkwijk and Kailath for the additive white Gaussian noise channel: the transmitter sends a real-valued information-bearing signal at the beginning of communication and subsequently refines the receiver's knowledge by processing the feedback noise signal through a linear stationary first-order autoregressive filter. The resulting error probability of maximum-likelihood decoding decays doubly exponentially in the duration of the communication. Refreshingly, this feedback capacity of the first-order moving average Gaussian channel is very similar in form to the best known achievable rate for the first-order autoregressive Gaussian noise channel given by Butman.
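As a quick numerical illustration (not part of the abstract itself), the capacity expression can be evaluated by locating the unique root of $\rho x^2 = (1 - x^2)(1 - |\alpha| x)^2$ in $(0, 1)$ and taking its negative logarithm. The sketch below is a hypothetical helper, assuming SciPy's brentq root finder and natural logarithms (capacity in nats); as a sanity check, for $\alpha = 0$ it recovers the familiar white-noise value $\tfrac{1}{2}\log(1 + \rho)$.

```python
# Hypothetical illustration (not from the paper): numerically evaluate
# C_FB = -log x0, where x0 is the unique root in (0, 1) of
#     rho * x^2 = (1 - x^2) * (1 - |alpha| * x)^2.
import math
from scipy.optimize import brentq

def ma1_feedback_capacity(rho: float, alpha: float) -> float:
    """Feedback capacity (nats per transmission) of the MA(1) Gaussian channel.

    rho   -- average input power divided by the innovation variance (rho > 0)
    alpha -- moving average coefficient, |alpha| <= 1
    """
    a = abs(alpha)
    f = lambda x: rho * x**2 - (1 - x**2) * (1 - a * x)**2
    # f(0) = -1 < 0 and f(1) = rho > 0, so the root lies strictly inside (0, 1).
    x0 = brentq(f, 0.0, 1.0)
    return -math.log(x0)

# Sanity check: with alpha = 0 the noise is white and the formula reduces
# to the Schalkwijk-Kailath / AWGN capacity (1/2) * log(1 + rho).
assert abs(ma1_feedback_capacity(10.0, 0.0) - 0.5 * math.log(11.0)) < 1e-9
print(ma1_feedback_capacity(10.0, 0.5))
```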