A link model-driven approach to analyzing transmission control protocol (TCP) performance over a wireless link is presented. TCP packet loss behavior is derived from an underlying two-state continuous-time Markov model. The approach presented here is (to our knowledge) the first that simultaneously considers (1) variability of the round-trip delay due to buffer queueing; (2) independent and nonindependent (bursty) link errors; (3) TCP packet loss due to both buffer overflow and channel errors; and (4) the two modes of TCP packet loss detection (duplicate acknowledgments and timeouts). The analytical results are validated against simulations using the ns-2 simulator for a wide range of parameters: slow and fast fading links, and small and large link bandwidth-delay products. For channels with memory, an empirical rule is presented for categorizing the impact of channel dynamics (fading rate) on TCP performance.
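The two-state continuous-time Markov channel underlying the analysis can be illustrated with a minimal simulation sketch. This is not the paper's derivation; it is an assumed Gilbert-Elliott-style loss channel in which the link alternates between a "good" and a "bad" state with exponential holding times, and each packet is lost with a state-dependent probability. All parameter names (`rate_gb`, `rate_bg`, `p_loss_good`, `p_loss_bad`) are illustrative assumptions, not quantities from the paper.

```python
import random

def simulate_two_state_channel(n_packets, rate_gb, rate_bg,
                               p_loss_good, p_loss_bad,
                               pkt_interval=1.0, seed=0):
    """Sketch of a two-state (Good/Bad) continuous-time Markov loss channel.

    rate_gb: transition rate out of the Good state (mean good period = 1/rate_gb)
    rate_bg: transition rate out of the Bad state (mean bad period = 1/rate_bg)
    p_loss_good, p_loss_bad: per-packet loss probability in each state
    Returns a list of booleans, True where a packet is lost.
    """
    rng = random.Random(seed)
    state = "G"
    # Exponential holding time until the next state transition.
    next_switch = rng.expovariate(rate_gb)
    t = 0.0
    losses = []
    for _ in range(n_packets):
        t += pkt_interval  # packets sent at a fixed interval
        # Advance the channel state past all transitions up to time t.
        while next_switch <= t:
            state = "B" if state == "G" else "G"
            rate = rate_bg if state == "B" else rate_gb
            next_switch += rng.expovariate(rate)
        p = p_loss_bad if state == "B" else p_loss_good
        losses.append(rng.random() < p)
    return losses
```

Varying the transition rates while keeping the stationary loss probability fixed changes the burstiness of losses, which is the channel-memory effect (fading rate) whose impact on TCP the abstract's empirical rule categorizes.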