The values of the American West have been preserved in television. We watch the history of the Wild West unfold across the many broadcast series that aired over three decades. What was lost to history? And was anything won back in the dramatizations?