Don’t be misled when it comes to desert regions: the images of barren landscapes and oppressive sun that may come to mind are often preconceived ideas picked up from movies or storybooks. In reality, the deserts of the western U.S. are home to thriving cities, fascinating plant life, and an inspiring resilience of life and spirit.