
Isn’t Hospice Care Just Giving Up?

American society is often described as a death-denying culture. In general, we do not like to think about, talk about, or acknowledge death as an inevitable reality. While we understand logically that we will all die someday, the topic remains uncomfortable and is frequently swept under the rug.