Q1
In many of my classes we discuss America as a WASP society, meaning that White Anglo-Saxon Protestant culture is the dominant culture. Do you think that Protestant Christianity is still the dominant religion in America? Are we still a Christian nation? Also, what did you find interesting?
Q2
Do you feel that health is completely defined by culture, or is it a concept that is somewhat independent of culture? Since this lecture argues that health is dependent upon culture, what do you think?