What was once the norm in movies and television is now a rarity. Stories about good marriages, everyday heroes (not those with superpowers), hard work, and respect for one another seem to be considered boring or unsellable. The media would rather focus on politicians' gotcha moments, report on the small percentage of people who behave badly, and highlight anything that divides people rather than unites them. Many Americans who are still figuring out what they believe, how they should act, and what constitutes proper behavior face these messages daily. Hollywood and the media seem to care more about the bottom line than about what is best for our future. With all the special investigations under way, I would like to call for one that investigates the negative impact Hollywood and the media have on American culture.