More Americans are inclined to reduce their meat consumption and eat more plant-based foods. Meanwhile, negative stereotypes surrounding the vegan lifestyle have been slowly fading, according to the Nutrition Business Journal’s 2015 Special Diets Report. The shift of healthier eating habits into the mainstream is encouraging news for the future of public health in this country.