Food Wi$e

Are Foods Labeled "Natural" Healthier for You?

I have seen some food labeled "natural." What does that mean? Is it healthier?

The word "natural" on food products usually means the food has no added color, artificial flavors or other man-made substances. However, the Food and Drug Administration does not have a legal definition for the word "natural." "Natural" food isn't necessarily better for you. Compare the Nutrition Facts labels on foods to learn more about what you are eating.

Julie Garden-Robinson, Food and Nutrition Specialist, NDSU Extension Service

Featured in the Food Wi$e April 2017 newsletter (PDF)
