Are Foods Labeled "Natural" Healthier for You?
I have seen some foods labeled "natural." What does that mean? Are they healthier?
The word "natural" on a food product usually means the food has no added colors, artificial flavors or other man-made substances. However, the Food and Drug Administration has not established a legal definition for the word "natural," so the term is not a reliable guide. A "natural" food isn't necessarily better for you. Compare the Nutrition Facts labels on foods to learn more about what you are eating.
Julie Garden-Robinson, Food and Nutrition Specialist, NDSU Extension Service
Featured in the Food Wise April 2017 newsletter