WHAT ARE HERBS

HERBS ARE concentrated foods, traditionally used to support the body's ability to heal itself. Herbs can offer the body nourishment it may not be receiving because of a poor diet, environmental stressors, or excess stress. Herbs work to balance the body so that it can create the best possible health for itself.