Native plants heal us if we let them. Plants are the basis of all life on land. Without plants, there would be no animals. In modern scientific/capitalist society, though, we only value plants that can make money, those we can sell, eat, wear, or enjoy looking at. Plants…