With sustainability becoming far more important to the public than it was in the past, many brands are changing their business models to be friendlier to the planet, or at least claiming to. When those claims don't match reality, the practice is called greenwashing: businesses marketing their products as environmentally friendly when they are not.