A sign that reads, "Corporate social responsibility."

Image credit: Shutterstock

Corporate social responsibility is the idea that companies should give back to the community and the world at large. It can take many forms, from setting up a charitable department that finds worthy causes to donate to, to revamping production to be more environmentally friendly. It can even be as simple as treating employees better through improved wages and benefits.

Studies have shown that consumer perceptions of such actions matter. Companies that make a habit of giving back to their communities reap the benefits, generally in the form of higher profits. That matters now more than ever, as a growing number of consumers are becoming socially conscious.

Of course, there are always cynics who see such actions as simply an attempt to cash in on goodwill. But it turns out that employees can tell the difference, and they don't take kindly to their employers faking it.

A study published in September 2016 by researchers from the University of Ottawa looked at responses from more than a thousand workers. The researchers found that employees could tell when their company was engaging in charity simply to attract attention and boost profits. But that's not all they found.

The researchers also found that workers who believed their employer's charitable efforts were disingenuous put in less effort on the job. Some responses revealed that these workers were even willing to look for another job.

Think about it for a second. If you're working for a company that claims to care about the environment but actually engages in environmentally harmful practices, wouldn't you resent it? And if the company can't be trusted to tell consumers the truth, why would you, as an employee, trust it either?