Has anyone heard that the ethics classes now being required actually teach people how to get around being ethical? What do you think about ethics in our society? Do you feel that greed has taken over to the point where we have let ethics become a thing of the past? A few years ago, when I worked with people from various parts of the country, I joked one day that some people see politeness as a sign of weakness and go in for the kill. Do you feel like you have to put on a coat of armor, setting politeness aside and staying distrustful, just to deal with people and survive in society?
Well, here is the article that brought up my questions:
Oh, one other thought: I believe that company culture starts at the top, and how people act is based on what comes from the top. If you look at the CEO, you can get a real snapshot of the entire company culture. Do you agree?