A few days ago
jc_chicks2001

Should schools and businesses teach Ethics?

Top 8 Answers
A few days ago
lmnop

Favorite Answer

Absolutely! Of course, you ultimately come down to the question of what the final authority for ethical conduct in the business world is. In my case, the Bible is the guiding document. Others would say the law, and still others the conventional practices of a given industry. Even if that question is never settled and the resolution is simply for each of us to be honest with at least ourselves about where we draw certain lines, the process you're suggesting would still help each of us understand why we feel the way we do about these things.

A few days ago
Owl
Teaching ethics? Most certainly. I don't know if anyone remembers when so many West Point cadets were found cheating on their exams. I don't recall whether it was 60 Minutes that did a report on cheating in America, but the report indicated that, compared to years past, people today don't see anything wrong with cheating unless they get caught.

A few days ago
Meg…Out of Hybernation
Most accredited colleges and universities offer a course in Ethics (typically required for degrees in business, law, and management).

All three universities I attended had Ethics courses for my major.

A few days ago
Anonymous
Yes, it would help keep America from going down the drain.

A few days ago
Anonymous
Yes, and many schools do. They call it character education.

A few days ago
Robsthings
They do, but if parents haven't touched on the subject by then, how can the students succeed?

A few days ago
Dragon’sFire
You should have some by then!

A few days ago
gilgamesh
They do.