A recent Pew Research Center survey finds that only half of American adults think colleges and universities are having a positive effect on our nation. The leftward political bias of Democratic-affiliated faculty at most institutions of higher education explains a lot of that disappointment.