Hardware vs. Wetware
Hoisted from comments: dr2chase:
Rapture of the Nerds Department: One place where we're not making a whole lot of progress is power consumption; that's most of the reason CPU clock rates have quit increasing at the glorious pace of previous decades. We can put ever more transistors on a chip, but we can't run them ever faster, so if we want our software to run faster, we must rework it to compute in parallel. It is generally believed that programming things to run in parallel is harder than programming them to run serially, and yes, people are earning a lot of money trying to make that less true; I am one of those people.
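To make the serial-to-parallel rework concrete, here is a minimal sketch in Python. The example, the function names, and the choice of concurrent.futures are my own illustrative assumptions, not anything from dr2chase's comment; the point is only that the parallel version requires restructuring (splitting work into independent chunks and recombining), which the serial version never needed.

```python
# Illustrative sketch: the same computation, serial vs. reworked for parallelism.
from concurrent.futures import ProcessPoolExecutor


def partial_sum(bounds):
    """Sum the integers in [lo, hi) -- an independent, embarrassingly parallel chunk."""
    lo, hi = bounds
    return sum(range(lo, hi))


def serial_total(n):
    # The straightforward serial version: one core, one loop.
    return sum(range(n))


def parallel_total(n, workers=4):
    # The reworked version: split the range into independent chunks,
    # farm them out to worker processes, then combine the results.
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))


if __name__ == "__main__":
    n = 10_000_000
    assert serial_total(n) == parallel_total(n)
```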
I think that this gets us to at least two "potential" singularities. In one, we can construct a computer with the potential computing power of a human brain, but we won't have a clue how to program it like a brain. In the other, we can construct such a computer, but the energy bill suggests it would be cheaper to hire a human: at $0.15/kWh and roughly 10,000 hours per year, a kilowatt running 24/7 costs about $1,500 a year. A 42U rack, computing, draws 10 kW ( http://scs.lbl.gov/html/planning.html ), so each rack costs $15,000 per year to run, not counting the cost of cooling; ten racks is $150,000 a year, in the neighborhood of a good salary. How long until we can get a brain's worth of computing power in fewer than ten racks?
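A quick back-of-the-envelope check of that arithmetic, as a sketch. The rate, hours, and per-rack draw come from the comment itself; the salary used for the break-even comparison is an assumption I've added for illustration.

```python
# Back-of-the-envelope check of dr2chase's energy arithmetic.
RATE = 0.15        # $/kWh, from the comment
HOURS = 10_000     # hours/year, the comment's round approximation of 8,760
RACK_KW = 10       # kW drawn by one 42U rack, per the LBL planning page

kw_year = RATE * HOURS            # $1,500 to run 1 kW around the clock
rack_year = RACK_KW * kw_year     # $15,000 per rack per year, before cooling
print(f"1 kW, 24/7: ${kw_year:,.0f}/yr")
print(f"one rack:   ${rack_year:,.0f}/yr")

# Assumed salary for the 'cheaper to hire a human' comparison (my number).
SALARY = 150_000
print(f"racks per salary: {SALARY / rack_year:.0f}")  # ~10 racks
```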
The other problem is simply the grubby world of business and what's-in-it-for-me. Who does it profit to program a computer like a human brain? If that much computing power can more profitably be put to some other use (especially while it is expensive), that's what will happen to it. Events of the last decade have made me worry a lot more about what ideologues and greedy bastards would do with such technology.