Tuesday, July 3, 2012

Clausewitzian theory, and its potential for turning Critical Theory into Critical Policy


An idea I've been tossing around in my head
I'll probably eventually do an academic write-up of this, but FIRST (IF YOU JUST LIKED THIS OR WHATEVER, THIS IS AN EDIT), here is the problem: 
Theory gets turned, somehow, into policy.  Critical Theory is a fantastic school of thought (and IMO is the only 'Big Idea' that the Social Sciences have had in at least a decade), but it hasn't yet addressed the implementation of policy so much as tried to change the theory which turns into policy.  Part of the problem is that even Critical students are taught in highly positivistic ways, and these methods of thinking stay with us, so there is no real 'tradition' of thought within the social sciences which is able to completely part with the problems of positivistic/rationalistic reasoning.
UNTIL NOW (except it's more like UNTIL 1830)
Clausewitz understood, as a general, that it is impossible to recreate an event to the degree that one can use it as an example in an attempt to establish some universal 'truth'.  Too many unrecorded variables exist in any situation, and even in modern examples where we know precisely the situation a general or a policymaker was in, two massive obstacles make it impossible to truly understand why a decision maker made a decision:
  • It is impossible to fully understand the baggage that a human takes with them
  • Language is a private and personal experience, as well as a contextual one, and thus even if I were to write about what frame of mind I was in years ago, the words I use now have a different meaning than the same words would have had then (a very critical idea for a soldier from the early 19th century!)
Thus, Clausewitz created an alternative method of teaching warfare, based on mental reenactment (note: this is similar to Collingwood's idea of the mental reenactment of history.  I need to read Collingwood's The Idea of History, but I'm told they're very similar ideas).  He understood that experience was the most important thing when it came to one's efficacy on the battlefield, so he endeavored to create a method of teaching which simulated experience.  Essentially, Clausewitz saw theory not as a way to create some version of absolute truth that can judge a right decision from a wrong one, but rather as a method by which individuals can understand why a decision can be difficult.
This could help transform Critical Theory into critical policy in several important ways: firstly, it avoids the positivistic/teleological/antidemocratic methodology of the mainstream social sciences and offers an actual alternative at the root of the problem (i.e. the method of teaching).  Secondly, it does away with the positivist notion that if we think hard enough we'll find some teleological policy/ideology/thought which will be perfect.  Thirdly, it incorporates personality into the problem of policymaking in a sincere and realistic way; rather than seeing emotion (or democracy, for that matter) as an impediment to real Truth (/ideal Policy), we accept our human flaws in order to create better understanding.
Woah, this turned into an academic write-up
ANY IDEAS DUDES AND LADIES AND THEYSIES

Wednesday, April 18, 2012

Human Rights as postmodern authority


This is just a selection from an essay I'm working on; apologies if it doesn't seem 100% done.
The catastrophic failure of the League of Nations taught European policymakers a series of specific lessons: the need to keep the United States involved in European security, the need for positive Franco-German relations, and, more broadly, the need to disparage, avoid, and attack nationalism.  But the League offers yet another parable, about the legitimacy of law.  From the start, the League was associated in the German, Soviet, and Fascist mind with the order of the Versailles treaty, that is, with something which was keeping Germans/the proletariat/etc. down.  Because of this, the League would never be seen as legitimate by the revisionist powers, and its dictates would be followed or disobeyed strictly according to power politics.
On the other hand, in associating the United Nations with a concept of Universal Human Rights, the makers of the new world order performed an act of legal salvation.  The late 19th century saw the creation of a school of Legal Realism, that is, the realization that law is made rather than found (O’Brien 2011, p. 69).  Although this realization did much to end judicial legislation, it also tremendously limited the importance of law: it brought with it the cynical, Austinian view that a law is merely a rule plus coercion (Franck 1990, p. 28).  This only gets at half of law: law isn’t only enforced, but has moral authority as well, and a good law will possess both; without authority a law will never be followed, and without enforcement a law will be casually broken.
So even though the League was imbued with the coercive power of two of the largest empires that the world has ever seen, it found its dictates rarely followed, not only because they were seen as products of a liberal order which benefited only the Western European nations (Armstrong 1996, p. 35; Perlmutter 1997, pp. 58-59), but also because they were seen as mere legislation, representing principles which were not “made generally applicable but [were] confined almost entirely to the territories of the defeated powers” (Franck 1990, p. 159).
What changed after the Second World War was that none of the actors involved wanted another war to happen.  Furthermore, the language of human rights created the new ‘myth’ around which the post-War and (increasingly) the post-Cold War order has oriented itself.  Rather than the Catholic hierarchy of Medievalism or the scientism of the Enlightenment and Industrial eras, the 20th and 21st centuries in Europe have been driven by differing perspectives towards emancipation and human rights.  Human rights and anti-war norms gave international laws their authority in the postwar period, to the degree that the rules of the current international system “display authority in themselves, which is to say that they are obeyed despite the fact that the system has no sovereign” (ibid., p. 27).
Legal Realism was a correct assertion which ended in desolate horror: the discovery that laws are constructed led to a might-makes-right view which terminated in the horrifying policies of Nazi Germany and the Soviet Union.  Emancipation, and human rights, have an authority which acknowledges its own construction.  While emancipation may not have the solid existential authority of God behind it, it is a form of authority which can survive secularism or modernism: it is a postmodern authority. 

Sunday, March 18, 2012

The Way that International Relations is taught at an undergraduate level


Is a joke.  This isn't a knock against students of international relations in any sense, in fact with hindsight it's an amazing display of fortitude that any of us end up graduating.
Why do I make this claim, and with what information?  Certainly I'm not saying this after some massive study of the way that international relations is studied.  However, I have a large number of friends who study international relations at places as disparate as Grinnell, American, SIS, Tulane, and the SUNY system, and this criticism is a product of conversations I've had with those students, of my experience at SUNY Purchase, and of looking through the International Relations tag on tumblr.
The first problem with the way that IR is taught comes in introductory classes, and it is endemic: read any tumblr post about an IR intro class and you will be confronted with the same dichotomy, the dichotomy between Realism and Idealism.
Anyone familiar with modern international relations theory would recognize that this dichotomy is ridiculous and decades old.  For one, no one calls themselves an 'idealist' anymore.  The modern counterparts to political realism, from liberal institutionalism to the English School to Critical International Relations, all have deeply developed methodologies and ontologies far beyond the blind faith that the moniker 'idealism' suggests, and many of these schools are so different from the methodology of idealism that to call them 'growths' from that school is to grossly oversimplify modern theory.
While I realize that introductory classes must oversimplify by their very nature, we need to ask: if IR classes are oversimplifying, why are they giving us a simplistic depiction of 40-year-old theory rather than of modern theory?
Part of this is simply due to the tenure system.  Professors who graduated 40 years ago are far more likely to be teaching with tenure than an international relations student who just got their PhD, even if the tenured professor hasn't read a single book on theory since they graduated.
But there is another, more problematic reason for teaching theory in a realist-idealist dialectic: IR is taught this way to the benefit of realism.  In fact, the dichotomy was created by realist theorists as a way to disparage anti-realist theories.  There's nothing wrong with teaching realism to students, in, say, a class that deals explicitly with realist theory, but when a professor is teaching a supposedly inclusive theory class and is using a dialectic that benefits realism, then we get into the realm of indoctrination.
How many freshmen have I spoken to who have said "well, I don't want to be a realist, but it makes so much more sense than anything else"?  How many students have I met who have left international relations because they are sick of being taught a stale dichotomy that isn't relevant to our generation or our century?  How many students (especially female students) have I seen leave IR or polisci programs for programs that offer interesting methods of analysis?
In my experience, this phenomenon actually gets worse the better the school is.  The higher-ranked schools, especially the prestigious universities in Washington DC, have even more incentive to teach in an overly scientific way, because it makes the ridiculous amounts of money that they charge seem worth it.  Furthermore, the fact that you're going to an insanely prestigious college with authoritative professors has, in my experience, discouraged individual study in all but the most studious.  At Purchase, all of the students who survived the polisci program to senior year are writing their senior projects on theory that they were not taught in class.  I thought that this was a problem specific to Purchase until I started talking to my friends from Geneseo and Binghamton.
Self-teaching isn't wrong.  It is an important part of anyone's learning and is necessary regardless of how you're being taught and how good the teacher is.  But self-teaching shouldn't be the majority of one's education, especially when you are going tens of thousands of dollars into debt in order to fund that education.
With all of this said, we need to ask ourselves a question
Is the purpose of international relations to teach the next generation of international relations analysts, or is it to teach the next generation of political realists?

Saturday, March 17, 2012

How much do Americans get paid?


Seems like a simple enough question. Just look at your paychecks, add them all up, and that's how much you are getting paid in a year. Make sure to account for your spouse's salary, and that's how much an American household makes.
The reality is murkier and more complicated. Most employed Americans also receive a substantial benefits package, and that definitely matters when we determine how Americans are getting compensated. After all, a company that's paying a worker $80,000 plus a huge healthcare plan with dental is giving more to that worker than one that's paying $85,000 and doesn't even let the employee take a vacation.

It especially matters when we're considering income inequality and growth in wages for average Americans. One of the critiques of the rising-income-inequality thesis (usually from the right?) is that our usual income stats don't account for the huge advances in health care benefits that workers are getting, which, once counted, would narrow the income gap and show that American workers aren't getting exploited by companies.

Is it true? Well, a recent study shows…partially? Judge for yourself. The graph is shown below; the red line apparently includes fringe benefits.

[Graph: compensation vs. productivity over recent decades; the red line includes fringe benefits]

What strikes me about this graph:

-The 2000s really did show some increase in compensation. Not as much as the 90s, but some. Apparently it was all sucked up by health care, though. So, if you want to improve incomes, you have to tackle the health care cost problem (a toy decomposition after this list makes that concrete).

-The late 1980s were not a good time economically at all. Median incomes were stagnant, and productivity growth wasn't very strong either. Why was this? It's a big disconnect from the general trend of the post-1970s. Maybe those big deficits do matter after all…

-The 1990s retain their standout performance relative to the 2000s and the 1980s. However, they don't look QUITE as good compared to the 2000s anymore.

-For the most part, compensation really does track productivity, which cuts against the theory that the two are totally disjointed now. The only big change is in the late 1970s, which opened up a huge gap that was never closed. What does this say about the 1970s?
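
To make the health-care point concrete, here's a toy decomposition in Python with made-up numbers (none of these figures come from the study above - they're purely illustrative):

```python
# Toy decomposition of total compensation into cash wages and benefits.
# All numbers are hypothetical, purely for illustration.
wage_start, wage_end = 50_000, 50_500          # nearly flat cash wages
benefits_start, benefits_end = 8_000, 13_000   # rising employer health benefits

comp_start = wage_start + benefits_start
comp_end = wage_end + benefits_end

print(f"wage growth:         {wage_end / wage_start - 1:+.1%}")  # +1.0%
print(f"compensation growth: {comp_end / comp_start - 1:+.1%}")  # +9.5%
# Wages look stagnant while total compensation still rose, with essentially
# all of the gain absorbed by health benefits.
```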

Monday, March 12, 2012

Who Wins, Who Loses With the Minimum Wage?

Or, what economics is really about. At least from my point of view.

I've posed this question to several friends in the past. It's pertinent. It's politically charged. It engages passion. And it's a great way to show off your econ knowledge, whether it's econ 101 or PhD level. Over time, I like to think my own answer has grown more informed and qualified, while staying instructive at the same time.

Here's how I answer this question:

It's complicated. Duh. Economics always is.


First off, I'm going to make the assumption that it's an EFFECTIVE minimum wage. If the min wage is set at 15 cents an hour, no one cares, because no one gets paid that. So, let's assume that the minimum wage has been set high enough to actually increase some wages.

Since it is actually increasing wages above the market equilibrium, we can immediately assume the market is not going to be operating IN equilibrium anymore. Which means there is not going to be a price at which supply=demand. One is going to be higher than the other, which means there is going to be either a shortage or a surplus. 

In this case, it's a definite surplus. The price is high, which means the quantity demanded has decreased and the quantity supplied has increased, so supply>demand. Surplus. And since we're talking about job markets, that means unemployment. Since we're talking about an increase in unemployment, we are PROBABLY going to get some losers right there, namely the people who are now unemployed but would otherwise have been employed. Not only that, but there are now more people entering the labor market, because they think there are high wages, so there are even more unemployed.

Ex: Doctors get paid $100,000 a year and there are 100 doctors, all employed. Raise the wage to $150,000 and only 80 jobs are available. 20 doctors are now unemployed, right off the bat. But 50 more people went to medical school, because they THOUGHT they were going to get $150,000 salaries. They're all unemployed now, too.
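
Here's a minimal sketch of that doctor example in Python, with linear supply and demand curves fitted to the numbers above (the straight-line curves are my own assumption; only the endpoints come from the example):

```python
# Hypothetical linear labor market fitted to the doctor example:
# demand falls from 100 jobs at $100k to 80 jobs at $150k;
# supply rises from 100 seekers at $100k to 150 seekers at $150k.

def jobs_demanded(wage):
    """Jobs employers offer at a given annual wage."""
    return 100 - (wage - 100_000) * 20 / 50_000

def jobs_sought(wage):
    """Workers seeking a job at a given annual wage."""
    return 100 + (wage - 100_000) * 50 / 50_000

for wage in (100_000, 150_000):
    demanded, sought = jobs_demanded(wage), jobs_sought(wage)
    employed = min(demanded, sought)        # only matched pairs actually work
    unemployed = max(sought - demanded, 0)  # the surplus of workers
    print(f"wage ${wage:,}: {employed:.0f} employed, {unemployed:.0f} unemployed")
# wage $100,000: 100 employed, 0 unemployed
# wage $150,000: 80 employed, 70 unemployed (20 displaced + 50 new entrants)
```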

It's also important to note that the COMPOSITION of the workforce is going to change. Because the wages are higher, you are going to attract stronger candidates than the ones who used to be in the pool. So, say the doctor wage was 100 grand and I was thinking about becoming a banker who makes 200 grand. Now the doctor makes 150 grand? Well, I'm okay with taking a slight pay hit, because being a banker sucks. I'll go be a doctor instead.

That actually matters a lot with the min wage. If you raise the min wage, it's going to attract a whole lot of people into minimum wage jobs who are actually going to displace the people already working them. Especially since, now that they are paying more for labor, businesses are going to want to make more certain that their employees are good employees. Single mother? Out, too unreliable; in comes the college student who didn't think $8 an hour was worth his time but loves $10 an hour. Ex-cons? Screwed. Possible drug addicts, minorities? Eh...

But it definitely does help those workers who can get those jobs. And it helps ANOTHER group of workers, too, who DON'T make minimum wage.

Go back to the banker example. If the bank knows I might leave, now they will offer me a raise. This happens a lot, too. Increasing the minimum wage doesn't just raise the wages of workers who are on the minimum; it raises the wages of a lot of people across the economy (generally on the lower end of the scale).

But, still, marginal workers might not necessarily benefit. Businesses? Well, that's an interesting issue...

To some extent, businesses can pass on the higher costs to their customers. Less profit, for sure, but not necessarily zero profit. Also, like I have said before, most companies have absolutely no clue how to hire or pay their workers. There are probably cases where increasing the minimum wage has HELPED companies. They are getting better workers, and actually making an effort to train them and retain them now that they have to pay them so much more. The company gets more productive. This is what Henry Ford did with his Model T. His workers all sucked, he had lots of turnover, so he dramatically increased his pay and got the best workforce in all of Detroit. And, no, he didn't pay his workers enough so they could all buy Model Ts, that's just the most ridiculous nonsense I have ever heard in a history class...

There's also a difference between the long term and the short term. In the short term, yeah, a business might be hurting. In the long term, the increased cost of labor is going to mean substitution towards capital, which means a capital deepening in the economy that COULD make everything more productive in the long run. It definitely means more money for whoever makes capital goods. I.e., if you have to hire 4 people at $10 an hour, suddenly an automated computer system looks a lot nicer than if you only had to pay them $6 an hour. And that's good news for whoever makes the computer system.
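
A quick break-even sketch of that substitution logic, assuming 2,000 work hours a year per worker and a made-up $60,000/year cost for the automated system (both assumptions are mine, purely for illustration):

```python
# Hypothetical break-even between 4 workers and an automated system.
workers = 4
hours_per_year = 2_000  # assumed full-time hours per worker
machine_cost = 60_000   # assumed annual cost of the automated system

for wage in (6, 10):
    labor_cost = workers * hours_per_year * wage
    cheaper = "machine" if machine_cost < labor_cost else "labor"
    print(f"${wage}/hr: labor costs ${labor_cost:,}/yr -> {cheaper} wins")
# $6/hr:  labor costs $48,000/yr -> labor wins
# $10/hr: labor costs $80,000/yr -> machine wins
```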




So, again. Complicated.



You notice here that there is no absolute answer, no policy conclusion, no recommendation, no statistical analysis, no paper, nothing of note, really.


Except for one thing. Consideration of lots of different variables, specifically unknowns.




What any good education SHOULD teach you is how to ask questions, figure out what you don't know, come up with hypotheses, and protect you from sexy narratives that look good but don't have any explanatory power.


The fun thing about economics education is that it does just that. And it's designed to do just that, because the focus on unintended consequences, and how one market changes another, is always there. It's intrinsic to the subject.

Saturday, March 10, 2012

To Reduce the Budget Deficit, you've gotta reduce the Trade Deficit.

One of the better innovations of the Modern Monetary Theory/Modern Monetary Realism way of viewing the economic world (MMR being the non-political, non-policy branch of that school of thought, and the one I consider myself closer to in some significant beliefs, such as on trade deficits) is its view of how the different sectors interact to promote growth: primarily the government, private, and foreign sectors, with the private sector sometimes broken down into household and corporate. Check out this graph of the US:

[Graph: US sectoral balances over the last 50 years]

With this information in mind, to continue growing the economy there must be an expansion of one of the three sectors. With the continual rush to savings in 2008, much of the economic slack was blunted by a fiscal deficit and a blip in the trade deficit. What this chart of the last 50 years also shows is that the trade deficit or surplus truly determines what kind of growth the US will achieve and from what sectors. Thus, if you really want to reduce the budget deficit, you must first reduce the trade deficit. We have two major sources of the trade deficit: oil, and China. On the first front, it's imperative that we implement smart government policy to reduce consumption and, at least in the short term, to increase production. This process is already under way, with the price of gasoline at the pump driving people to consume less and to change habits towards buying more fuel-efficient vehicles.
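
For what it's worth, the sectoral balances framing rests on an accounting identity: the private, government, and foreign balances have to sum to zero. Here's a minimal sketch with made-up numbers (in % of GDP) showing why a smaller trade deficit lets the private sector keep net-saving alongside a smaller budget deficit:

```python
# Sectoral balances identity: private + government + foreign = 0.
# All figures are hypothetical, in % of GDP.
government = -8.0  # budget deficit: government spends more than it taxes
foreign = 3.0      # trade deficit: the rest of the world net-saves against us
private = -(government + foreign)
print(f"private sector balance: {private:+.1f}% of GDP")  # +5.0%

# Shrink the trade deficit and the same private-sector saving is consistent
# with a smaller budget deficit:
foreign = 1.0
government = -(5.0 + foreign)
print(f"budget balance needed: {government:+.1f}% of GDP")  # -6.0%
```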

Much more could be done: increasing CAFE standards on cars/trucks/SUVs on a quicker timeline, subsidizing hybrid or electric vehicles even more, promoting another cash-for-clunkers at the onset of the next downturn (as a smart counter-cyclical measure), and furthering R&D grants towards things like battery technology and fuel-efficient vehicles with ground-breaking technology. Additionally, some studies have found that 25% of gasoline is consumed as a result of congested traffic. It's also important that we further investment in our infrastructure, to take cars off the road via mass transit systems (which are notoriously bad in places like the South), as well as to build new roads and expand existing ones to reduce congestion. This would not only provide a ton of jobs in an economically depressed time of excess capacity, but it would go a long way towards reducing our trade deficit via oil. On the production side, it seems pretty likely that the booms in Texas and North Dakota (Bakken Shale) will continue in earnest - perhaps the FHA and FEMA could coordinate adding extra housing and trailers for those who want and need jobs in those places - I certainly would find it judicious to promote the boom in North Dakota to attract people from all around the country who need jobs. In any event, with extra investment, production and consumption of oil should converge at a certain point, and thus eliminate nearly half of our persistent trade deficit, and hopefully lower the price at the pump in general!

On the Chinese front, what is important is that we keep pressure on China to continue appreciating its currency, which seems to be targeting about 4% this year (although there are threats to the contrary). It's also important that we continue to negotiate trade liberalizations so that our exports can reach their market (media imports, for instance, as well as piracy enforcement). Finally, the United States in general has to start implementing pro-export policies that make it easier for American businesses to export to other countries. It is said that Canada has 3x our trade subsidies. Can't we at least do what Canada is doing, if not Europe? Trade deficits and surpluses are zero-sum after all, as the experience of Germany and China can tell you, so we should extend loans, information, logistics support, and other subsidies for exports to other countries, including China. I suggest we be on the right side of that ledger, for the sake of sustainable growth, to reduce our budget deficit, and to satisfy the fiscal hawks once and for all in a sustainable way. It would also draw private-sector cash off the sidelines (the private sector is currently undergoing a long deleveraging process), further reducing the budget deficit.

So that is what I propose doing on the spending side. Of course, I don't think it's completely necessary that we raise taxes to pay for it, as it will pay for itself via a lower trade deficit and expanded growth. But if you want to appease the fiscal hawks, then I suggest we pay for it by repealing most of the 2003 Bush tax cuts (the estate, capital gains, and dividend breaks in particular), and maybe the upper bracket (or millionaires, or whatever's necessary) of the '01 brackets. Or cut a little defense spending. Or better yet, invest in R&D to produce more fuel-efficient military technology that consumes less gasoline and thus saves money!

It should seem so obvious…

Wednesday, March 7, 2012

The Bad Economics of '60 Minutes Aid'

Shannon Beebe, one of the founding members of AFRICOM and a high-ranking officer in it, said that one of the biggest problems on the African continent is the uncertain nature of life there.  This applies to bad things: new diseases, unpredictable weather affecting an agrarian economy, unstable rather than merely corrupt and authoritarian governance.  But it also applies to good things, namely what I call '60 Minutes Aid', or, when I'm a little bit more intoxicated, 'sympathetic white kid aid'.

Quick explanation of the concept: by the labyrinthine processes of popular culture, a piece about some horrible event occurring in Africa sifts to the top of the public consciousness, maybe through 60 Minutes, maybe through tumblr or a chain email.  What ends up happening is that a ton of well-off Americans with disposable income see the somethingtastrophe and, since, heck, they didn't know anything about the problem until an hour ago, they throw money at it, generally through Western NGOs.

What's wrong with this?  many would ask.  Even a little bit of money and attention going to a problem must be good, right?
Kinda no, not really, no, is the answer I'd give.  See, Jane Jacobs, in her fantastic study of urban economics, described a thing she called disastrous money.  It generally came from government or massive corporate sources into poor neighborhoods, all at once, and had disastrous effects.  Why?  Let's take building as an example.  Normally, you would build one small building at a time in a city because, well, that's what most people can afford to build.  However, let's say Robert Moses decides to build a colossal building that takes up more than a superblock.  This building will then dominate the area around it economically: it will start by charging high rents (because it's a new building) and eventually start demanding high maintenance (because it's an old building).  The point I'm getting at is that disastrous money can end up dominating a local economy.

Why is that bad?  I would go so far as to say that '60 Minutes Aid' is a worse form of disastrous money, because it comes into an economy once, and there is no guarantee that it will continue to come in.  So you have a massive amount of money (or food or whatever; money is actually infinitely worse because it's easier for warlords to steal) coming into an area that very well may not have a money economy.  Noting that a lot of these resources are probably going to be picked up by the very warlords we're funding campaigns against (sub-Saharan Africa has the highest use, by region, of private military forces), and noting the horrible effect that massive food aid can have on an economy that primarily produces food, this money will probably have little or negative effect.

An economy can't sustain itself on one massive 'hit' of money; it needs a constant flow of smaller transactions.  This massive 'hit', whether it's people volunteering, money coming in, or food, may, if successful, create a successful microeconomy in the short term.  But the problem is that this economy is reliant on the money you've thrown at it.  After a while that money stops coming in, and only the people who made long-term decisions with it pan out.  Well, that and drug barons and warlords.

And this hits at the biggest problem of African aid (outside of government aid): in the US, we have a huge bias against established African institutions.  Their military is corrupt, their government is corrupt, their civil society groups are probably corrupt.  So instead we give our money to Western aid groups who have nowhere near the knowledge of these established organizations, and we don't put any money into security because that's yucky and goes against our idealistic whatever.

And then that money comes to nothing, but at least we get to pat ourselves on the back.

Saturday, March 3, 2012

Isolationism as provincialism

In any conversation you have, any broad-based discussion of foreign policy (especially in the US), you will eventually get someone touting some form of isolationism.  "Hey, they're [X miles away], why do I have to care about them?"  This especially came up in light of the conflict in Libya, with isolationists painting Libya as a second Iraq, but it will come up in the context of any peacekeeping mission or intervention.

I have...a lot of problems with this idea.  On a purely emotional level, as a foreign policy student, isolationism seems ostrich-like--it is a refutation of the importance of anything going on outside our borders, and is ignorant of the transnational nature of modern threats.  But I dislike it on an intellectual level as well.  Isolationism has generally relied on ahistorical examples for support--"Hey, we were isolationist in the 19th century, and that worked out pretty well for us!  Never mind that we expanded our territory via war just as much as all of the other imperialists!"

But lastly, and most gratingly, isolationists generally re-purpose the language of realism (a subject which I will get to later) to make an argument against any form of humanitarian intervention.  By this I mean a scientistic view of some unchanging and objectively determined "national interest" which we are going against by doing X or Y.  This ignores the fact that the 'national interest', as a social construction, is entirely subjective and is created and altered by whichever reader or writer is participating in creating it at that particular moment.

It is understandable, for multiple reasons, that people generally think of national security/the national interest as an objective fact, not least because the conception of national interest as an objective fact gives power to supposed 'experts' (think pundits), but it suffices to say that national interest isn't a real thing.

And this is what particularly grates on me about isolationism, and how provincial it is--it assumes, firstly, that everyone else has the same national interest as we do, and that this conception of a national interest is more important than human lives and emancipation.  The reason that humanitarian intervention is necessary is that many countries don't have the same idea of national security as we do--that, say, to the Libyan government a year ago, the security of Qaddafi trumped the security of hundreds of thousands of people.  And, as the civil war in Libya was going on, the isolationist wing of American politics brought a series of high-minded arguments against the intervention.  But we need to ask ourselves whether these high-minded arguments and our imagined ideas of nationality and sovereignty trump the lives and well-being of others.

Links to Economic Influences

Test post for myself. In the meantime, let me briefly give some links as to what's been influencing my mind concerning economics as of late:

Cullen Roche at Pragmatic Capitalist, as well as Modern Monetary Realism (MMR), which he branched off from MMT:
http://pragcap.com/
http://pragcap.com/resources/understanding-modern-monetary-system
http://monetaryrealism.com/

Dense stuff. In that same vein, something more general from Mosler et al. on Modern Monetary Theory (MMT):
http://moslereconomics.com/
http://mmtwiki.org/wiki/MMT_Overview
http://www.washingtonpost.com/blogs/ezra-klein/post/you-know-the-deficit-hawks-now-meet-the-deficit-owls/2011/08/25/gIQAHsoONR_blog.html <--- the WaPost's little piece on it - not entirely accurate, but decent enough.

Steve Keen (a Post-Keynesian) and Richard Koo (a more neo-classical Keynesian) are still badass. Everyone should listen to what Koo says when he releases a note - it's prophetic:
http://www.youtube.com/watch?v=lnwEGeMQRCs&feature=fvst (Keen)
http://www.debtdeflation.com/blogs/
http://www.youtube.com/watch?v=HaNxAzLKegU and http://www.youtube.com/watch?v=Tt3KdH1uk-c&feature=related (Koo): his general oratory.
http://www.alsosprachanalyst.com/real-estate/richard-koo-on-china-there-will-be-blood.html <--- prophecy on China's real estate bubble collapse, "There will be blood"
http://articles.marketwatch.com/2011-12-07/markets/30717491_1_japan-banks-japanese-banks-european-banks <--- prophecy on current EU crisis

I still like my old favorite neo-classical economists too, such as Paul Krugman, Joseph Stiglitz, Greg Mankiw, Ken Rogoff, etc. But I like this stuff better, as it tends to describe reality better than neo-classical theory does, imo. It seems that all the reservations and confusions I had about neo-classical economics while I considered myself a Keynesian are indeed addressed by a nice literature of new ideas and new economists.

What say you?

Also, Rob, I noticed that you used "Rational" instead of "Reason". Heh, intentional or not?