Requirements for modern success

January 18, 2015 · Posted in Business, Culture

The much-touted 'character hypothesis' (which has become a staple of much modern intellectual discourse around success, often heard from writers like Malcolm Gladwell and Paul Tough) is very useful, and speaks to an understanding of the greatly changed nature of success in the post-Industrial era. However, I think the qualities associated with that hypothesis should be considered necessary, but not sufficient. To review, here is a list of qualities generally associated with it:

  • persistence
  • determination
  • self-control / the ability to delay gratification
  • curiosity
  • conscientiousness
  • self-confidence
  • (occasionally) emotional intelligence
  • good communication skills and a willingness to listen 
  • grit

I'd personally add to the list 'the willingness to always learn' (i.e., be a dedicated autodidact for life).

Based on what we've seen over the past ten years, especially with things like 'the gig economy' and our 'free agent nation', this hypothesis (perhaps model) holds up well. So what else is necessary? One or more of the following:

  • a strong personal safety net (savings and/or relatives and friends to fall back on)
  • good credentials
  • a strong personal / professional network

These last three are exactly the ones that are generally not available to those who need them most, even if they have all the qualities of the first list (you could also substitute 'incredible luck' for these three). These things may be understood, but still remain unspoken. We should remind ourselves that character alone may not be enough for success in today's world, even for the most determined, confident, and gritty of people.

We need to talk about ‘We Need to Talk About TED’

December 29, 2014 · Posted in Culture, Press, Sociology

dredmorbius started a great discussion about media quality, filter bubbles, empty content, and the information/entertainment divide that I recommend reading in its entirety (including the comments). I'm reproducing my contribution to the discussion here as its own post:

First, let's talk about democracy, particularly the democratization of information. What we've learned from a couple of centuries of modern rights-limited (aka liberal) democracy is that 1) it can be corrupted, even with the rights-limited part in place (constitutions and the like), and 2) where it works, it tends to curb excesses rather than improve things in the linear way it's sometimes imagined to. Anti-democratic forces have long warned about the downsides of democracy, one of which is demagoguery. Another is massive confusion. Our modern Upworthy/HuffPo/Fox News-ified world demonstrates that these fears were not completely unfounded.

I'm in agreement with Umair about "emptent". I've long thought of a great deal of this stuff as akin to the empty calories in sugar. Another good comparison would be fast food, and much of pop culture in general: designed to appeal to a certain sense (of taste, texture, emotion, etc.), but ultimately with nothing else behind it. It's not automatically wrong (cupcakes are tasty every so often), but when it replaces food, you have a problem. It also amplifies the "poverty of attention" problem. Being awash in information alone, even if it's valuable, poses problems of sorting and filtering; adding in voluminous amounts of noise makes the problem all the worse.

Filter failure. This, unfortunately, is not likely to get fixed anytime soon, even with AI, in my opinion. Both "disputed facts" (especially when some regard certain facts as fiction) and the ultimate subjectivity of various forces and ends mean that creating a taxonomy of ideas/information that is useful in a general sense (i.e., doesn't just appeal to individual biases and preferences) is likely in the realm of impossibility. We have a fundamental problem, as we probably well know by now: ideas, unlike physics, are (sometimes) fungible, contextual, and usually contingent. As for your question about "what to value", that's an interesting one. What Upworthy and friends are trying to do, in a sense, is really no different from any other past mechanism or device used to make various kinds of information more palatable or interesting (allegory, satire, or what have you), and in that way is actually somewhat laudable. The downsides are obvious: the sugar overtaking the nutrients, desensitization (via repetition, boredom, whatever), and the fact that we now have so much information of dubious value available that we're drowning in it (right back to the wealth-of-information, poverty-of-attention problem). In a sense, we're getting to witness and experience what marketers of commercial products and services have had to deal with for ages: a massive marketplace, full of dubious claims, competitors everywhere, consumer fickleness and boredom, and a host of other issues - and with all of their resources they haven't been able to crack it, because it's likely uncrackable. We're seeing what unmitigated speech on a large scale looks like, and (as the ACLU states, and I agree) the answer to bad speech is more speech, if only because the opposite extreme is worse. That doesn't mean it actually fixes anything; I honestly don't think it does, at least not in the short term, for reasons I'll get to below. So I think what will happen is what has happened in the past: we'll continue to come up with new (and sometimes old, but previously unfashionable) creative ways to communicate ideas that exploit the human desire for novelty.

Now I'll qualify the above by saying that it applies to non-concrete topics, like politics, social relations, etc. We can do a decent job (because of short testing cycles, among other things) with things like software development, math, some science, and so on. We of course still have the problem of a "minimal level of knowledge in order to judge properly", but at least we can demonstrate something (by running a program or showing a repeatable calculation on a calculator) without long lag times, variables that promote possible counterfactuals, etc. So I'd say that things are somewhat better in these areas. Fortunately, getting these things wrong doesn't usually wind up with people starving, homeless, or dead - but the non-concrete topics sometimes do. Even certain kinds of science are not immune here - climate science is an obvious one. Political forces, long lag times, and the difficulty of proving one-to-one causation mean that a real "ClimateOverflow" would probably not work all that well (googling for a climate science stack exchange brings up a whole lot of entries from the Skeptics SE - mostly in support of AGW, but based on citing the work of others, rather than coming from researchers themselves). Even with the preponderance of evidence in support of it, we can see just how hard it is to get universal agreement.

As a former user of Advogato and Slashdot (and an occasional current user of Reddit), I definitely agree that these were useful experiments, and were absolutely worth attempting. Quora would be another one. What these proved, however, is that while it's possible to make qualification systems better up to a certain point (which likely occurs long before an Eternal September arrives), ultimately the power of persuasiveness, cults of personality, and groupthink can overtake any system. Probably not the result the creators of these systems intended, but useful nonetheless. It does point us back in the direction of "higher quality in smaller groups", at least for now. As far as "solutions" for this problem at a large scale, I don't see any at all on the horizon. Sci-fi fantasies of Matrix-style insta-learning being realized (so we could get everyone to a base level of knowledge right away) would take care of one part of the problem, but would still leave us with the other, probably bigger one: irreconcilable value differences. That one, I believe, won't go anywhere; battle lines will continue to be drawn there for the foreseeable future, and we'll just have to deal with it.

So back to democracy, particularly democratized knowledge and the ability for anyone to become a "thought leader." Knowledge, interest, and power are the three qualities I regard as prerequisites for being what's often termed an "elite." You have to know a subject (knowledge), care about and participate in things around the subject (interest), and have the ability to effect change around said subject (power - and this power can be direct, as in politics, or indirect, like influencing the behavior of people via your writing on a large-ish or greater scale). Activists tend to have the knowledge and interest parts, but often lack power. The angry rabble have only interest, with poor knowledge and no power. Politicians and thought leaders have all three (hence, they are elites). The rest may or may not have knowledge, but definitely lack interest, often expressed via lack of participation both in conversations about subjects (in comment sections, for instance) and at the voting booth. Now, the reason I think this is important is reflected in a recurring phenomenon in societies: the bifurcation that results from this last group being separated from the rest. It's revealed in the statements of (for example) Obama regarding the "iPod government" - what he and his advisors understood is something that more knowledgeable groups with interest often don't: many, perhaps most, people do not want to be involved in the business of running anything. They have their own affairs, and want to live their lives without being bothered with the underlying substrate of society. Society should function like a utility - you pay your bill (taxes) and the things you need to happen, happen. I'm extremely sympathetic to this view, and I believe that a great many activists and others refuse to accept it, but should (and it does point to a future where we'll perhaps welcome a GAI government, with all its potential problems).

Now how does this relate back to the thought leader issue? Many people, quite simply, want to be told what to do, want to be told who is good and who is bad, and want to believe that the world is a well-ordered place. The world is an incredibly complicated place and only getting more complicated. Yes, we continue to abstract away a lot of complexity (so maybe most people only understand the level they're currently functioning at, or one level below it), but at the same time, we've increased total available information - which means we're still potentially processing a whole lot more of it, even if it's highly abstracted. So what do many people do? They either shut off (and become non-participants) or they turn to their thought leaders. People have always done this, of course, but I think the difference now is that what's required to participate in and control events around you - events mired in considerable complexity - in a useful way is much harder. Going to your local religious leader or town master to tell you which job to take or whom you should trust is quite different from trying to process the macroeconomic trends that seem distant but ultimately affect whether you'll have a job in 10 years. "You should become a blacksmith" is very different from "read Krugman, Autor, Ford, Cowen, Hanson, Thoma and the literature around technological unemployment to even be able to comment usefully on one (particularly huge and impactful, but still just one) area of modern economies." With that heaping pile of requirements tossed on people, which are they likely to choose? Spend huge amounts of time poring over those materials, which are not only complicated but have active disputes around them, or read some articles like "How These Five Robots Will Steal Your Job At Starbucks"? I can't blame many people for choosing the latter - how many forum discussions have you seen with a question like "How do I X?" (where X is some micro-subject related to money or health) only to be answered with "Oh, that's easy, all you have to do is these 10-50 things! I don't understand why this is so hard for people!"? Those 10-50 things are then multiplied by all the other things people have to remember to do. Is it surprising they turn to useless pop-finance gurus or $latest_fad_diet - or in this case, "One Superfood To Fix Your Health"?

Medicine is another example of an area affected by this. When the extent of our knowledge was how to mend fractures or the like, the intellectual distance between the doctor and the layman wasn't all that great. Today, the minimal level of knowledge required to truly understand things as common as heart disease or cancer is incredible - so much so that a specialist in brain cancer might be reluctant to opine on your blocked aorta without deferring to a cardiologist. Our specialization and division of labor does wonderful things, but like so many things, it is a double-edged sword.

To sum up:

  • The attention economy is with us for real, and will only grow.
  • Filter bubbles likely don't have any technological fix. If people continue to use them, they stay stuck in echo-chamber land. If they decide to pop their own bubbles, they may choke on the firehose and opt out - which is part of the reason the bubbles appeared in the first place.
  • A great deal of content is empty for the same reason fast food is pretty empty. It tastes good, is addictive, and people like it. We have the "reptile brain", and we also have the "mammal tongue."
  • Systems around concrete subjects fare better, non-concrete ones are in big trouble.
  • Small groups and careful management of membership in communities will probably still do better for the foreseeable future with regards to quality of communication, but aren't silver bullets, either. More experiments in trust and quality control systems are absolutely worth trying, but we should be realistic about their likely effectiveness.
  • Democratized information is not the panacea it was/is imagined to be, but the alternative is even worse, and that's not meant to be flippant. It's an incredibly ugly truth even the most hardcore supporters of free speech (among whom I count myself) have to accept, regardless of how painful that is.
  • Everyone trying to communicate a message, from politicians to activists to people selling shoes, is now a marketer, and has to understand and work with the human desire for novelty in an information/signaling marketplace of incredible size and velocity.
  • Thought leaders of dubious quality are unfortunately not going anywhere. The best we can do is try to illuminate, write exegeses, promote "good" thought leaders, change the culture, and hopefully change some minds.
  • Useful participation is likely to get harder rather than easier in the short-to-medium term. As we have job and wealth polarization, so too will we have "useful participation polarization." These things aren't completely unrelated or uncorrelated.

Some of us were born too early

Over the past century or so, we've been pulling apart many of the artifacts of our history that were affronts to personal liberty and autonomy: we've separated sex and reproduction (birth control); reproduction and marriage (wide acceptance of unmarried reproduction); marriage and religion (civil unions and their equivalents); sex and marriage (the sexual revolution, pro-sexual-freedom laws and mores); and family and personal safety and security (welfare states). Though we do have these things, it's still very rare to find people who embrace the 'core four' types of separation in their personal lives, rather than simply accepting the idea that others do these things. Tolerance, not embrace. For myself, I long ago rejected reproduction (which I consider immoral due to its violation of consent theory; a logistical, financial, and time drain; stressful; and ultimately, boring); marriage (completely outmoded); family (that is, treating people specially because of their genetic distance, rather than shared beliefs, pursuits, etc.); and religion (unnecessary at best, dangerous at worst).

Recently, I tried to count the number of people I've known in my life who share these ideas - who not only accept them, but live them, and are ironclad in their rejection of these things. The number is very small. Single digits. While that's just my personal sample, it probably wouldn't be too large a leap to assume that the number of people out there with the same outlook is small. So why is this? Two major things come to mind, one economic, and the other social.

Reason 1: fear of the loss of non-government safety nets. Is the marginal person who would reject these things afraid of being ostracized by those who provide tacit financial and logistical support if they were to openly reject them? How many people are out there having children, getting married, pretending to like their relatives, and faking devotion to maintain a financial lifeline from their parents, siblings, or 'friends'? If society gave people real autonomy - via a guaranteed income, a real universal health care system, cheap higher education, low housing costs - how many would choose to reject these things and pursue their own desires, devoid of any of the 'traditional' constraints? Just as the birth control pill helped launch the behaviors and culture associated with modern sexual freedom, and liberalized divorce laws gave women the ability to escape suffocating marriages and pursue their own desires more easily, so too might pro-autonomy economic policies allow others, held back by the need to please the holders of their implicit personal safety net, to truly go out on their own.

Reason 2: fear of social ostracism. A society of autonomous, self-actualized, knowledge- and pleasure-pursuing individualists freed from the economic constraints associated with day-to-day survival sounds like a wonderful thing to me. But even after the first constraint is fixed, we'd still have another problem - the lag time between something being possible and acceptance being wide enough that people would actually do it. Why? The marginal person might also be afraid of social rejection and loneliness - especially in places that hold very tightly to these outmoded ideas. It would likely take decades for this one to sort itself out - but just as the ideas and behaviors birthed by the sexual revolution took until the 1990s and 2000s to truly become a widespread (rather than countercultural) reality, so too would this.

Eventually, I do think we'll get there, and we'll continue to develop technologies to unshackle ourselves from other constraints (anti-aging and mind uploading would completely remove the perceived need for reproduction, for instance), but until then, we have societies that still look terribly primitive in some of our eyes. Some of us were born too early.

On being a ‘contrarian for its own sake’ in today’s world

October 23, 2014 · Posted in Culture

Contrarians-for-contrarianism's-sake suffer from an interesting problem in these times of shifting alliances, blurring lines between everything, complicated heroes and nuanced villains, and counter-culture becoming mainstream and then going back to counter-culture faster than many can keep up with: a constant grasping for what's 'truly' contrarian at any given moment. They find themselves adrift, confused, and constantly grappling with what to despise. In a fast-moving, information-overloaded culture, being a contrarian for its own sake seems like an awfully tough job.

State of things to come, medium-term-ish edition mini-roundup

Two must reads:

"Trotskyite singularitarians for Monarchism! A political speculation" by Charlie Stross

"Automation, inequality and geopolitics" by Tyler Cowen


Is everyone naked in a Dyson sphere?

November 15, 2013 · Posted in Culture, Science and technology, Sociology

There was a time, not long ago, when certain subcultures (the most salient being the Cybergoth/Industrial subculture) reflected a look at, or at least a wish about, the future. When it began in earnest in the late 1990s and early 2000s, it still felt like we were all living on the edge of an interconnected "sci-fi"-like world - and I think that feeling was correct. No one had a smartphone yet, the popular "second generation" web was just starting to hit its stride, and sci-fi had yet to truly cement itself in everyday popular culture (though it was getting there). In the world of the time, we were still cargo-culting techno-fashion as imagined by Giger, "Tron", Anime/Manga artists, and The Wachowskis. Now that the actual technology that was being aped is here, things look quite different.

Today, when we look around at the subcultural landscape, what we see looks a lot like visions of the past, rather than any imagined future. The same way that sci-fi authors are now unable to keep up with the pace of change (as reality rapidly outpaces their ability to imagine, at least in any work we would deem a recognizable future), so too have our fashions stopped imagining. We're in an interesting place where the "future" that was imagined for so long is all around us, next to us, on us, with us, every second of the day. This, coupled with so many dashed dreams about what a glorious future would look like (we got the gadgets and connectivity, but haven't come close to solving the hard problems), has caused subcultures, and to a large degree the mainstream culture, to use a backward-looking lens through which to view and imagine ourselves. Perhaps as a reaction to the ever-present technology, and the lack of anything that "feels" like more than the mundane, many people have turned to that oft-quoted search for "authenticity", where authenticity looks like things that are old, made by hand (or at least pretending to be), associated with "simpler" eras, or that feel "closer to the earth" - at least as imagined by many movie-makers and authors in the developed world.

Of course, these things go in cycles, as they did with hippies, grunge, and now all things "twee, earthy, and old", but I do wonder if this time it's less simply cyclical and more a reaction (due to fatigue and disappointment with how things have turned out so far), or perhaps a kind of "hybridization by equilibrium." That is, as the environment gets more sophisticated, people compensate by looking or acting "more authentic and/or earthy" (the reaction to the way Google Glass looks could be telling).

Writing this post, I was reminded of this quote by Fran Lebowitz, which echoes some of the thoughts here:

"I have a number of theories but one theory is that we live in the era of such innovation in technology,” Lewbowitz said. “It’s almost like we can’t do two things at once. If science or technology is going to be racing ahead, then the society is stuck. Also, I think it’s a way for people of my age to stay in the center of things."

So what happened to the future? We're living in it.

Share Or Die as a survival horror manual

Perhaps it should be titled “how to survive in an early post-apocalyptic, zombie-infested wasteland”, with chapters like “collaborative consumption” guiding the way to divvying up scarce provisions and “unprepared” showing the disconnectedness and naivete of a formerly sheltered set of people (and how quickly the naivete dissipates when said people are thrust headlong into reality). In many ways, the rise of disaster fiction dovetails nicely with the follies and problems of our age. Originally created as a critique of mindless consumer capitalism, zombie survival-horror as a genre is now perhaps more apt as both a warning for those who have yet to experience this new world first-hand and a set of training videos for surviving it. Both the resurgence of these genres and the creation of this book, in retrospect, now seem to have been inevitable.

The book deals with many issues related to our “New New Economy” (and I recommend that everyone read the original “The New Job Security” book, which I read back in the early 2000s; it turned out to be prescient quite quickly). “Share or Die” deserves a lot of credit for putting personal stories and names to things that have been studied and detailed in broad strokes, via discussions of policies like deregulation, downsizing, offshoring, outsourcing, widespread automation, and upper-income-bracket tax cuts; ballooning student debt and the associated peonage of its debtors, as well as the fact that degrees have gone from proof of competence to the most basic of HR filters; short-lived traditions like “jobs for life” now gone, and at-will employment treated the way you’d expect based on the label; the shrinking or disappearance of health benefits, and our lack of universal health care or paid leave; the dog-eat-dog economy where the dogs are getting hungrier as the opportunities shrink with each passing day; and a landscape which differs from the rosy “you can do anything if you only try” meritocratic fairytales pushed by our institutions, policymakers, teachers, and many others who grew up in an era (post-WWII to Reagan) where standards of living seemed like they’d rise forever and opportunity for everyone seemed ever-present and never-ending.

The book offers many ideas for coping – a few of which are worth entertaining – but they are mostly just the ideas of people grasping for something, anything, that will deliver us from this predicament. Unfortunately, the vast majority of these ideas will not serve most people very well, or for very long. In this way, the ineffectiveness, indifference, or complete absence of societal institutions depicted in virtually every piece of zombie (and related disaster-genre) fiction looks like an extremely fitting metaphor for our government, corporations, and charities. From the false promises of the “governments” of “Survivors” and “Threads” to their complete absence in “The Walking Dead” or “28 Days Later”, these pieces of fiction illustrate quite well the plight of those under 40, and increasingly, those over it, today. Those genres also show what happens as the supplies run out, the crazy ideas which seemed to have a hope of working (like those in the book) stop doing so, and the nerves are well past the frayed stage.

Share Or Die is an excellent, ground-level snapshot of our new age. An age without useful answers (only the same answers that have failed us for a long time), and without the promise of prosperity returning anytime soon. An age which, to someone transported here from only two or three decades ago, would seem like a radically futuristic place (and in many ways it is): the powerful devices in our hands, the always-connectedness, the truly instant communications, self-driving cars, the nascent “cyborging” of humanity, the medical advances, and a great many other wonderful things. It also looks like those other futures – the ones with the walled fortresses and black seas; poverty; despair; wealth concentration; a growing underclass; social unrest and outright rebellions; rising political divisions; and a widespread lack of trust of just about everything.

Lastly, there appears to be another parallel in the book to those movies and shows. At some point a few of the characters come to a terrible realization: that no help is coming.

The future of work, part 2

October 5, 2011 · Posted in Business, Culture, Government, Politics, Sociology

A very interesting and important video. It gets especially interesting around 1:00:00, when Cowen and Ford start talking. A few things:

- The idea of "personal shoppers" being a growth area seems dubious. With recommendation and predictive desire systems getting better all the time, I think this is probably a no-go.

- Nannies might be a growth area for a few decades, but decent AI will likely obviate the need for these as well. I'd give this one 50-50 odds over the next 4-6 decades.

- Basically everyone on that panel endorsed the guaranteed income. The fact that people from so many different perspectives had their policy prescriptions converge on that is really striking.

- This also brings us back to the "how do we occupy the unemployed masses?" question yet again. Arts and leisure activities will likely not need much support from society, as they will just happen. Sports might (stadiums, etc.). Other things we'll probably be looking at are extremely powerful, long-lasting drugs that are side-effect-free (or have tolerable side effects). That road of course leads to wireheading, but I think that denying that as a possibility now is extremely naive.

A cultural shift will be required, as several people in the video make clear. Having one's self-worth (and the respect of one's peers and society) defined by a traditional, livelihood-supporting "job" is going to have to disappear. Demonization of those "lazy ne'er-do-wells" is just not going to work in a future where much human labor is not needed.

Another thing this brings to mind is the effect on the part of the labor market that does continue to function: a great deal less unemployment, at least for a long while. For those skills that essentially cannot be automated any time soon (let's say, over the next 50-100 years), those who stay in the labor market will not be doing much competing for open positions. It may in fact be an employee's market in those cases where someone is not simply running their own business.

Source problem for public goods: culture (like so many other things)

September 7, 2010 · Posted in Culture, Economics

In regard to "Krugman and Wells, The Slump Goes On: Why?"

How do you deal with this when the problem is not economics, but culture? A culture that says that taxation is theft. A culture that says that the government spending "your" money = government telling you what to do with your money = elitism. A culture that has internalized "people are always rational"/Homo Economicus. A culture that believes, in the face of massive evidence, that markets always know better. A culture that accepts the idea that "process legitimizes outcome." How do you argue with people who (wrongly) believe that we live in an actual meritocracy and that messing with it is basically immoral?

This, of course, doesn't even scratch the surface of the other issues relating to weakness, incompetence, corruption - and even more importantly - the "not passing the necessary threshold" when we actually DO try to spend on public goods. What does this do? As a commenter noted elsewhere, it saddles Democrats with "all the stigma of big government with none of the results."

On top of all this, how can we get anything large done in a culture that is so short term-focused? What happens if a policy takes, say, 20 years to make the difference we /actually need/? In this country, you only get 2-4 or 4-8 years to prove your idea works; because of time lags, you may get voted out (and the policies changed) long before they make a difference.

We're slaves to short stints for elected officials (relative to how long certain policies take to work) and a "short termist" culture.