Showing posts with label Ideas. Show all posts

Wednesday, June 24, 2015

Fact-checking should be a business

By tradition and by default, books aren’t verified to anything near the standard of a magazine piece.
I am continually amazed at how often mainstream, otherwise trustworthy news sources get things wrong. As the quip goes, “I find that the New York Times is always right, except in areas where I have first-hand knowledge.” Even peer-reviewed scientific journals are not immune: only about 40% of results published in top-tier psychology journals can be fully replicated.

There are plenty of everyday examples:
  • Medical and health information is notoriously inaccurate, even from sources you’d hope to trust. For decades, the Center for Science in the Public Interest, as well as the National Academy of Sciences, encouraged the American public to eat trans fats.
  • A quote regularly repeated by New York Times op-ed columnist David Brooks, about a rising sense of self-importance among American adolescents, appears to be entirely wrong.
  • Many, many books have been retracted, or published with disclaimers.
What if there were an organization, like UL (that approves electrical equipment) or Consumer Reports (that recommends a variety of household products), only instead of dealing in physical goods, they put their stamp on books or magazines?  Of course we already have book reviews, often by people who are themselves experts in the subject, but how many of them go systematically through all the facts and references to be sure that every claim in the book is accurate?

Cochrane is one independent organization that tries to be systematic in its reviews of the trustworthiness of medical findings.  Verificationist is a service that offers to do fact-checking for books on behalf of publishers or authors. Morningstar and many other investment advisory firms do this for stocks and bonds. Can’t we get something similar for books?

Unfortunately I think this would be a lousy business. Too few publishers or authors would be willing to pay to have their own work fact-checked, and most customers, if given a choice, would prefer a cheaper book with facts presented in “good faith” over a more expensive one that was independently vetted.

Saturday, April 04, 2015

Data is not a substitute for strategy

When I was in grad school, my data science class assigned us an incredibly complex optimization problem: recommend the best places to locate a series of factories given expected product demand, availability of suppliers, distance to customers, wage and materials costs, etc. It was too complicated to solve on a normal PC with off-the-shelf software, so the other students simply gave up treating it as a data optimization problem and instead made recommendations based on strategic considerations.

Me? No, I used brute force: I rewrote the software to run on the more powerful and expensive campus mainframe, applying every trick I knew until I found the “correct” answer. It was tough, and at the time I was quite proud of my computer skills, thinking somehow I had bettered my fellow students.

But when I saw the other answers, I realized how silly I was to think that data could beat strategy. Sure, with sufficient computation power I was able to identify a mathematically provable solution given today’s data. But who cares? Data keeps changing. It’ll be months, maybe years before some of those factories are operating, by which time all my data assumptions would have become irrelevant. Good strategic thinking, on the other hand, doesn’t depend on fluctuations in the data.
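You can see the fragility of a data-optimal answer even in a toy version of the problem. The sketch below uses my own made-up numbers (not the original class assignment): brute-force the best two of four candidate factory sites, then nudge one demand center and watch the “provably optimal” answer change.

```python
from itertools import combinations

# Toy facility-location sketch (hypothetical numbers, not the original
# assignment): choose 2 of 4 candidate sites to minimize total
# demand-weighted Manhattan distance to the customer regions.
sites = {"A": (0, 0), "B": (10, 0), "C": (0, 10), "D": (10, 10)}
customers = [((2, 2), 100), ((8, 8), 100), ((5, 1), 50)]  # (location, demand)

def total_cost(chosen, demands):
    cost = 0.0
    for (cx, cy), demand in demands:
        # each customer region is served by its nearest chosen site
        nearest = min(abs(sx - cx) + abs(sy - cy)
                      for sx, sy in (sites[s] for s in chosen))
        cost += demand * nearest
    return cost

def best_pair(demands):
    return min(combinations(sites, 2), key=lambda pair: total_cost(pair, demands))

print(best_pair(customers))  # the optimal sites for today's forecast

# ...but move one demand center and the "correct" answer changes:
shifted = [((2, 2), 100), ((8, 2), 100), ((5, 1), 50)]
print(best_pair(shifted))
```

With the original forecast the optimizer picks sites A and D; after one demand center moves, it picks A and B instead, which is exactly the point: the answer is only as durable as the forecast behind it.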

I’m reminded of that lesson in this post from Aaron Carroll (at the Incidental Economist blog), responding to Mark Cuban’s advice that everyone get their blood tested quarterly. Data, says Carroll, is not the problem. If you think that more data is always better, you will likely miss the forest for the trees. Or as the post says:

“Ordering a lab test is like picking your nose in public. If you find something, you better know what you’re going to do with it.”


Monday, September 01, 2014

Notes on Seth Roberts Memorial Symposium

I had a long-planned family reunion in Maine the week of the Ancestral Health Symposium in Berkeley this month, so it was just not possible for me to attend the Seth Roberts Memorial Symposium, but thanks to Tess McEnulty there are videos of the public talks. The talks are well worth watching, but if you don’t have the time, here is my brief summary of the highlights:

Nassim Taleb explained how Seth’s philosophy (n=1) is the exact opposite of what you see in today’s fascination with Big Data. No matter how many data points you accumulate, a theory can be disproven with a single counterexample; and sometimes you can build a true theory based on a single example. (“OJ Simpson only killed once; does that mean you can’t prove he’s a murderer?”) In fact, the more variables you add to a model, the more likely you are to find spurious correlations, as he showed in his slide “Nassim Taleb: Tragedy of Big Data.”
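Taleb’s point is easy to reproduce for yourself. Here’s a toy simulation of my own (not from his slide): generate purely random, unrelated variables and report the strongest pairwise correlation you can find among them. As the number of variables grows, impressive-looking “relationships” appear by chance alone.

```python
import random

# Toy illustration: the largest pairwise correlation found among
# purely random variables grows as you add more variables.
def max_abs_correlation(n_vars, n_obs, rng):
    data = [[rng.gauss(0, 1) for _ in range(n_obs)] for _ in range(n_vars)]

    def corr(x, y):
        mx, my = sum(x) / n_obs, sum(y) / n_obs
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / (sxx * syy) ** 0.5

    return max(abs(corr(data[i], data[j]))
               for i in range(n_vars) for j in range(i + 1, n_vars))

rng = random.Random(0)
for n_vars in (5, 20, 80):
    # more variables -> stronger spurious "relationships"
    print(n_vars, round(max_abs_correlation(n_vars, 30, rng), 2))
```

None of these variables has anything to do with any other; the rising correlations are pure noise, which is Taleb’s tragedy of Big Data.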

Tim Ferriss (The 4-Hour Workweek) credited much of his book The 4-Hour Body to ideas he got from Seth, who taught him five things:
  • Extremes inform the means. New products and ideas rarely come from “normal” use cases. If you want to find something interesting, search for odd examples.
  • Choose fast results over big data. Look for quick-and-dirty experiments, not big-huge-complicated ones.
  • Track yourself regularly: don’t try to judge a soccer match from a single ultra-high-res photo; it’s much better to have multiple low-res photos, so track what you can however you can. Seth tracked most stuff with pencil and paper.
  • Remember the “Minimum Effective Dose”: for example, he gets fantastic sleep by taking raw honey and a single tablespoon of apple cider vinegar before bed. No more.
  • Care about normal people: Seth didn’t care who you are or your background. You can learn something from anybody.
Gary Taubes said Seth was one of the few people he could really talk with (they hiked regularly in Berkeley). His talk started with an overview of the 19th-century physician Claude Bernard, whose 1865 book An Introduction to the Study of Experimental Medicine “should be required reading for every med student.” Key quote: “All human knowledge is limited to working back from observed effects to their cause.” The rest of the talk was a summary of the limitations of various approaches to scientific research (observational studies, randomized controlled trials, etc.). There are no really good solutions, other than the open-minded and humble approach of people like Seth.

There were several other speakers, like best-selling “fratire” author Tucker Max (who met Seth by randomly emailing him), Paleo author John Durant (who appreciates Seth’s example that you don’t need fancy equipment to do science), experimental psychologist Aaron Blaisdell (who founded the health crowdsourcing site Healthcrowd.com thanks to collaborations with Seth), and many others.

So many great memories of Seth’s ideas, by people who knew him well. I wish I could have attended in person.

Monday, August 11, 2014

College degree expiration dates

My daughter is thinking about the essays on the Common Application, the long standardized form that most colleges now require as part of their admissions process. These essays, combined with grades and test scores, are supposed to help the colleges decide who is a good fit. But how do they know who “fits”? I guess they assume that, once you graduate, you’ve proven that you’re one of them, and now for the rest of your life, no matter what you do, you still have that degree from that institution. But does that make any sense?

I know a guy who graduated from MIT in 1978 with a degree in electrical engineering. Would you hire him as an engineer today just based on that piece of paper? Of course not; you’d need to know a lot more about what he’s done since then. How about somebody who majored in English literature — would you assume they (still) understand good writing, ten or twenty years after they have the degree? Or history: what if somebody majored in it ten years ago but hasn’t read a single book since then? Do you think they should still be allowed to say “I have a degree in history from <such-and-such-school>?"

Physicians have to renew their licenses every two years. For lawyers, it’s every year. Even priests need to renew every year.

What if colleges required you to renew your diploma every so often — say, every five years? What if you had to submit another essay, to prove that you’re still worthy of that degree?

I bet a LOT of people would simply drop their degree. Once you have your job, or are married, or otherwise stable in life, you don’t need that degree anymore. Most people don’t donate to their alma maters, presumably because by now they feel it’s irrelevant.

But then, why did you go to that school? For that matter, what was the point of the whole exercise — including that admissions essay? 

How about you? Would you bother to renew your college degree?

Saturday, August 09, 2014

Do you believe in evolution?

Keith Blanchard makes an excellent point. Writing in The Week under the title "Why you should stop believing in evolution”, he says:
So if someone asks, "Do you believe in evolution," they are framing it wrong. That's like asking, "Do you believe in blue?"
Evolution is nothing more than a fairly simple way of understanding what is unquestionably happening. You don't believe in it — you either understand it or you don't. But pretending evolution is a matter of faith can be a clever way to hijack the conversation, and pit it in a false duality against religion.
I have found that most non-science majors I know — even those who are otherwise well-educated — can’t describe evolution in a concise enough way to convince me that they really understand it. When pressed, it becomes clear that what they really believe in is “science”, or “what my teachers taught me”, or “what other college-educated people believe”.

The same is true of many other topics where it’s tempting to ridicule those who don’t believe like you do:
  • Do you believe in the danger of GMO (or nuclear energy or the Keystone Pipeline)?
  • Do you believe in global warming?
  • Do you believe vaccines cause autism?
  • Do you believe in God?
When you don’t understand something, you can be easily fooled by somebody who does, which is why it’s dangerous to dismiss unbelievers as ignorant—often you’ll find they are more informed than you are, precisely because they’ve had to dig deeper into the issue in order to withstand criticism of an unpopular position.

To me that explains facts like why those who identify with the Tea Party are more likely to visit science museums, or why climate science literacy has no correlation with political identity.

What about you? Do you understand climate change?

Wednesday, April 30, 2014

[book] Proust and the Squid

I can’t remember who recommended this book to me, but as you’ll see by the end of this post, I feel compelled to write down what I learned from my reading of it.

The author, Maryanne Wolf, a child development professor at Tufts in the Center for Reading and Language Research, gives a detailed explanation of what happens — neurologically, socially — when we read, arguing that literacy is a cultural and learned trait that we should treat differently from the rest of our language instincts for communicating and thinking. If it took humans two thousand years to develop written language, how can we expect children to adapt their minds properly after only two thousand days (roughly the seven years or so it takes to become functionally literate)?

I came away with two thoughts: first, that reading well almost always means writing well; the best way to learn is to do. The author references this quote from Socrates (in Plato’s Phaedrus 275a), spoken to the inventor of writing:

You have invented an elixir not of memory, but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom, for they will read many things without instruction and will therefore seem to know many things, when they are for the most part ignorant and hard to get along with, since they are not wise, but only appear wise.

Second, I’m thinking of the wisdom of the US Constitution, which admits that the text alone is not sufficient — a living, human trained judiciary is necessary to truly understand the meaning. Or the Catholic religious tradition, which says that we need both the text and the trained priesthood to understand the full meaning of the Bible. I’m back on my rant against those who claim to be knowledgeable just because they read The New York Times or whatever.

Understanding requires much more than simply knowing. I’m going to spend more time doing and less time reading.


Tuesday, April 29, 2014

Seth liked people who think for themselves

Some people have an influence that is difficult to see by the usual standards of pageviews, book sales, or citations. Seth Roberts, I think, was one of those people whose real influence was felt one-on-one. I’m reminded of the movie where Forrest Gump changed everything from Elvis Presley’s music to the Vietnam War, through a simple comment or gesture that seemed unimportant at the time, but turned out to be very profound.

Many of those whose lives he touched have stepped forward with eulogies that remind me how special he was. Here’s one from John Durant (a Steven Pinker-recommended author I respect):

Seth will be remembered for his work in self-experimentation. More than just his quirky findings – faces on TV, honey at bed, flaxseed oil – Seth taught that anyone can be a scientist.

Beyond this, I think Seth believed something else: that anyone can think for themselves. As someone who knew firsthand how modern science is conducted, he was not impressed just because a study showed this, or an expert said that. He was baffled about why people would make life-changing decisions based solely on what they read in The New York Times, or because they heard it from somebody with a sophisticated credential. Try it yourself! he would say. Check to see if it works on you!

Here’s another one from Seth’s friend Tucker Max (another best-selling author he mentioned frequently in our conversations):

Seth had intellectual courage as well. He examined ideas in themselves, not who they came from, and he defended and stood up for the things that were right, regardless of what they cost him. [emphasis mine]

Part of the reason I think Seth was so approachable was that he knew that good ideas could come from anywhere, that often the best ones come from ordinary conversations. Everyone who knew him personally appreciated those simple conversations the most, and we’ll never fully measure the outsize influence that resulted.

Friday, November 29, 2013

The argument for regulating information

When Nobel Peace Prize winner Liu Xiaobo was imprisoned by Chinese authorities, his captors did not fear that he alone, a single individual without guns, an army, or even military training, could threaten the Chinese government, which has plenty of those items to spare. And few in the government would argue that Liu himself, or his ideas, are irresponsible. He’s a well-educated, perfectly sensible individual, well within his rights to think the thoughts he was thinking. No harm if he had simply stopped there.

When you or I, dear reader — the educated elite users of a genetic testing service like 23andMe — use the information about our genes, few at the FDA will argue that there is a danger. After all, we’re the early adopters, the people smart enough to seek this information in the first place. The trouble is not you and me, it’s them, don’t you know, the unwashed masses out there who may become — how shall we say this delicately? — overexcited, causing themselves potentially tragic — and avoidable — harm.

History shows that the ideas Liu outlined in Charter 08 might actually help China. Reasonable people, those bearing the full responsibility for the stability and long-term future of the country, have no fear of the ideas themselves. Once the country has matured a bit more, once the people are ready for this information, then yes, it may become appropriate to discuss the issues publicly. But right now, here in the real world, leaders with actual accountability for China’s long-term stability know that to throw Liu’s ideas out there, willy-nilly, without the proper preparation…well, think of what could happen if those ideas landed in the hands of the irresponsible masses, who might be tempted to take action without understanding, as we do, the full consequences.

You see, an expert, whether at the FDA or in the Chinese Communist Party, has been carefully vetted, with years and years of education that brings a better sensitivity to the long-term benefits, as well as the potential downsides, that come with access to powerful ideas.

The government has been very patient with Liu Xiaobo, offering years of warnings and giving him plenty of time to realize the potentially destabilizing consequences of his behavior. The FDA was similarly patient with 23andMe, spelling out over dozens of meetings and countless emails precisely what the experts fear — know — can happen when important information gets into the wrong hands.

Liu Xiaobo has no gun, but many of his potential readers do. 23andMe doesn’t perform mastectomies or administer drugs, but many of their potential readers may not be so limited.

You and I may be able to handle a world without sensible regulation of ideas and information. But do you really think that others can?


Sunday, December 11, 2011

Worse than a robot

I’m back in the US for a few days, a good time to order items that I can’t get in China, and that means borrowing the address of a friend to accept delivery. Since I come often, I try really hard not to wear out my welcome with these friends by making the pickup process as simple as possible.

This time I wanted a new iPhone 4S, which can have unpredictable arrival times, so I ordered it several weeks in advance. My friend happened to be out when the truck came, and I got an email notice offering to hold it instead at the main FedEx shipping center. Perfect! I thought: I'll pick it up right after my plane arrives, early in the morning, with no need to intrude on my friend.

Seemed like a perfect plan until the lady at the FedEx counter asked for my ID. Of course I have my passport/drivers license, but the package was delivered to my friend’s name, and as the lady explained to me: “FedEx policy requires that the name on the package match the ID of the person picking it up”.

Well, what could I do? I'm obviously who I say I am -- she sees my ID -- and I have the correct tracking number. It's clearly my package. But policy is policy, according to the counter lady. My only hope is to get my friend to call FedEx and change the name on the delivery. I explain that it's early in the morning, my friend did me a favor by accepting delivery in the first place, and I don't want to impose. Sorry, she says. "It's policy".

“Wait,” I say. “What’s to stop me from calling FedEx myself?”  I know the tracking number, the address of the original delivery, and I have an email that FedEx sent to me. Rather than ask my friend to call, why don’t I call myself, pretend to be the friend, and the problem is solved, right?

The counter lady hesitates. I have a good point, she admits, but now that she's on to me, she still won't let me have the package, because she'll know it was me pretending to be my friend. The only thing I can do, she insists, is call my actual friend and get her to dial FedEx herself.

But that’s a hassle for my friend, who will have to drop what she’s doing to make a phone call, look up the tracking number, sit on hold.  I really don’t want to impose.  Too bad, says the counter lady. “Policy is policy”.

I called Apple. The person on the line was very friendly and accommodating, but Apple’s IT systems and FedEx IT systems are separate, so it could take as long as 24 hours before word of the different name on the address trickles into the FedEx office. The Apple person offers to speak directly to my FedEx lady, who replies “Nope: it’s policy”.

Finally, after too much time wasted already, I excused myself and went outside. I called FedEx and said I wanted to change the name on the delivery. No problem, they said. A few minutes later I went back to the counter lady; she looked up the entry, and sure enough, it was now okay to release the package to "Richard Sprague".

Ugh.  What a waste.

This counter lady adds no value. By sticking so firmly to the rules, she was making herself into an automaton, the perfect job for a robot. Unlike a machine, though, she can't work twenty-four hours a day, and she needs to be paid. So she's actually worse than a robot!

If, on the other hand, she had used a little common sense -- the kind that is far more complicated to program into a robot -- she could have realized that my story makes complete sense. I am showing her a real ID, and I'm happy to give additional real contact information in case -- against all logic -- I am a criminal who somehow stole this tracking number, faked the email I showed her, and is now going through all the trouble of coming to the FedEx office -- in person -- to pick up a brown box with no indication of what's even inside.

The US unemployment rate is too high, and there are a lot of proposals for how to "put America back to work". But the unfortunate fact is that too many Americans are like this FedEx counter lady: doing work that is fundamentally replaceable by automation and robots. I don't know what this particular woman will be doing in five or ten years, but I know that if FedEx wants to continue controlling costs, they'll need to look carefully at how much value she adds, and inevitably they will conclude that a robot is better for this work than she is.

It’s sad, because she, like all humans, has some skills that are extremely hard to replace with machines. But first she’ll need to start acting like a human, and not like a robot.

IMG 3843


Sunday, May 29, 2011

Hard to know for sure

I keep running into this problem of the limitations of knowledge. Today, a professor friend of mine reminded me how the one thing you take away from earning a PhD is how little you know of your field of study. We were discussing the old saying that after your first week in China you’ll feel like you could write a book about the place; after a year you’ll think you could write a magazine article; after a few years you give up writing anything.

The Useless Tree blog discusses (via China Law Blog) a recent interview with Chinese official Wang Qishan claiming that China is only understandable to insiders like himself, but my first thought is “what does it mean to understand in the first place?” Does anybody really know?

Then there’s Peter Norvig’s excellent review of a recent remark by Noam Chomsky dismissing the use of statistical techniques in linguistics. Chomsky apparently thinks real scientific understanding requires more than a statistical analysis of a bunch of data—you have to synthesize that knowledge, presumably into simpler, fundamental rules that describe the universe. That’s super-hard, and outside of physics it almost always turns out to be an approximation anyway.

This is just restating the problem identified by Hayek in The Use of Knowledge in Society and by countless others who reflect on the limitations of what we know.

Society gives too much credit to people who appear to know, but I think self-confidence is no substitute for understanding.

Wednesday, June 17, 2009

Politics as entertainment

I know very little about Iran, other than the headlines I read in the mainstream US press. I bet you don't know much more than I do, though like me you are probably cheering for the "underdog" Mousavi to beat that awful Ahmadinejad. But if I reflect honestly, I have to admit that I don't really know what's best for Iran, or even the long-term interests of the United States. I cheer for one side because it's "my" side, and because a lot of people I like are on this side.

This is no different than why people cheer for a particular sports team. A Mariners fan doesn't care about baseball in any purely detached or objective sense. He wants his side to win because it's his side. Although you could imagine an objective standard of "Truth" about which team is ultimately the best, even the most well-informed sports nut -- the guy who can recite statistics all day -- is going to cheer for his team, not because it's the "best" but because it's "his". It's not about truth, it's about entertainment.

Most well-informed political junkies are the same: to them, politics is a form of entertainment. It's not about being "right", it's about cheering for and supporting one side, sometimes for no reason other than to oppose the competing side. Sure, they can recite facts and statistics -- they enjoy it! -- but press them on why, or about the truth of the matter, and it comes back to "because my side says so".

I think entertainment gets in the way of truth. Few of us have the time to dig into each policy decision in the kind of detail necessary to come to a real opinion, so it's nice to delegate our thinking to a political party. But combine that with the natural tendency of Type A people to dominate conversations, and we get cacophony. No real understanding, just a bunch of loudmouths who classify everything as either Republican or Democrat.

I'm trying to think of a good answer for the next time somebody asks me my political party. I find that the question itself is like asking my favorite baseball team: it's not about any serious discussion of Truth or the issues, it's about figuring out which (of presumably only two -- why is that?) team I'm cheering for.

Libertarians aren't really a political party as much as a mindset. Ask what matters to me in the current situation with Iran and, without knowing any facts, I'll give you as reasonable an answer as is possible without a lot of research. But, like domestic political issues, why do I have to cheer for one side or another?

Sunday, May 24, 2009

Digital socialism = capitalism.

In the June 2009 issue of Wired, Kevin Kelly uses the loaded term "socialism" to describe how the fragmentation of everything is giving us access to more choice than ever before. He admits to using a word with much cultural baggage, and it's not clear whether coining the phrase "digital socialism" will co-opt and redefine the popular meaning of the word, or whether careless readers will simply think he's giving a nod of approval to a dangerous meme that is, of necessity, associated with coercion and top-down control.

Free-software movement founder and guru Richard Stallman reminds us of the two meanings of "free". The first ("as in beer") refers to not having to pay for something. But the second, which embodies the real power of the open internet, means liberty: the freedom to choose what you like, with no coercion from a government or supplier. Traditional socialism offers a world that is free in the first sense ("free healthcare") but not in the second ("freedom to choose a non-government-appointed supplier"). Capitalism, in the original Adam Smith sense, emphasizes the second (liberty) aspect of free. The Invisible Hand won't work unless you can freely choose your suppliers and customers.

Kevin Kelly's insight is that we are moving to a world where we can have both meanings of "free". The best things in what the Wired issue calls the "New New Economy" are free in both senses (e.g. Wikipedia, which lets you both access and modify freely, without payment). He calls it "digital socialism", but he wants us to think of it as a third way that renders irrelevant the old debates about traditional capitalism and socialism.

But it isn't a third way.  What Kevin Kelly calls "digital socialism" is just plain old decentralized, Hayekian capitalism: zillions of independent, free actors whose individual self-motivated choices add up to something bigger than any of us. The real baggage of socialism, and the reason I think Kevin Kelly's term doesn't work, is that it relies on coercion by a third party (government) to make it work.  Capitalism and liberty are always tightly associated because neither can work without the other.  Socialism's "free as in beer" is always associated with a strong (government) Leviathan who can (by force if necessary) redistribute from one person to another.

In the digital world, where redistribution has no cost, Kevin Kelly thinks we can remove the coercive aspect of socialism.  Nobody forces you to contribute your Linux bug fixes or Youtube videos to the collective -- you give freely and selflessly, something he believes you would never do in a world motivated only by money.   But capitalism isn't about money, as  Adam Smith himself notes: "The real price of every thing, what every thing really costs to the man who wants to acquire it, is the toil and trouble of acquiring it."  If Kevin Kelly's digital socialism simply means we no longer need government to reduce the cost of acquiring things, then why not use the term that already describes a system that does exactly that:  capitalism.

Update: Lawrence Lessig agrees, disputing Kevin Kelly’s word choice. Socialism requires coercion.

Monday, April 13, 2009

The many faces of me

I’m a blogging and social-media hobbyist. I do this for fun, not for a living. But as more people come online, both friends and work-related colleagues, and as more people shift to online as their main source of information, I find myself worrying a lot more about my “brand” and how I appear online. It’s enough to give me writer’s block. Here’s why:

Every one of us who uses online social networking can point to interesting and fulfilling experiences that wouldn't have happened otherwise, and we become evangelists for the cause. I think it's because any interesting person lives a complex, multi-dimensional life with at least the following faces:

  1. Locational [people who happen to be physically near you]: neighbors, the mailman, your barista.
  2. Professional [how you make your living]: work colleagues, your boss, customers, others in your industry.
  3. Situational [associated with a particular time or event in your life]: your college years, grad school, that summer you spent in a timeshare in the Hamptons.
  4. Associational [organizations you belong to]: church or synagogue, service organizations like the Rotary Club, the PTA.
  5. Beliefs [people who share opinions or beliefs about something]: religion, politics, superstitions.
  6. Familial [immediate family and relatives]: brothers & sisters, uncles, ex-wife.

In the real world, these faces collide only occasionally, and when they do it can be an experience that ranges anywhere from wonderful to embarrassing.

In some cultures, these dimensions are marked with strict rules about clothing, use of eye contact, and even language. In Japanese for example, you literally change your vocabulary to suit the dimension you are in at a given moment, and it can be awkward -- even offensive -- to use certain styles of speech in an inappropriate situation. Even in English, we switch "speech registers" all the time: think about the words you use around your poker buddies versus the way you talk in a job interview.

Which of these is the real you?

Although we generally socialize with a given person based on only one of these dimensions, sometimes seeing little snippets of another dimension can make that person seem more alive and interesting. But go too far, expose too much of another dimension, and you've violated some rule that is awkward for you and your listeners.

Politics and religion are the easiest examples: what happens if your professional relationships find that you have a different (perhaps unpopular) opinion than they do? You are, hopefully, proud of your political or religious opinions, but express them too loudly among people who disagree (or who don’t understand) and they become an unnecessary distraction.  Instead of being “that really smart guy who works hard and knows a lot about X”, you’ll be “that <member of minority group> who somehow knows about X”.  Even outside the obvious politics/religion examples, this is true of lots of topics, like health or even favorite movies and books.  Will it really help your “brand” if people know you like <such-and-such sappy musical group>?

A few weeks ago I heard a talk by Dalton Conley, a sociologist at NYU and author of Elsewhere, U.S.A., who thinks the future might require us to meld these worlds into one. You will become a "hall of mirrors", with so many of your dimensions exposed that people simply won't be able to tell which of you is "real" except in context -- and we'll all just take it for granted that people are multi-dimensional.

Many of the “leaders” I know in marketing or management think they’ve solved this problem by ignoring their personal brand identity.  “I’m too important to be online” or “I make my team be online, so I don’t have to”.  But I think that’s dinosaur thinking.  You can’t understand the digital world if you don’t live in it yourself.  There are many faces of me, but I’m not going to figure this out unless I jump in.

Saturday, February 07, 2009

Where are the Spragues?

I often run into people who, upon hearing my last name, ask if I’m related to so-and-so other Sprague they know.  Usually the answer is no.  Our family has been in the country since Pilgrim times, so it’s not a terribly rare name, and now there’s a new web site, Dynastree, that shows name frequencies graphically and statistically. I typed in “Sprague” and found this:

  • There are about 30,000 of us spread across the U.S. 
  • We’re the 1210th most common name in the U.S.
Distribution of the surname Sprague

Unfortunately the map is deceptive, since it appears not to correct for the population of each state.  Since California and New York are the largest states, just about any name is likely to show red in those places.  Here’s what I got when I sorted to find the states where the name “Sprague” ranks highest.  “Common-ness” tells you how common the name is: for example, in Maine we are the 175th most common name, even though there are only 507 of us there.

State          Common-ness   People
Maine              175          507
Vermont            248          157
Rhode Island       416          115
New Hampshire      438          195
Oregon             600          245
Michigan           602          369
Nevada             714           74
Washington         752          331
New York           783          833

The “common-ness” metric still isn’t perfect (I’d rather get a number like frequency per thousand), but it’s much closer to my experience, with many Spragues in the Northeast, and a surprising number in Oregon and Washington.  Here on Mercer Island, there’s only one family of us in the phone book, which feels about right.
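To make that concrete, here’s a minimal sketch of the per-capita normalization I’d prefer. Only the surname counts come from the table above; the state population figures are rough 2008 estimates I’ve plugged in for illustration:

```python
# Spragues per 100,000 residents, using the Dynastree counts above.
sprague_counts = {
    "Maine": 507,
    "Vermont": 157,
    "New York": 833,
}

# Approximate 2008 state populations (my estimates, not Dynastree data).
state_population = {
    "Maine": 1_316_000,
    "Vermont": 621_000,
    "New York": 19_467_000,
}

def per_100k(state: str) -> float:
    """Frequency of the surname per 100,000 residents of the state."""
    return 100_000 * sprague_counts[state] / state_population[state]

for state in sprague_counts:
    print(f"{state}: {per_100k(state):.1f} per 100k")
```

By this measure Maine (roughly 38 per 100k) dwarfs New York (roughly 4 per 100k), even though New York has more Spragues in absolute terms.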

I’m not sure I want a name that is super-common.  On the other hand, it would be nice not to have to remind people that we’re pronounced “SPRAYG” (rhymes with vague) and not “SPRAHHHG” (like the linguistically unrelated city in the Czech Republic).

Friday, January 23, 2009

Worst economy since the Great Depression

From the Jan 13th issue of Time Magazine:

The slump is the longest, if not the deepest, since the Great Depression. Traumatized by layoffs that have cost more than 1.2 million jobs during the slump, U.S. consumers have fallen into their deepest funk in years. "Never in my adult life have I heard more deep-seated feelings of concern," says Howard Allen, retired chairman of Southern California Edison. "Many, many business leaders share this lack of confidence and recognize that we are in real economic trouble." Says University of Michigan economist Paul McCracken: "This is more than just a recession in the conventional sense. What has happened has put the fear of God into people."

(oops, forgot to mention the year: this article is from 1992, during what in retrospect turned out not to be much of a recession at all.) [via Marginal Revolution]

If you’re one of those who thinks President Obama is inheriting “the worst economy since the Great Depression”, please check out two years: 

1982 [via David Leonhardt in NYTimes]

The first big blow to the economy was the 1979 revolution in Iran, which sent oil prices skyrocketing. The bigger blow was a series of sharp interest-rate increases by the Federal Reserve, meant to snap inflation. Home sales plummeted. At their worst, they were 30 percent lower than they are even now (again, adjusted for population size). The industrial Midwest was hardest hit, and the term “Rust Belt” became ubiquitous. Many families fled south and west, helping to create the modern Sun Belt. Nationwide, the unemployment rate rose above 10 percent in 1982, compared with 7.2 percent last month.

and, of course, 1973.

I wouldn’t trade today’s situation for either of those two years, and not just because today’s economy is by comparison so much better. It’s impossible to know the future, and maybe things will get a lot worse. But meanwhile it’s important not to over-react to over-dramatic headlines.

Saturday, January 10, 2009

Steven Pinker on Personal Genomics

The Jan 11th edition of the New York Times Magazine has a cover story by my favorite thinker describing his experiences with genome testing, and he finds the same lack of satisfaction that I have.  He coins the term “Geno’s Paradox”, to describe how with genomics it seems that the more you know the less you know.  My experience with the 23andme test is that yes, I’m glad I tested myself, but what did I really learn?  Like Pinker, I find myself using my knowledge of myself to make sense of the test results, rather than the other way around.

Some interesting takeaways from the essay:

  • He didn’t have the guts to test himself for Alzheimer’s.  [unlike me]
  • Although he has the bitterness receptor [unlike me], he still enjoys broccoli and beer.  So what good does the gene do?
  • He’s a libertarian! [like me]
  • The company Counsyl specializes in pre-natal genetic testing, a good idea before you have kids.

He doesn’t use the Taleb term “narrative fallacy”, but that’s what he means when describing the need that people have to explain why they turned out the way they did—even if it’s untrue.

Some good quotes:

The most prominent finding of behavioral genetics has been summarized by the psychologist Eric Turkheimer: “The nature-nurture debate is over. . . . All human behavioral traits are heritable.” By this he meant that a substantial fraction of the variation among individuals within a culture can be linked to variation in their genes. Whether you measure intelligence or personality, religiosity or political orientation, television watching or cigarette smoking, the outcome is the same. Identical twins (who share all their genes) are more similar than fraternal twins (who share half their genes that vary among people). Biological siblings (who share half those genes too) are more similar than adopted siblings (who share no more genes than do strangers). And identical twins separated at birth and raised in different adoptive homes (who share their genes but not their environments) are uncannily similar.

[this is obvious to anyone who has children]
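The twin comparison in that quote is the basis of the classic Falconer estimate of heritability, which is easy to sketch. The correlation values below are made-up illustrative numbers, not figures from the article:

```python
def falconer_h2(r_mz: float, r_dz: float) -> float:
    """Falconer's estimate: heritability is roughly twice the difference
    between identical-twin (MZ) and fraternal-twin (DZ) trait correlations,
    since MZ twins share all their segregating genes and DZ twins half."""
    return 2 * (r_mz - r_dz)

# Hypothetical correlations for some trait: identical twins 0.75, fraternal 0.45
h2 = falconer_h2(0.75, 0.45)
print(f"estimated heritability: {h2:.2f}")
```

This is the crude back-of-the-envelope version; real behavioral-genetics studies fit more careful models, but the doubling logic is the same.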

Although Pinker is clearly doubtful about the short-term promise of genetic testing, I actually think the situation is even more complicated.  What if much of our “environment” is determined by all those genes from the bacteria inside our bodies, some of which are inherited, and some of which just arrive through whatever accidents life presents us?  In that case we’d have something with a genetic component (bacterial genes) combined with an environmental one (how we picked up the bug).  How in the world would you ever be able to analyze that amount of complexity?   But as he says:

Personal genomics is here to stay... People who have grown up with the democratization of information will not tolerate paternalistic regulations that keep them from their own genomes…There are risks of misunderstandings, but there are also risks in much of the flimflam we tolerate in alternative medicine, and in the hunches and folklore that many doctors prefer to evidence-based medicine. And besides, personal genomics is just too much fun.

Saturday, January 03, 2009

My blog graphically

from:

http://www.aharef.info/static/htmlgraph/?url=http%3A%2F%2Fblog.richardsprague.com

 

Graphical view of my blog

What do the colors mean?

  • blue: for links (the A tag)
  • red: for tables (TABLE, TR and TD tags)
  • green: for the DIV tag
  • violet: for images (the IMG tag)
  • yellow: for forms (FORM, INPUT, TEXTAREA, SELECT and OPTION tags)
  • orange: for linebreaks and blockquotes (BR, P, and BLOCKQUOTE tags)
  • black: the HTML tag, the root node
  • gray: all other tags
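The applet builds its graph from the page’s DOM tree, coloring each node by tag category as in the legend above. Here’s a minimal sketch of the same categorization using Python’s standard html.parser; the mapping is my own transcription of the legend, not the applet’s actual code:

```python
from collections import Counter
from html.parser import HTMLParser

# Tag-to-color mapping, transcribed from the legend above.
COLOR = {
    "a": "blue",
    "table": "red", "tr": "red", "td": "red",
    "div": "green",
    "img": "violet",
    "form": "yellow", "input": "yellow", "textarea": "yellow",
    "select": "yellow", "option": "yellow",
    "br": "orange", "p": "orange", "blockquote": "orange",
    "html": "black",
}

class TagColorCounter(HTMLParser):
    """Tally how many nodes of each color a page would produce."""
    def __init__(self):
        super().__init__()
        self.colors = Counter()

    def handle_starttag(self, tag, attrs):
        self.colors[COLOR.get(tag, "gray")] += 1

parser = TagColorCounter()
parser.feed("<html><body><div><p>Hi <a href='#'>link</a></p>"
            "<img src='x.png'></div></body></html>")
print(parser.colors)
```

Feed it a whole page and the counts give a rough fingerprint of the page’s structure, which is essentially what the colored graph visualizes.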

Wednesday, December 31, 2008

Anniversary Cards are for men

IMG_7951

How come there are more anniversary cards addressed “to wife” than “to husband”?  At the Hallmark store yesterday I counted 25 cards designed for men shopping for their wives, versus only 14 for the wives to choose among for their husbands.  Here I always thought women do most of the greeting card buying, so why don’t they have a larger number of cards to choose from?

The reading habits of George W. Bush

I’m skeptical whenever mainstream consensus settles on an opinion about somebody or something, especially when it’s a subject that lends itself to very little first-hand, direct, up-close experience.  For example, it’s pretty well-accepted, both by the general media and by people I hang out with, that our outgoing president is not the brightest bulb who ever inhabited the White House. But I’ve always been skeptical of such a quick-and-easy conclusion.  You don’t get to be President by being a dummy.  There’s enough competition out there to weed out the dim bulbs pretty quickly.

That’s why I was not as surprised as I bet you were to see that he reads about one serious book per week – far, far more than the general public, and probably more than you do.  In fact, about 40% of Americans didn’t read a single book last year.

According to a WSJ article by Bush friend Karl Rove, the President read the following books in 2008:

David Halberstam's "The Coldest Winter," Rick Atkinson's "Day of Battle," Hugh Thomas's "Spanish Civil War," Stephen W. Sears's "Gettysburg" and David King's "Vienna 1814." There's also plenty of biography -- including U.S. Grant's "Personal Memoirs"; Jon Meacham's "American Lion"; James M. McPherson's "Tried by War: Abraham Lincoln as Commander in Chief" and Jacobo Timerman's "Prisoner Without a Name, Cell Without a Number."

He’s not one-sided, either.  Look at this list of recent fiction:

Besides eight Travis McGee novels by John D. MacDonald, Mr. Bush tackled Michael Crichton's "Next," Vince Flynn's "Executive Power," Stephen Hunter's "Point of Impact," and Albert Camus's "The Stranger," among others.

Of course, just because somebody reads a lot more than you does not mean they are smart.  But like almost everything else you think you know about politics, or the economy, or famous people – things you know only indirectly (from what you read) and by word-of-mouth (from people you talk to) – the real story is far more complicated than you can imagine.

p.s. I wonder if this is one of the books he’s read:

Thursday, December 04, 2008

Obama’s slave ancestors

The country is justifiably excited at the thought of a first lady in the White House who is the descendant of a slave.  Michelle Obama’s great-great-grandfather, Jim Robinson, born about 1850, lived as a slave on a rice plantation until the Civil War.

So here’s what I’m thinking.  Michelle has a total of 16 great-great-grandparents, just like you and me.  That’s a lot of ancestors, each with his or her own complicated ancestry.   In fact, thanks to the mathematics of ancestry, it’s pretty likely that at least one of those 16 people is descended from somebody who was on the Mayflower, especially if her family had been in America for a long time.
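The “mathematics of ancestry” here is just doubling per generation, which a quick sketch makes concrete:

```python
def ancestors_at(generation: int) -> int:
    """Number of ancestor slots at a given generation back:
    1 = parents, 2 = grandparents, 3 = great-grandparents, ..."""
    return 2 ** generation

print(ancestors_at(4))   # great-great-grandparents
print(ancestors_at(10))  # ten generations back
```

(In practice pedigrees collapse, since distant cousins marry, so the number of distinct ancestors grows more slowly than the number of slots.)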

It turns out that I too have a great-great-grandparent who was a slave (technically, a serf), working under a harsh landlord in the wilds of Lithuania: forbidden to own property, working long hours for no wages, subject to severe punishment if he tried to escape.  Somehow, thanks to changes in the economy and governments, his son Matthias (my great-grandfather) was able to leave the country and eventually settle in America.

Now, here’s another consequence.  We know that Barack Obama has an African father, so presumably there are no slaves on that side of his family (at least not in America).  But what about his mother (and former Mercer Island resident) Stanley Ann Dunham?  Wouldn’t it be likely that, like me, she too had an ancestor (perhaps a great-great grandmother) who had been born a serf?

A quick scan of Wikipedia indicates that Ann was mainly descended from people of the British Isles (where there was no serfdom in the mid-1800s), so I guess it’s unlikely (though not impossible—remember, all it takes is one who was not from there).  But wouldn’t it be ironic if our new President, like his wife, had a serf/slave great-great grandparent – on his mother’s side?

What an amazingly wonderful country!