What is an Expert?

I’ll tread carefully here, because we live in a strange time of questioning the motives and knowledge of experts to bolster every bizarre conspiracy theory under the sun. No one trusts any information anymore. It’s not even clear whether trusting or doubting expert opinion is anti-intellectual or hyper-intellectual. But that isn’t today’s topic.

I listen to quite a few podcasts, and several of them have made me think about expertise recently.

For example, Gary Taubes was on the Sam Harris podcast and both of them often get tarred with the “you don’t have a Ph.D. in whatever, so you’re an unknowledgeable/dangerous quack” brush. Also, Dan Carlin’s Hardcore History podcast is insanely detailed, but every ten minutes he reminds the audience “I’m not a historian …”

Many people who value the importance of expertise think that the degree (the Ph.D. in particular, but maybe an MFA for arts stuff) is the be-all and end-all of the discussion. If you have the Ph.D., then you’re an expert. If you don’t, then you’re not.

The argument I want to present is that if you believe this, you really should be willing to extend your definition of expertise to a wider group of people who have essentially done the equivalent work of one of these degrees.

Think of it this way. Person A goes to Subpar University, scrapes by with the minimal work, kind of hates it, and then teaches remedial classes at a Community College for a few years. Person B has a burning passion for the subject, studies all of the relevant literature, and continues to write about and develop novel ideas in the subject for decades. I’d be way more willing to trust Person B as an expert than Person A despite the degree differences.

Maybe I’ve already convinced you, and I need not go any further. Many of you are probably thinking, yeah, but there are parts of doing a degree that can’t be mimicked without the schooling. And others might be thinking, yeah, but Person B is merely theoretical; no one like Person B exists in the real world. We’ll address each of these points separately.

I think of a Ph.D. as having three parts. Phase 1 is a demonstration of competence in the basics. This is often called the Qualifying or Preliminary Exam. Many students don’t fully understand the purpose of this phase while going through it. They think they must memorize and compute. They think of it as a test of basic knowledge.

At least in math and the hard sciences, this is not the case. It is almost a test of attitude. Do you know when you’re guessing? Do you know what you don’t know? Are you able to admit this or will you BS your way through something? Is the basic terminology internalized? You can pass Phase 1 with gaps in knowledge. You cannot pass Phase 1 if you don’t know where those gaps are.

Phase 2 is the accumulation of knowledge of the research done in your sub-sub-(sub-sub-sub)-field. This basically amounts to reading thousands of pages, sometimes from textbooks to get a historical view, but mostly from research papers. It also involves talking to lots of people engaged in similar, related, or practically the same problems as your thesis. You hear their opinions and intuitions about what is true and start to develop your own intuitions.

Phase 3 is the original contribution to the literature. In other words, you write the thesis. To get a feel for the difficulty and time commitment of each step: in a five-year Ph.D., ideally Phase 1 is done in around a year, Phase 2 takes two to four years, and Phase 3 takes around a year (there is overlap between phases).

I know a lot of people aren’t going to like what I’m about to say, but the expertise gained from a Ph.D. is almost entirely the familiarization with the current literature. It’s taking the time to read and understand everything being done in the field.

Phase 1 is basically about not wasting people’s time and money. If someone isn’t going to understand what they read in Phase 2 and will make careless mistakes in Phase 3, it’s best to weed them out with Phase 1. But you aren’t gaining any expertise in Phase 1, because it’s all just the basics still.

One of the main reasons people don’t gain Ph.D.-level expertise without actually doing the degree is that being in such a program forces you to compress all that reading into a small time frame (yes, reading for three years is short). It’s going to take someone doing it as a hobby two or three times longer, and even then, they’ll be tempted to just give up without the external motivation of the degree looming over them.

Also, without a motivating thesis problem, you won’t have the narrow focus to make the reading and learning manageable. I know everyone tackles this in different ways, but here’s how it worked for me. I’d take a paper on a related topic and try to adapt its techniques and ideas to my problem. This forced me to really understand what made those techniques work, which often involved learning a bunch of stuff I wouldn’t have if I had just read through to see the results.

Before moving on, I’d like to add that upon completion of a Ph.D. you know pretty much nothing outside of your sub-sub-(sub-sub-sub)-field. It will take many years of continued teaching and researching and reading and publishing and talking to people to get any sense of your actual sub-field.

Are there people who complete the equivalent of the three listed phases without an actual degree?

I’ll start with the more controversial example of Gary Taubes. He got a physics undergrad degree and a master’s in aerospace engineering. He then went into science journalism. He stumbled upon how complicated and shoddy the science of nutrition was and started to research a book.

Five years later, he had read and analyzed pretty much every single nutrition study done. He interviewed six hundred doctors and researchers in the field. If this isn’t Phase 2 of a Ph.D., I don’t know what is. Most students won’t have gone this in-depth to learn the state of the field in an actual Ph.D. program.

Based on all of this, he then wrote a meticulously cited book, Good Calories, Bad Calories. The bibliography is over 60 pages long. If this isn’t Phase 3 of a Ph.D., I don’t know what is. He’s continued to stay abreast of studies and has done at least one of his own in the past ten years. He certainly has more knowledge of the field than any fresh Ph.D.

Now you can disagree with his conclusions all you want. They are quite controversial (but lots of Ph.D. theses have controversial conclusions; this is partially how knowledge advances). Go to any place on the internet with a comments section that has run something about him and you’ll find people who write him off because “he got a physics degree, so he’s not an expert on nutrition.” Are we really supposed to ignore 20 years of work just because it wasn’t done at a university and because, years before that, the person earned a degree in an unrelated field? It’s a very bizarre sentiment.

A less controversial example is Dan Carlin. Listen to any one of his Hardcore History podcasts. He loves history, so he obsessively reads about it. Those podcasts are each an example of completing Phase 3 of the Ph.D. And he clearly knows the literature, as he constantly references hundreds of pieces of research per episode off the top of his head. What is a historian? Supposedly it’s someone who has a Ph.D. in history. But Dan has completed all the same phases; it just wasn’t at a university.

(I say this is less controversial, because I think pretty much everyone considers Dan an expert on the topics he discusses except for himself. It’s a stunning display of humility. Those podcasts are the definition of having expertise on a subject.)

As a concluding remark/warning: there are a lot of cranks out there who try to pass themselves off as experts but really aren’t. It’s not easy for most people to tell, so it’s definitely best to err on the side of the degree that went through the gatekeeper of a university when you’re not sure.

But also remember that Ph.D.’s are human too. There are plenty of people like Person A in the example above. You can’t just believe a book someone wrote because that degree is listed after their name. They might have made honest mistakes. They might be conning you. Or, more likely, they might not have a good grasp of the current state of knowledge in the field they’re writing about.

What is an expert? To me, it is someone who has dedicated themselves with enough seriousness and professionalism to get through the phases listed above. This mostly happens with degree programs, but it also happens a lot in the real world, often because someone moves into a new career.


Ethics of Manners

This has bothered me from my earliest years. When I was little, I never understood the seemingly pointless rules people had to follow called “manners.” I was going to do some research before writing this to make sure I’m not way off base. I also wanted it to be well-researched so that it would be taken seriously. Oh well, I’m more of an impromptu type of person.

I’ve had several experiences in the past couple of weeks that have brought this back into my mind. The first was trying to read Lynne Truss’s Talk to the Hand: The Utter Bloody Rudeness of the World Today, or Six Good Reasons to Stay Home and Bolt the Door. I became infuriated at how she was confusing and oversimplifying ethical issues to make it sound like the problem was with people’s lack of manners. In fact, I admit I never finished it due to this frustration. The second was my whole rant on Sam Harris, who basically claims that upholding manners causes lots of unnecessary suffering. Let’s be rude! These two things got me noticing manners in the world more.

In the beginning there was the word and according to dictionary.com the definitions are:

2a. “the prevailing customs, ways of living, and habits of a people, class, period.”

3. “a person’s outward bearing; way of speaking to and treating others”

I think that in 2a we immediately see a conflict for me. You should always have good reasons to do things. Doing something because it is the way it has always been done is not a good reason. This is why I hate that culturally imposed behavior is so hard to break. Sometimes it is not good. Slavery was socially acceptable at one time. If people hadn’t been “utter bloody rude,” then the practice would still be going on today. OK, so anyone with philosophical training is going to call a red herring on me right about now: I’m examining the definition of manners and I bring up slavery. All I’m trying to say is that according to 2a, manners are a social norm, and in the past social norms have turned out to be unethical. I’ll build the case later that manners are precisely this.

One semi-irrelevant thing to think about is that if manners are a form of ethical conduct, then we have a case of cultural ethical relativism, since every culture has different sets of manners. In fact, some sets of manners are in direct conflict with each other.

Evidence for manners being unethical:
1. They are a form of lying.
2. They are useless and waste people’s time (and hence are ironically rude).
3. They allow people to practice unethical behavior behind an acceptable name.

1. According to definition 3 (I should probably map this out visually for people who are no good at following where all these numbers are, but that would be polite and thus evidence against my case), manners are a person’s “outward bearing.” There is a very explicit implication there that this is not what the person is truly thinking. Let’s get real. Manners teach us to lie with dignity and in a socially acceptable way. You hate someone’s hair. Manners tell us not to go up to that person and say, “I hate your hair.” We’ll come back to this in 3 (not definition 3, but evidence 3).

2. Manners are rude. I may have accidentally constructed a zen koan on this one. I think it is rude to waste people’s time. This is probably the general consensus. Well, picture yourself in this common situation. You are passing someone you know. You have nothing to say to them. Manners say that you should be polite and make at least a little small talk. This accomplishes nothing. In fact, usually there are lies exchanged (see 1), such as, “How are you?” “Fine.” It could be the worst day of your life. You will probably still say, “Fine.” (See the play Wit by Margaret Edson.) A few minutes later, you have exchanged absolutely no useful information, and everyone’s time has been wasted. Hmm…seems rude to have wasted that person’s time. Whatever happened to that bit of manners that says, “If you have nothing useful to say, don’t say anything at all”? Erm…that’s not quite right, but we’ve already established that a little altering of the truth can be polite.

3. This is a bit more serious, so I’ll drop the lighthearted tone. Now I’m talking about respect and human rights. This is where the slavery example comes back. There is a fine line between respecting a culture’s practices and allowing violations of human rights to occur. An example from all over the news recently (I think last week) was that a (the) gay Anglican bishop was not invited to the national conference. There is only one ordained gay Anglican bishop, because it is still technically against policy. The conference was holding debates about whether to allow gay people to be ordained. Don’t you think the only person with first-hand experience should be allowed to state an opinion on the issue? You might not like this example, but it was recent, and it could be any of the hundreds of current examples of inequality being practiced somewhere.

We can always find a “proper manners” or respect argument to hide behind. We say that that is their culture, their belief, their faith, and if someone doesn’t like it then they shouldn’t practice that religion. What people don’t realize is that their manners are not saying what they think. If we allow a group of people to say that one type of person is better or worse than another (at a fundamental level), then this is not confined to the group. This is sending a message to what is now a globalized world that this type of practice is acceptable. The time for politeness is over. We can continue to use our excuse that we are being respectful to someone’s beliefs, but it is unethical to hide behind this cultural norm. Good manners are the cause of a lot of needless suffering and inequality.

Conclusion: the practice of good manners is unethical.

Vote for a Direction

There are many ways I could go for the next couple of weeks. All are fascinating to me, so I’ll let you decide. Vote now!

1. I could lay out in simple terms my favorite Millennium Problem: the Hodge Conjecture. I wrote this up last winter, and it is for all levels. It is conceptual, and basically no details rear their ugly heads. So if you’re interested in what these million-dollar prize problems are like, vote 1.

2. Several art-related things that are well worth analyzing/discussing have come up.
2a. Literature: My Gravity’s Rainbow Challenge is well under way, or I could finally discuss my thoughts on my first Haruki Murakami experience.
2b. Film: I saw my first Pedro Almodovar film. Other directors worth discussing that have come up recently are Harmony Korine, Werner Herzog, and Shinya Tsukamoto, and there’s Shyamalan’s newest, The Happening.
2c. Music: Who has popped up this year as exceptional (Bon Iver, Son Lux, Extra Life, etc.) and who has let me down (Death Cab)? I have a harsh opinion people don’t want to hear.

3. Philosophy: The standard philosophy of mind and language that I’ve been reading, or some ethical debates (more on Sam Harris maybe?).

4. Choose your own adventure: Anything you’ve seen or heard of lately pertaining to math, physics, philosophy, or art that you think I may be able to shed some light on. I have an article entitled “Noncommutative Geometry for Pedestrians” that I’ve been looking for an excuse to read. Also, I have a library system and Netflix, so basically any book or film you bring up I should be able to get my hands on.

A Letter to Sam Harris

If you are unfamiliar with Sam Harris, I highly recommend him. He raises ethical concerns about people who consider themselves rational human beings but do not stand up to unreasonable action and thought. I won’t go into the details, or else this could turn into a post of thousands of words.

Dear Sam,

I am a big fan of yours and align myself pretty completely with your views. I am sure that you get tons of letters every day from people criticizing your ideas, so it is possible you won’t even read this. Following your lead, I feel that I need to point out a flaw in your argument for upholding spirituality (in the form of isolated meditation). Now I just saw you roll your eyes, since this has probably been your most common letter since your talk at the 2007 AAI Conference.

First off, I said I aligned with you, and this is true on the point of spirituality in that I am a practitioner of Zen forms of meditation. My point is that you cannot use the argument you do to uphold this position. My argument comes from the theory of cognitive dissonance, so if you do not subscribe to this, then my point may be moot.

You seem to have two main claims. The first is that through experiences such as isolation and meditation, we can achieve greater awareness of ourselves and our nature. Or, before you jump on that, at the very least it is possible for every human to test for themselves whether or not this is a true statement. And if you find that it isn’t true, we still can’t totally dismiss it, because, as with a great athlete, the training required might be much more than what the individual put in. The second claim is that people who claim to have this higher awareness are, in general, happier, more selfless, and so on, making the pursuit a worthy one.

Please keep reading even if the details of the above are not correct, as I feel my argument lies not in the details but in the general point. Suppose someone goes off to live in isolation for a year. He meditates for 15 hours every day. Nothing is happening. Isn’t it possible that some cognitive dissonance is building up? The person has been told that they will become enlightened if they put in the time and effort. After months of nothing happening, there is a change. The things that people said would happen are happening: the loss of the sense of self, and more. Isn’t it possible that, to resolve the cognitive dissonance of having sat alone in isolation for a year with no results, the brain has decided to make the fiction a reality?

If the above is even the slightest, remotest possibility, then we have a problem. This is quite possibly what goes on in other religions as well. You pray every day of your life, then one day you have a life-altering religious experience and “know the truth.” This sounds very similar to the person who meditates and has a life-altering experience. According to your own viewpoint, if we condone the “non-religious” spirituality, then we necessarily cannot make an arbitrary distinction and condemn the other (especially if the same cognitive mechanism drives both of them).

What about the outcomes? I believe it is also your opinion that outcome is irrelevant to justification. Even if it is, there are plenty of Christians (insert religion of choice) who, due to their very belief in God, do positive things in the world. “It gives meaning to their life.” Can you explain how that is different from the ascetic who gains meaning in their life through meditation?

To end on a more positive note, I still consider myself in alignment with you. But as someone who practices rationality, I feel that I cannot use the reasons you have provided to justify my spirituality. They seem to be too close a variant on the irrational arguments (and possibly the same mechanism) that organized religious people use.

Sincerely,
HilbertThm90

I am going to send this to him, but I would first like to hear whether I am missing something major. If you are unfamiliar with Sam Harris, you will probably have trouble following this post.