Why did psychology’s leading researchers take that first course? Was it the compelling advice of a master? Perhaps a sudden epiphany?
There’s a story behind every good psychologist. A cross-section of psychologists were asked to share their stories and illuminate the heart of this career-making decision.
Part one of this series showcases the paths of psychologists in various disciplines from around the world.
University of California, Irvine
Dad had to raise three kids alone after Mom died, so he was pretty busy. He worked all day, came home for dinner, and often just wanted to spend the evening in his room reading. But I found a way to get him to talk to me – I asked him for help with my math homework. He had been an award-winning math whiz as a kid, and math became the one thing we could talk about. With the benefit of Dad’s “tutoring,” I too would win mathematics awards. Since I seemed to be good at mathematics, my life plan was to become a high school math teacher.
But things changed. I was a math major at the University of California, Los Angeles, but of course I needed some electives. I took introductory psychology from Allen Parducci and got hooked. Nearly every elective course I took thereafter was in the field of psychology, and when it was all said and done I had enough credits for a double major. As luck – or perhaps wisdom – would have it, I chose to continue graduate work in psychology. I heard about a field called “mathematical psychology” and it sounded perfect for me. Stanford University was known to excel in that field, and that’s where I wanted to go for graduate school.
During my third year at Stanford, I developed what was to become a consuming interest in long-term memory. I learned how to use the methods of experimental psychology to investigate human memory. But I also learned an important truth about science more generally. Science is not just a giant bowl of facts to remember, but rather a way of thinking. It’s a process that is based on a fundamental insight, namely, that an idea may seem to be true, but this has nothing to do with whether it actually is true. In order to distinguish true ideas from false ones we must test them. My undergraduate and graduate education prepared me well to test ideas, often by experiments, and I happily applied this knowledge to the study of human memory. Discovering some fundamental facts about the malleability of memory has been pretty exciting.
Years later, when I gave a commencement address to graduating college seniors, I told the graduates about these experiences. I talked about an important gift that the study of psychology gives to people. It is the gift of knowing how to ask the right questions about any claim that someone might try to fob off on you. Ask them: “What is the evidence for that claim?” But don’t stop there. Get more specific: What kind of study was done? What was the dependent variable? Was there a control group? What kinds of statistical tests were used to analyze the data? Has the study been replicated?
We need to ask, “What exactly is the evidence?” because some evidence is so flimsy that it’s not really evidence at all. I thank my professors, the authors of my textbooks, and my fellow students for helping me appreciate this gift, and of course my father for leading me, somewhat fortuitously, in its direction.
Three Strikes You’re In
Jeffrey S. Katz
In looking back on why I became an experimental psychologist, my first thought was, ‘Why not?’ What could be more important than understanding how the underlying mechanisms of the mind might work? Certainly, the problem was not going to be solved any time in the near future. I thought the complexities of how we think and behave and how that might differ from other species had to be as challenging as any problem that faces scientists in biology, chemistry, or physics.
That makes a nice story, but I’m not confident I was that savvy in my thinking 15 years ago. For the academic, there is one clear global career decision: to become an academic, with smaller, critical, perhaps serendipitous points along the way. Here are two events that started me on the path and a third that kept me on it.
The first was discovering the information processing view. My first choice for an undergraduate major was computer science. During my sophomore year, I was enrolled in an experimental psychology class. At the same time, I was losing my passion for learning different data structures, search algorithms, and computer languages. When I learned that psychologists were using flowcharts to describe how memory worked, I saw the obvious connection between what I was learning in computer science and how that could be applied to human information processes.
The second event took place during my junior year, in another class. I had developed into one of those students who asked too many questions. During one class the professor was in the middle of a horrible lecture and I thought, “I could teach this at least as well, if not better.” I was certainly naive and likely a little arrogant at the time, but, as I’ll be the first to admit, I was probably right.
The third event occurred during a dinner conversation right around the time I had finished my master’s thesis. A colleague of my graduate advisor had presented a colloquium earlier in the day, and I was invited to dinner with him that night. I’m sure I played every part of the wide-eyed graduate student that evening. The conversation turned to career decisions and, in particular, my ability to find an academic position. I was working in the field of animal learning and cognition and was fully aware of the scarcity of tenure-track positions in this area. I described these predicted difficulties to my dinner partner, and the advice I got in response was straightforward, and it went something like this: “Everyone who deserves a job gets one.” The debatable veracity of this statement is of secondary importance to its service as a major motivating force in my career path.
Portrait of the Psychologist
Ben R. Slugoski
James Cook University
If my developmental psychology colleagues are right, I began formulating conceptions of human psychological states and processes at about the age of three. Institutional recognition for my efforts came much later of course, with a DPhil in social psychology from Oxford University in 1985. In between, it was largely a matter of learning to put practice into theory.
In fact, it was only toward the end of my third year of undergraduate studies at Simon Fraser University that I committed to psychology as an administrative category. I had been double-majoring in psychology and English, and the latter was winning hands down in telling me anything of value about “human nature” and the “human condition.” During those obsessively reflexive 70s at “Berkeley north,” philosophy and English were king and queen, and from the top floor of ‘academic quadrangle’ – floating amongst the clouds seemingly tethered to Burnaby mountain – we English majors could spit on just about any other department in the university. Psychology, by contrast, buried deep in the bowels of ‘Classroom Complex’ on the campus’s periphery, often bore the brunt of our collective disdain; Lacan and Derrida were at home in English, unknown – or at least unmentioned – in Psychology; and who could argue that Skinner’s Beyond Freedom and Dignity held a candle to Dostoevsky’s Notes from the Underground in laying bare the central existential dilemma?
My decision to undertake Honors in psychology rather than English was driven only in small part by the expectation of having to feed a family down the road. First, I was something of an oddity among my peers in either discipline in finding an almost perverse pleasure in the counterpoint between the rhetorical and positivist approaches to understanding human nature. Or perhaps it was just because I got easily bored, and there was no antidote to a few hours deconstructing Coleridge or Blake like working out the expected mean squares for a tricky experimental design (a rakish sex-life not otherwise being in the cards!).
Far more important for choosing psychology was that I found all my beacons there. Erudite though my English professors were, they were only vessels for conveying the brilliance of the ‘Greats’ and as such were never particularly good models for an aspiring player. What ultimately determined my allegiance to psychology was the brilliance personified in my psychology lecturers, of whom three in particular made an enormous impact: the late Kenneth Burstein, old school rat-runner, unabashed liberal, and the person whom you would least want as a relationship counsellor; Raymond Koopman, statistician and methodologist extraordinaire; and my recently retired, analytically-trained mentor through a Masters degree, James E. Marcia. A more diverse range of characters and backgrounds can hardly be imagined, and I can only trace their classroom influence on me to some uncanny ability to jar my presuppositions and enable me to recognize a ‘good idea’ when one came along. It is probably worthy of note in these days of multimedia, dot point-driven instruction that my beacons were invariably Socratic minimalists for whom the take-home message was quite subsidiary to the intellectual journey (seemingly) constructed in situ. Thus, I recall Burstein leading us from eye-blink conditioning with rabbits to human divorce statistics via a little sociobiology, Koopman had the class reinvent the correlation coefficient, and Marcia … well, Marcia had us ruminating about the conditions and consequences of sleeping with one’s clients. Jarring a basic pedagogic assumption, Marcia also distributed his multiple-choice questions in advance of the exams, which I considered to be a very good idea. They certainly didn’t do that in English!
So my choice was clear: first Honors and then an MA under Marcia investigating (in a supreme act of intellectualization) the ‘Cognitive and social-interactional characteristics of ego identity statuses in college males.’ I subsequently was able to reconnect my psychological studies with my abiding interest in language, and my reading on the plane delivering me to Oxford was John Austin’s seminal How to Do Things with Words. Ordinary language philosophy – Oxford philosophy – was to provide the machinery with which I have since been probing the interaction of linguistic variables with a variety of social cognitive variables and processes such as attribution, cognitive biases, and – in one fit of post-modern nostalgia – even Marcia’s identity statuses.
Johns Hopkins University
When I was a student during the 60s at The Slade (the art school at University College, London), my tutor often referred to my paintings as ‘cerebral,’ not a favorable commentary in an era of late abstract expressionism. A little over a decade later I shifted from my lifelong love of art to a career in physiological psychology. For me the two had much in common; they elicited a fascination with the most compelling questions in nature about the human mind and required a good deal of time spent on matters technical in attempting to arrive at some approximate answers. I was able to balance my cerebral tendency with the practice of methods to formulate work that was part intentional, part unpredictable.
Of course a great teacher was influential in changing my path. Bruce Kapp, then a young assistant professor at University of Vermont, provided my introduction to physiological psychology. Bruce was an inspired and passionate teacher. At heart, his teaching was intimately tied to his deep interest in the work of science. By his example, I could hardly become interested in a topic without wanting to dig right in and study it, and I soon became his first PhD student. So I caught the bug from Bruce and have lived very happily with it since.
Making career decisions in the 60s was not at all what it is today, or so it seems to me. As I decided to go to graduate school and worked on my dissertation research, I gave little thought to work life down the road. Graduate training back then also gave little attention to professional development. Sometimes I wonder, as my own students cast a far-reaching eye on their careers, how a more realistic understanding of being a mature scientist would have affected my decision on a career, or tempered my experience as a wide-eyed child in those early days of training.