Knowledge

<The following is a portion of LarrysText, wikification is encouraged>

Most of you probably think you know some things. You know your name -- let’s start with simple things. You know that you are a student; you know that you are over ten years old; you know that the city we’re in is called "Columbus"; and so forth. You know all kinds of things like that.

The first thing we’re going to do is to ask what knowledge in general is. When do you know something and when do you merely believe it? So we are going to be studying the third part of the definition of "epistemology" that I gave you last time. Let me repeat the third part; it reads: epistemology is the study of "(3) what knowledge is, i.e., what epistemic features would make a true belief knowledge."

Now, the theory of knowledge would be a mildly amusing intellectual game if it involved no more than defining what you mean when you know what your name is, or that you are over ten years old. If that were the total extent of the importance of the theory of knowledge, then of course it wouldn’t be very important. But the theory of knowledge also investigates something much more crucial and interesting; namely, what it means to say that we know crucial and important things, and not merely trivial things like our own names. How can we distinguish when we know, say, that we have a cure for human cancer, versus when we merely believe it with some evidence, but don’t know it?

Here, see if you agree with this or not: sometimes it really makes a difference whether we know something, as opposed to just believing it with a little evidence. A correct assessment of whether we know something can sometimes spell the difference between life and death. Suppose you are enjoying yourself, driving fast along a winding mountain road. You come to a gas station and the gas station attendant tells you: "I just heard from someone that there was a big accident just around the big curve on State Route 99." That’s all you find out. So you hop back in your car and drive down the winding mountainous road, State Route 99. And you keep driving fast. You might tell yourself: "I don’t know that the wreck is just around the curve." So you keep driving fast. But then a thought occurs to you: "I don’t know that it’s not around the next corner, either!" So when you approach the next corner, you slow way down. And it’s a good thing too -- because there’s the wreck and you would have crashed right into it. So it saved your life, in this fictional scenario anyway, that you correctly assessed that you didn’t know that the wreck wasn’t around the next corner. That’s one example of how a correct assessment of your knowledge state can save your life. I’m sure you can think of many other examples of how knowledge can turn out to be very important.

Now, I’m not saying that what you’re about to learn is going to save your life, because it probably won’t. All I’m saying is that the subject of the investigation, knowledge, really is very important. And so that lends at least some importance to the study of what knowledge is.

So let’s get down to business, and ask: What is knowledge? How can I define the word or the concept, "knowledge"?

Let’s begin by saying what we’re not talking about, when we ask this question. First of all we aren’t just talking about belief. I guess there is a very, very loose way we have of talking sometimes, when I can say, "What she knows is different from what I know; we disagree." When I say stuff like that, all I mean is that we believe different things. I don’t mean, in any strong, literal sense, that both of us know that contradictory claims are true. So when we’re talking about knowledge, we aren’t talking about mere belief. It’s something more than that.

We also aren’t talking about know-how. If I know how to swim, I display my know-how by swimming. On the other hand, if I know that water is made of two hydrogen atoms and one oxygen atom, I do not display my knowledge by doing anything in particular. Not even by talking. Even if I couldn’t talk, I would still know that water is H2O. On the other hand, if I can’t swim, then I don’t know how to swim. So knowing how to do something is different from knowing that a proposition is true. Knowing how is different from knowing that, or as we will say, propositional knowledge. Knowledge that some proposition is true is propositional knowledge. That should make sense, I think. When I use the word "know" in a sentence like "I know how to use a circular saw," I mean one thing. When I use the word "know" in a sentence like "I know that the decimal expansion of pi goes on forever," I mean something different. Roughly speaking, the first claim is about action; the second is only about something in our mind. An even better way to think about it might be that knowledge is a highly specialized mental process.

So we aren’t talking about mere belief, and we aren’t talking about know-how. But third, we also aren’t talking about the accumulated information that human beings have amassed. If you like you could call this "encyclopedic knowledge." Something is knowledge in this sense if it is generally accepted and the sort of thing that you would find in an encyclopedia. Sometimes people -- scientists and historians are two good examples -- will say, "It is known that p," and they state some specific fact that is generally accepted by scientists or historians. But of course a fairly large part of what we "know," in this encyclopedic sense, is just wrong. We discover that what we thought was a fact, and what we taught schoolchildren to be fact, really isn’t factual at all -- it’s an error. And so we might say that human knowledge, in this sense, has evolved and changed over the years.

We aren’t talking about knowledge in this sense. In other words, by the word "knowledge," we don’t mean simply whatever is generally regarded as true these days by most experts. We aren’t talking about that, because the sort of knowledge we’re talking about isn’t something that changes. If you know it, then you can’t change your mind and then know something totally contradictory. If you change your mind and believe the opposite of what you believed before, then you can either have knowledge now, or before you changed your mind -- not at both times. And of course that’s the beauty and the value of knowledge, in the sense that we’re talking about: once you have it, as long as you don’t change your mind, then you’ve got it for good.

So here are three things that the word "knowledge" doesn’t mean, in the sense we’re interested in: first, mere belief; second, know-how; and third, encyclopedic, changeable human knowledge. I could list other senses of the word "knowledge" that we don’t mean, but I doubt that’s necessary. As I said, the name for what we are studying is knowledge that, or propositional knowledge.

In fact, I can begin our discussion of what knowledge in this sense is, just by starting the definition; we’ll decide what to add to it later:

S (meaning a person) knows that P (meaning a proposition) iff S ...

Now we have to fill in the blank. What does S have to do in order to know that P?

Well, first, S has to accept or to believe that P. In order to know that a proposition is true, you’ve at the very least got to believe it. Or more to the point, if you don’t believe it, if you disbelieve a claim, then of course you can’t really know it. What sense would it make to say: "I don’t believe that my watch is broken, but I know that my watch is broken"? That’s so strange that it hardly makes any sense. If you know something, then you at least believe it.

But as I already said, mere belief by itself isn’t enough. So what more is needed?

Truth. Truth is the second requirement for knowledge. So not only must you believe a proposition, in order to know it; it must also be true. Again, try this claim out for size -- see if you think this makes sense. "It’s false that my watch is broken, but I know it is broken." Or how about this: "I know that high tide occurs at 2 pm today; but it’s false that it does." That’s just absurd! I doubt I need to dwell any longer on that point. So now let’s look at our definition of knowledge so far. Here it is:

S knows that P iff S believes that P, and P is true.

For example: I know that the sun does not rise for a month at the North Pole in the middle of winter; I believe that claim and that claim is true. All right?

But let me give you another example where this definition would apply. I decide to go to the horse races. I’m just out for a good time. So I’m up in the stands, and my eye happens to fall on a list of the names of the racehorses competing today. The first name I see is "Lucky Charm" and I think to myself, "Lucky Charm is going to win. Lucky Charm is definitely going to win." And somehow I manage to convince myself that Lucky Charm will indeed come in first place. So I watch the race and what do you know, but Lucky Charm does win. Now let P be: "Lucky Charm will win." So I believed that P, and P was true. But did I know that P? Did I know that Lucky Charm would win? Of course not! It was totally arbitrary how I decided to believe that that particular horse, out of all the other horses, would win.

On the other hand, maybe if I studied all the right statistics, and I knew that Lucky Charm was having a good day and the competition was really off, then I might know that Lucky Charm would win. But the point is that not only do I have to believe a claim in order to have knowledge; my belief also has to be justified. I have to have good reason to believe it. I have to have evidence. My belief has to be rational. So let’s update our definition of "knowledge":

S knows that P iff S believes that P, P is true, and S is justified in believing that P.

Or for short: Knowledge is justified, true belief. Remember that formula. This is widely called the "JTB" definition of knowledge, for the rather obvious reason that "JTB" stands for "justified, true belief." We originally got it, or a version of it, from good old Plato, who may or may not have gotten it from Socrates. So this definition of knowledge has been around for a long, long time.

Justification is part of the concept of knowledge, by this definition. So you might notice that this definition is different from the one suggested by John Hospers in our reading. Hospers says that the third requirement is that one must have evidence for one’s belief, in order for it to be knowledge. I could summarize his definition by saying that knowledge is true belief with evidence. And well, we could put it that way; that would be acceptable. But there may be a difference between saying that a belief is justified, and saying that one has evidence for it. How would we decide if justification and evidence are the same? Well, we’d have to ask: Do all justified beliefs have evidence for them? I don’t think so, but I won’t go into it. And we’d also have to ask: Are all beliefs that are supported by evidence justified? Again, I don’t think so, but I won’t go into that either.

In any case, though, justification and evidence are both epistemic features of belief -- if you can remember that term, "epistemic features," from last time. Justification and evidence are, in other words, both qualities that indicate that the belief is true. We could try out other epistemic features in the definition of knowledge, if we wanted to. Instead of "justified true belief" or "true belief with evidence," we could say that knowledge is "rational true belief" or "warranted true belief." For our purposes, the differences between these different options don’t matter. The whole point is that, to be knowledge, a belief has to have some positive epistemic feature; it can’t be arbitrary or random or irrational. So let’s just stick with "justified true belief," the old JTB definition.
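
For anyone who finds it helpful to see the bare shape of the definition all at once, here is a minimal sketch of the JTB schema written out in Lean. It is only an illustration: Agent, Believes, Justified, and Knows are placeholder names, and belief and justification are left as unanalyzed primitives rather than defined.

 -- A sketch of the JTB schema; all names are illustrative placeholders.
 -- "Believes S P" stands for "S believes that P", and "Justified S P" for
 -- "S is (sufficiently) justified in believing that P".
 axiom Agent : Type
 axiom Believes : Agent → Prop → Prop
 axiom Justified : Agent → Prop → Prop

 -- S knows that P iff S believes that P, P is true, and S is justified
 -- in believing that P.
 def Knows (S : Agent) (P : Prop) : Prop :=
   Believes S P ∧ P ∧ Justified S P

Nothing hangs on the formal dress; the point is just that, on this definition, knowledge is the conjunction of the three conditions.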

Now there are three different sorts of questions that arise about the JTB definition, and I am going to address each of them. The three questions are as follows. First, what degree of justification is required for knowledge? Second is a question that would take too long to formulate; let’s just say that it’s widely known as the Gettier problem. Third, is knowledge possible? With the third question we’ll be moving finally into the topic of skepticism.

So let’s start with the first question: What degree of justification is required for knowledge? You might say: "Huh? What do you mean ‘degree of justification’?" Well, justification comes in degrees, from weak justification to strong justification. The better your justifiers, or your evidence, the better justified your belief is. To make this clear, let me give you an example of a belief that has weak justification, and then an example of a belief that has strong justification.

In the Dispatch, in the back of the Metro section, every day you can find a five-day weather forecast. The forecast for the day the newspaper is published on is pretty good. It’s occasionally wrong, but it’s usually right and therefore very useful. And so if the forecast says that today it’s going to rain all day long, and I look out my window and I see that it’s raining, then I am justified, rather strongly justified, in believing that it will be raining for most of the rest of the day, at least. So that’s a fairly strongly justified belief. You can imagine even more strongly justified beliefs, like my belief that my name is Larry Sanger. It’s hard to imagine a belief that has stronger justification than that. The beliefs that have the strongest possible justification are beliefs that we call certain. A belief is certain if no belief is more strongly justified than it is.

By contrast, suppose that the forecast for five days from now is: sunny and warm. But the five-day forecasts are not nearly as reliable as the same-day forecasts. They aren’t completely useless, but they are often wrong. So if I were to believe that in five days, it will be sunny and warm, just because I see that five-day forecast in the newspaper, that belief would not be strongly justified. Not nearly as strongly justified as my belief in what the forecast says for today.

So now again, here’s the problem: How strong does the justification of a true belief have to be, in order for that true belief to be knowledge?

I’m not going to try to answer this question at much length, but I do want to say a few things about it. First of all, it clearly can’t be too weak. Take for example my belief that in five days, it will be sunny and warm. Then suppose that five days from now, it really will be sunny and warm, just as was forecasted. Then I have a true belief about the future; and surely the forecast does give me some very weak justification for the belief. But do we really want to say that I know that it will be sunny and warm? I don’t think so. I mean, it was just too much of a shot in the dark. I just got lucky. The five-day forecast just happened to be right.

On the other hand, the justification doesn’t have to amount to certainty. For example, suppose it’s three o’clock in the afternoon, and I’m walking along and I meet you. I’ve got my umbrella, because in the morning I saw the morning forecast: rain all day long. So I believed it was going to rain. You, on the other hand, don’t have an umbrella and you’re getting wet. So I gloat, "I knew it was going to rain." Now would it be right for you to reply like this? -- "Naw, you didn’t know that. All you did was read the morning forecast. So it wasn’t certain that it would rain." I don’t think that would be right. Did I really have to be certain that the forecast was right, in order to know that it was going to rain? I don’t think so. Maybe this is a borderline case of knowledge though. Maybe it isn’t strongly enough justified: I mean, maybe we never have knowledge about weather forecasts. But if you want my personal opinion, it’s that we do, sometimes, have knowledge of weather forecasts. I definitely don’t think that we have to be certain of a thing in order to know it.

But there is a strong sense of the word "knowledge," perhaps as used in mathematics, where you have to be either certain, or very close to certain, before you can be said to know something. The standards of knowledge there might require stronger justification than in other areas of life. Why might that be? Why would mathematicians require a higher degree of justification in order to have mathematical knowledge? Well, perhaps because it’s possible to prove things in mathematics, in a way that one can’t prove things about the weather or about economics or sociology. Since a higher degree of justification is possible, that higher degree of justification is made a requirement for knowledge.

So my idea is that the strength of the justification you need in order to have knowledge depends on the object of knowledge -- the thing you are trying to know. At any rate we should update our definition of knowledge to reflect the insight we’ve gained:

S knows that P iff S believes that P, P is true, and S is sufficiently justified in believing that P.

And for simplicity we’re just going to leave the term "sufficiently" vague and unspecified. There is more discussion of this point in our reading.

OK, on to the second question, which I said is not so much a question as it is a rather complex problem, called the Gettier problem. The Gettier problem is basically the problem that we can give certain kinds of counterexamples to the JTB definition. A counterexample is a case where the definition applies, but the word defined doesn’t; or a case where the word defined applies, but the definition doesn’t. Gettier counterexamples are examples where the definition, "justified, true belief," applies; but one nevertheless doesn’t have knowledge, so the word "knowledge" doesn’t apply in that case. So let me give one such counterexample. Another such case can be found in our reading. This sort of counterexample is due to an obscure philosopher named Edmund Gettier.

Say there’s a man you know named Jones, and you find out that Jones is going to be offered a job (the boss tells you this). So you’re walking around somewhere and you see Jones, who for some reason is emptying out his pockets and counting out his change. He says that he has ten coins in his pocket. So now you have two justified beliefs: that Jones is going to get the job, and that Jones has ten coins. And so you infer from these two beliefs: The person who is going to get the job has ten coins. And that’s a justified belief too, right? Because you’re perfectly justified in believing that Jones is getting the job and that he has ten coins. So the person who is going to get the job has ten coins. Fine.

But now suppose that you applied for the job; and contrary to what you were told, it turns out you are going to get the job, not Jones. The boss only told you that Jones was going to get the job, so that he could surprise you. So it turns out, even though you originally had a justified belief that Jones was going to get the job, he didn’t get it. And that happens sometimes: sometimes things that we’re well within our rights to believe turn out, surprisingly, to be false. But now just on a lark you decide to empty out your pockets and lo and behold, you count out ten coins. So! It turns out that the person who is going to get the job does have ten coins.

Now think back to before you knew you were going to get the job. You thought Jones was going to get it. And you believed this justifiedly, even though it turned out to be wrong. And you were also justified in believing that Jones had ten coins. What follows? The person who is going to get the job has ten coins; and so you believed this justifiedly too. But it turns out that this was true. So you had a justified, true belief that the person who is going to get the job has ten coins. But clearly you didn’t know that then. You thought it was Jones who was going to get the job and you based your claim on that false, but justified assumption. Nonetheless, what you inferred from that assumption was true! So you had a justified true belief; but you didn’t have knowledge. Well, Gettier and a lot of other philosophers said, that means that knowledge must be something more than justified, true belief.
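
To lay the structure of the counterexample out explicitly, the illustrative Lean sketch given after the JTB definition can be extended to the Gettier case. Again, every name is a placeholder, and the three conditions are simply assumed, exactly as the story describes them.

 -- The same illustrative primitives as in the earlier sketch, repeated so
 -- that this fragment stands on its own; all names are placeholders.
 axiom Agent : Type
 axiom Believes : Agent → Prop → Prop
 axiom Justified : Agent → Prop → Prop

 def Knows (S : Agent) (P : Prop) : Prop :=
   Believes S P ∧ P ∧ Justified S P

 -- In the Gettier case, all three conditions hold for the proposition
 -- "the person who is going to get the job has ten coins":
 axiom you : Agent
 axiom JobWinnerHasTenCoins : Prop
 axiom believed  : Believes you JobWinnerHasTenCoins   -- inferred from the Jones premises
 axiom wasTrue   : JobWinnerHasTenCoins                -- true, because you get the job
 axiom justified : Justified you JobWinnerHasTenCoins  -- the Jones premises were justified

 -- So the JTB definition applies ...
 example : Knows you JobWinnerHasTenCoins := ⟨believed, wasTrue, justified⟩
 -- ... even though, intuitively, the word "knowledge" does not.

The definition is satisfied, yet our verdict about the case is that there is no knowledge; that gap between the definition and the verdict is the Gettier problem.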

Gettier’s article was published in 1963. Right after that, for a good decade or more, there was an enormous number of articles trying to supply the missing fourth condition of knowledge. The big project was to try to figure out the "X" in the equation, knowledge = belief + truth + justification + X. But whenever someone came up with a new definition, by coming up with something to put in for X, someone else would come up with a new counterexample to shoot down that definition. We don’t really have time to go over the different definitions that people have proposed. Some of them are actually quite complicated.

The Gettier problem is, at any rate, the standard objection to the JTB definition of knowledge; and it’s instructive to try to figure out how you might try to get around the problem. So I’ll just have to leave it as an exercise for you.

The third question about our definition of knowledge is: Is knowledge possible? Now if you take a look at that definition, you’ll notice something: If I know something, then I am sufficiently justified in believing it. So we have a way to rephrase the question, "Is knowledge possible?" Instead, we can ask: "Am I ever sufficiently justified in believing something in order to have knowledge?" The skeptic is the person who says "No." So let’s look next at skepticism.

