Sound and Fury

Signifying nothing

Factivity of knowledge makes it redundant


I’m thinking of taking an epistemology course this year. Which means that I’ve been thinking about knowledge. Now, a lot of epistemology starts from the assumption that whatever knowledge is, it must be true. This is called the “factivity of knowledge”. Basically, I am supposed to have the intuition that attributing knowledge of a falsehood to someone sounds a little odd. Consider “Ptolemy knew the Sun orbited the Earth”. We should be inclined to think that this sounds odd, and what we should say is “Ptolemy thought he knew the Sun orbited the Earth” or “Ptolemy believed the Sun orbited the Earth.” This is an intuition I just do not share. I take the point that perhaps it is a little odd to suggest Ptolemy knew the Sun orbited the Earth, but take modern instances of scientific “knowledge”: I know there are electrons; I know that nothing can go faster than light. Accepting that scientific knowledge is fallible, does that mean that it is not knowledge? Or rather, does my accepting that any piece of scientific knowledge might be wrong mean I cannot be confident that I have any scientific knowledge? After all, knowledge is not defined as “justified, approximately true belief”… It seems that epistemology tries too hard to accommodate the intuition that there’s something fishy about attributing knowledge of falsehoods to people, while ignoring the use of words like “knowledge” in science, for example. Any theory that fails to live up to the “folk” use of knowledge in science had better be damn good at what it does elsewhere…
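
(A quick notational aside, since it makes the assumption easy to state: in the standard epistemic-logic shorthand, where “Kp” abbreviates “I know that p”, factivity is the schema

Kp → p

i.e. knowing that p entails that p is true. Nothing below hangs on the formalism; it’s just a compact way of putting the principle.)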

And what’s the point of an analysis of “knowledge” that makes it so inaccessible? Given my acceptance of my own fallibility, I can never know that I have knowledge: accepting that any belief of mine could be false makes the KK principle radically false. Do knowledge attributions help us understand or assess the rationality of someone’s reasoning or decision? No: once we have a picture of their doxastic state (their beliefs) and the relevant truths, attributing knowledge doesn’t seem to add anything. I can never say “I merely believe this proposition, so I will act in this way in respect of it; but this proposition I know, and can therefore act in this different way…” since I never have access to which of my strong justified beliefs are also true. So what role does this kind of knowledge play?
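
(In the same shorthand, the KK principle is the schema

Kp → KKp

i.e. if I know that p, then I know that I know that p. Since I accept that any belief of mine could be false, I am never in a position to assert the consequent, which is why the principle fails so badly on my view.)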

Maybe this attitude is caused by my being corrupted by LSE’s interest in decision theory. Perhaps I am too focussed on using a theory of knowledge to assess the rationality of some action or set of beliefs. Maybe the real problem is to understand what is special about knowledge, over and above mere belief. And maybe one thing that sets knowledge apart is that knowledge is true. But, to my (hopelessly pragmatically based) mind, that’s not an interesting distinction, since it’s one that never makes a difference to me. But maybe there are some belief states that do have practically privileged status: maybe some kinds of belief (e.g. justified beliefs) allow me to act differently. And if this sort of belief looks a bit like knowledge, then maybe we should adopt that label.

Perhaps the best course of action is just to give up the word “knowledge” to the epistemologists and to focus on the practically useful propositional attitudes like belief. Obviously, truth is still important. Not only is having true beliefs often pragmatically the best way to go, but having true beliefs may well have some kind of “epistemic value” in and of itself. But to make the truth of some belief part of characterising what kind of belief it is seems wrong. Maybe the misunderstanding I have of epistemology (or at least of analyses of the concept of knowledge) is that I want to focus on those aspects of my propositional attitudes that I can be aware of and that can influence my behaviour.

This post grew out of something I posted on Twitter, and thus thanks are due to all the people who argued epistemology with me over there. I’m beginning to think that Twitter is uniquely unsuited to philosophical discussions, but I’ve had some interesting conversations on there nonetheless.

This also marks my third blog post of the day; the others are here and here. I must have a deadline or something. (In my defence, the other two were already substantially written before today.) I will be at EPSA, so I will continue not to post here.

Written by Seamus

October 3, 2011 at 4:23 pm

2 Responses


  1. First a confession: not only am I not an epistemologist; I don’t even play one in class when I teach.

    That said, let me lay out my prejudices. I’m part of the party that isn’t much interested in issues about skepticism. I’m part of the party that assumes we do actually get a bunch of stuff right: in the sense of “knowledge” that you want to reject, we do actually know stuff. I’m quite happy to appeal to Moorean facts, and also happy to wax lyrical about mid-ocean boat-building.

    I have an old-fashioned hope that I’m right about a lot of things. I sure as hell hope I don’t just *believe* I taught my class a few minutes ago; I hope I actually did. Some may find their breasts unburdened by such hopes; if so, then we differ in a psychologically interesting way, but that doesn’t change what I hope.

    But I’m prepared to go further: I’m prepared to say in the factive sense that I *know* I taught that class. That’s not immune to all possible skeptical worries, but that fact (yes) is one that I can’t find it in myself to care about. (Maybe I picked the wrong profession; you be the judge.)

    So I think there’s such a thing as having a true belief. If someone doesn’t, I don’t really know how to move the conversation along. And for a variety of reasons, I hope I have a lot of true beliefs. I suspect that in some cases, I’m better off with true beliefs, but even in cases where there’s no practical payoff, I have this old-fashioned fondness for the thought that I’m connected to the world in the factive sort of way.

    So if someone tells me that they don’t have an interest in whether their beliefs are true, I’ll admit to being at a loss. I suspect that they’re likely to fall prey to pragmatic paradoxes, but maybe I’m wrong. What I can’t see is that there’s anything odd about wanting to have true beliefs, still less that I need to defend that desire.

    Now of course, if you’re trying to give a certain sort of explanation of behavior, what matters is that X believes various things, and not that they’re true or false. But why should what’s needed for certain sorts of explanations settle whether it’s sensible to care about having true beliefs?

    So I don’t even understand the demand to defend the “truth” part of the knowledge notion. I care about truth, and I’m not moved by philosophical skepticism. (Specific skepticism about specific claims, of course, is another matter.)

    Now of course, as most of us see it, having true beliefs doesn’t automatically amount to having knowledge. And so I’d like to understand better what makes a critter into a believer of true beliefs that (to put it lazily) aren’t just true by accident. I’m interested in how to flesh out that notion of being non-accidentally right, and I’m also interested in some very broad empirical questions about what gets us hooked up with the world in the truth-tracking sort of way.

    Of course I’m presupposing a boatload here. Most broadly, I’m presupposing the falsity of philosophical skepticism rather than looking for an argument against it. I’m also presupposing there *is* a boat afloat, imperfect though it may be, and that we can improve it as it sails.

    But there’s also a danger that we’re caught in a mere quibble. Suppose I give up requiring that “knowledge” be factive. Then there’s something else I’m interested in, that I know a lot of other people are interested in, and that (for reasons I’ve gestured to) seems worth exploring. Call it “schmoledge” if you like. It’s somewhat like what you call knowledge but (at the least) it differs by adding the factive component. If some people don’t care about schmoledge, then let a thousand flowers bloom. Those of us who do will probably look for a better word. I’m thinking “knowledge” has a nice ring to it.

    Allen Stairs

    October 3, 2011 at 6:07 pm

    • Thanks for this reply, Allen.

      I’m not saying that I don’t care about the truth of my beliefs. Just that I don’t have the right sort of introspective access to the truth of my beliefs to usefully act on the distinction. “Act only on your true beliefs” would be a good action-guiding maxim if it weren’t for that pesky “ought implies can” business.

      The epistemological debates about what you have to add to JTB to make it knowledge would, I think, be no less interesting if they dropped the “T”. What makes beliefs justified, or warranted, or whatever, is an interesting question, whether or not the beliefs in question are true. Now obviously we want to be the sort of belief-former that behaves successfully in the world, and having true beliefs is a good way to do that. So how to get true beliefs is an important question, but it isn’t really the one that epistemologists seem to be engaging with.
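
      (To spell out the acronym: JTB is the standard “justified true belief” analysis, on which S knows that p just in case p is true, S believes that p, and S is justified in believing that p. Dropping the “T” means keeping the belief and justification conditions, and asking the same questions about them, without building the truth of p into the analysis.)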

      Seamus

      October 19, 2011 at 5:23 pm

