Can we imagine a case where you believe something, you're right, and you've got good reason to believe it - and yet you don't know that thing? We certainly can. Let's say you're a teacher at a high school, and the principal drives a very conspicuous car: a pink Ferrari. You show up to work one morning and see the pink Ferrari in the car park. In the staff room, another teacher asks you whether the principal's at work today.
"Yes," you say. "I saw his car."
As it happens, the principal is at work - but the pink Ferrari in the car park is, by pure chance, not his car but somebody else's. Did you know that the principal was at work?
By any reasonable standard, seeing a car identical to his in the car park constitutes good reason for your belief. And he is at work, after all! But intuitively, we don't want to say that you knew it.
Here's one reason why. If you had been informed that the pink Ferrari was not the principal's car - perhaps the other teacher replies, "Yes, I saw the car too, but it has a slightly different number plate" - you would have changed your mind. To know something, it seems, there should be no true piece of information out there that, if you learned it, would undermine your reason for believing it.
This kind of case is called a Gettier case, after Edmund Gettier, the first person to formulate one. The general recipe is to propose a situation in which you have a true belief, but your justification for it happens, by sheer chance, to rest on something false. Gettier cases pose serious problems for the view that knowledge is "justified true belief" (JTB). Can a better definition of knowledge be found? Can the JTB definition be amended to accommodate Gettier cases? There's no consensus in philosophy right now, but I'll outline some competing theories in the next post.