Differences between the human eye and the camera
Have you ever looked at a photograph you took and found that the result doesn't look like the scene you wanted to capture? It has happened to me many times, especially when I started studying photography.
Why does this happen? Because the human eye has an amazing capability of adapting to different light situations. It can adapt to scenes with lots of light or very little light and still see every detail (or most of them). The camera, on the other hand, can't record such a wide range of contrast (remember, contrast is the difference between the light tones or colours and the dark tones in a picture; the lights and shadows in a scene).
So, in situations of high contrast, the human eye can see every detail in the scene while the camera can't. This means that parts of the image will lose detail. Depending on how we meter and expose the photograph, we will lose detail either in the shadows or in the bright areas of the scene.
The human eye can even adapt itself to different light colours. For example, if we take a picture inside a house illuminated by light bulbs, the entire picture will come out with a yellow "tint", unless we use filters or a white balance function (see colour temperature). However, while we are shooting, our eyes see the colours practically as if they were illuminated by white light. To understand this concept better: if there is a white table in the scene, our eye will see it as white; the camera, on the other hand, registers it as yellow because it is illuminated by yellow light (light from light bulbs usually looks yellow).
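As an aside for readers who like to tinker: one simple way to picture what a white balance function does is the "gray-world" method, which assumes the scene should average out to neutral grey and scales each colour channel until it does. This is only a minimal sketch of the idea, not how any particular camera implements it, and the pixel values below are made up to mimic a white table under yellowish bulb light.

```python
def gray_world_balance(pixels):
    """Gray-world white balance on a list of (r, g, b) pixels, values 0-255.

    Assumes the scene averages to neutral grey, so each channel's mean
    is scaled to match the overall mean.
    """
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    grey = sum(means) / 3  # target: equal averages for all three channels
    gains = [grey / m if m else 1.0 for m in means]
    return [
        tuple(min(255, round(p[c] * gains[c])) for c in range(3))
        for p in pixels
    ]

# Hypothetical "white" table lit by yellowish bulbs:
# red and green are high, blue is low, so it looks yellow.
warm_image = [(250, 240, 180), (245, 235, 175), (248, 238, 178)]
balanced = gray_world_balance(warm_image)
```

After balancing, each pixel's three channels come out roughly equal, which is exactly the "white table looks white again" effect the camera's white balance setting is trying to achieve.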
This obviously means that what we see in the scene will not come out exactly the same in the photograph. We have to learn, in a way, how the camera "sees". This is a very important (and very hard to learn) concept, and it can completely change the way we take pictures.
Thus, we need to learn and understand these differences so we can correct for them, compensate for them, or use them in our favour.