To help clarify the difference between DPI and resolution, imagine this scenario:
- You have a photo on your hard disk; the file contains image data that is 1000 pixels wide.
- You print that photo on a piece of 5-inch long photo paper in "fit-to-page" mode. What's the DPI of the photo?
- You then reprint that photo on a piece of 11-inch long photo paper in "fit-to-page" mode. What's the DPI of the photo now?
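The arithmetic behind those two questions is just pixels divided by inches. Here's a minimal sketch; the `dpi` helper is my own invention for illustration, not something from any printing API:

```python
def dpi(pixels_wide, print_width_inches):
    """Effective dots per inch when an image is scaled to fill a physical width."""
    return pixels_wide / print_width_inches

# Same 1000-pixel-wide file, two different print sizes:
print(dpi(1000, 5))   # 5-inch print  -> 200.0 DPI
print(dpi(1000, 11))  # 11-inch print -> roughly 90.9 DPI
```

Same file, same pixels, two very different DPI numbers.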
Or better yet...
- You view that photo using a video projector at 1:1 ratio, so that 1 pixel on the projector is 1 pixel on the file.
- You have the projector 12 feet from the wall so the image on the wall is about 6 feet wide. What's the DPI?
- You scoot the projector up really close to the wall so that the image on the wall is about 1 foot wide. What's the DPI?
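The projector case works out the same way, assuming the same hypothetical 1000-pixel-wide file and converting feet to inches:

```python
def dpi(pixels_wide, image_width_inches):
    """Effective dots per inch of a projected image of a given physical width."""
    return pixels_wide / image_width_inches

# Same 1000-pixel-wide file, two projection distances:
print(dpi(1000, 6 * 12))  # 6-foot-wide image -> roughly 13.9 DPI
print(dpi(1000, 1 * 12))  # 1-foot-wide image -> roughly 83.3 DPI
```

Move the projector and the "DPI" changes by a factor of six, while the file never changes at all.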
You can see that, for the source image file itself, DPI is utterly meaningless. Any DPI number that happens to be attached to the file is arbitrary. DPI only counts for physical output: how many pixels per inch appear at the final output stage, whether that output is a printer, an LCD monitor, or anything else.
Now, to confuse things...
Many CCDs don't use rectangular elements in their arrays. I know some of them have hexagonal pixel elements, and the camera's internal electronics convert the results into an X/Y array of pixels before saving the file for you to transfer to your computer. The claimed resolution of any given image from such a camera becomes even murkier there.