Q: What are the relations among DPI, pixels, and print size?
A: A complicated question, and one surrounded by a LOT of misunderstanding. However, here's something that should clear it up:
DPI = dots per inch
Pixels are just that, pixels (concatenation of "picture elements"). Basically, a pixel is a 2-dimensional dot.
Pixels (or dots) divided by DPI (dots per inch) = length (in inches)
1200 pixels / 300 dots per inch = 4 inches
1200 pixels / 72 dots per inch = 16.67 inches
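The arithmetic above can be sketched in plain Python (the function name is mine, not from any library):

```python
def print_size_inches(pixels, dpi):
    """Length in inches = pixels / dots-per-inch."""
    return pixels / dpi

print(print_size_inches(1200, 300))            # 4.0 inches
print(round(print_size_inches(1200, 72), 2))   # 16.67 inches
```

The same function works for either dimension, since width and height scale independently.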
A picture has a particular resolution (i.e. X by Y pixels). A specific output device has its own DPI. The best output is achieved when the calculated DPI (based on the picture's resolution and the desired print size) matches the output device's DPI. If the calculated DPI is LESS than the device's DPI, the device has to scale the picture up, which makes the image blurry (like zooming in: each pixel is enlarged, and there isn't enough detail to fill it). If the calculated DPI is GREATER than the device's DPI, the picture is SHRUNK to the device's DPI (like zooming out: fine details are discarded).
For example, you CAN print a 1024x768 picture (0.8 megapixels) at 16" by 12", but at that physical size it is a mere 64 DPI (calculated: 1024 dots / 16 inches = 64 DPI), while the output device is VASTLY superior to that (300, 600, even 1200 DPI), so the output will look VERY blurry.
Whereas if you print the same picture at 3" x 2", the DPI calculated backwards is 1024 / 3 ≈ 341 DPI, which exceeds 300 DPI, so the picture will be sharp, but it isn't as large as it COULD be without sacrificing detail.
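Running the calculation backwards (pixels divided by inches gives the effective DPI) can be sketched the same way; again, the function name is mine:

```python
def effective_dpi(pixels, inches):
    """Effective DPI of a print = pixels / physical length in inches."""
    return pixels / inches

# 1024 pixels spread across 16 inches: far below a 300 DPI printer
print(round(effective_dpi(1024, 16)))  # 64
# 1024 pixels squeezed into 3 inches: above 300 DPI, prints sharp
print(round(effective_dpi(1024, 3)))   # 341
```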
Best output is when you get EXACTLY 300 DPI, so 1024 / 300 = 3.413 inches. You can do the math for the Y dimension (768 / 300 = 2.56 inches). Now the picture is as large as it can be, without any zooming or scaling of pixels.
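Putting it together, the largest no-scaling print size for a given picture and device DPI can be computed for both dimensions at once (a sketch; the 300 DPI default is just the example device used above):

```python
def optimal_print_size(width_px, height_px, device_dpi=300):
    """Largest print (inches) that needs no scaling at the device's DPI."""
    return width_px / device_dpi, height_px / device_dpi

w, h = optimal_print_size(1024, 768)
print(f"{w:.3f} x {h:.3f} inches")  # 3.413 x 2.560 inches
```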