There's been a lot of interest in the PS3 due to its stated 1080p output for both games and movies (via Blu-ray). What's interesting is that a lot of folks don't realize how meaningless 1080p actually is in this generation.
Let's take games first. The PS3 has roughly the same pixel-pushing capabilities as the Xbox 360. You don't need to take my word for it; it'll be obvious enough over the next year. Even if this weren't the case, consider that we now live in a multi-platform development world, and that the current sweet spot developers are targeting is 720p due to the extremely similar system specifications. Simply put, a developer planning to release a game for both the Xbox 360 and the PS3 will aim for a common attainable ground. In fact, I'll stick my neck out and predict that you won't see any 1080"x" games for the PS3 this year.
Let's move on to HD movies. Home Theater Magazine (recommended!) has a sister website, and I wanted to point you to a great blog post by Geoffrey Morrison discussing the topic. To quote:
"Movies and almost all TV shows are shot at 24 frames-per-second (either on film or on 24fps HD cameras). All TVs have a refresh rate of 60Hz. What this means is that the screen refreshes 60 times a second. In order to display something that is 24fps on something that is essentially 60fps, you need to make up, or create new frames. This is done using a method called 3:2 pulldown (or more accurately 2:3 pulldown). The first frame of film is doubled, the second frame of film is tripled, the third frame of film is doubled and so on, creating a 2,3,2,3,2,3,2 sequence. It basically looks like this: 1a,1b,2a,2b,2c,3a,3b,4aâ€¦ Each number is the original film frame. This lovely piece of math allows the 24fps film to be converted to be displayed on 60Hz products (nearly every TV in the US, ever).
This can be done in a number of places. With DVDs, it was all done in the player. With HD DVD, it is done in the player to output 1080i. With Blu-ray, there are a few options. The first player, the Samsung, added the 3:2 to the signal, interlaced it, and then output that (1080i) or de-interlaced the same signal and output that (1080p). In this case, the only difference between 1080i and 1080p is where the de-interlacing is done. If you send 1080i, the TV de-interlaces it to 1080p. If you send your TV the 1080p signal, the player is de-interlacing the signal. As long as your TV is de-interlacing the 1080i correctly, then there is no difference. Check out this article for more info on that."
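The cadence Morrison describes is easy to see in a few lines of code. This is my own illustrative sketch (not from his post): it expands a list of film frames into the repeating 2,3,2,3 pattern, so 24 frames per second become 60 fields per second.

```python
def pulldown_2_3(film_frames):
    """Expand 24fps film frames to a 60Hz sequence via 2:3 pulldown.

    Even-indexed frames are shown twice, odd-indexed frames three
    times, producing the 2,3,2,3,... cadence (1a,1b,2a,2b,2c,...).
    """
    out = []
    for i, frame in enumerate(film_frames):
        repeats = 2 if i % 2 == 0 else 3
        out.extend([frame] * repeats)
    return out

# Four film frames become ten fields: 1,1,2,2,2,3,3,4,4,4
print(pulldown_2_3(["1", "2", "3", "4"]))

# And a full second of film (24 frames) becomes exactly 60 fields:
print(len(pulldown_2_3(list(range(24)))))  # 60
```

The average repeat count is 2.5, which is exactly the 60/24 ratio; that's the "lovely piece of math" in the quote.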
Most modern HD displays (plasma, LCD, DLP, etc.) display content progressively, even if they receive an interlaced signal. Let me restate that: when you're watching a 1080"x" signal on a modern HD display, you're almost always watching a 1080p image. The only difference is where the de-interlacing happens; the displayed output is always 1080p. (The minor caveat is that a few TVs don't de-interlace 1080i correctly, as described in the link above, but that's rare today.)
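To make the de-interlacing point concrete, here's a minimal sketch of my own (assuming film-sourced content where the two fields belong to the same original frame, the easy case): a 1080i frame arrives as two 540-line fields, and "weaving" them back together recovers one full 1080-line progressive frame. Whether the player or the TV does this, the result is the same.

```python
def weave(top_field, bottom_field):
    """Recombine two interlaced fields into one progressive frame.

    top_field carries the even-numbered scan lines, bottom_field the
    odd-numbered ones; interleaving them restores the full frame.
    """
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

# Two 540-line fields weave into one 1080-line progressive frame.
top = [f"line {2 * i}" for i in range(540)]
bottom = [f"line {2 * i + 1}" for i in range(540)]
print(len(weave(top, bottom)))  # 1080
```

For video-sourced (non-film) content the fields come from different moments in time and a good de-interlacer has to do more work than this, which is exactly why a TV that gets it wrong is worth avoiding.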
This is why I get hung up on the image encoding quality of HD DVD vs. Blu-ray. That's where you're going to see a perceptible difference. As for games, 99% of PS3 titles will natively render at 720p; the few that ship with 1080"x" support will either be simple classic arcade ports that don't need to render complex scenes (think the original Battlezone), or will give up so many in-game visual effects that they simply won't look very good (hence the poor showing of Gran Turismo "HD" at this past E3).
To sum up, don't get sucked into all the 1080p hype. Just make sure you have a recent HDTV that de-interlaces 1080i signals correctly and you'll be just fine.