Talk:Digital cinema
If you feel that you must reply to the heated debate, stop, think about other editors and writers, and please just bite your proverbial tongue until you've taken it to the right place. While I would disagree that the right place to debate it is on Talk pages at Wikipedia, I only ask that you not do it here unless it's on topic. —Wikibarista 15:34, 10 March 2006 (UTC)
History and Distribution
I have submitted a section under the history of digital cinema based on the request for verifiable information. Feedback is welcome.
Also, I have cleaned up distribution. Because this is still very hotly debated, I recommend maintaining informative rather than 'position' statements. Feedback is welcome.
Split topic
Perhaps it would be best to split Digital Cinema into "Digital Filmmaking," the process of producing film via digital techniques, and Digital Cinema, a method of distributing and presenting motion pictures.
This would be best because movies shot on film may be distributed digitally, while films made digitally (including Star Wars Episode III, Sin City, etc.) have been primarily distributed through film prints. In fact, even if digital projectors become the standard, many filmmakers like the look of film and will source material on film for digital projection. 216.64.26.114 11:44, August 26, 2005 (PST)
-
- Agree. I'll put a proposal on the main page Fitch 08:44, 8 November 2005 (UTC)
- Agree. There is already a lot here that overlaps with the digital cinematography article.--Onejaguar 00:27, 6 December 2005 (UTC)
- The latest edit, on August 1st under Economics, while good for formatting, highlights an older entry that really belongs under digital cinematography, and it has POV problems: it repeats as gospel truth the idea that DC is much cheaper than film production/distribution. StevenBradford 13:56, 3 August 2006 (UTC)
Exposure Latitude Adjustment
I have added the fact that the lack of exposure latitude can be handled and mitigated through techniques used with reversal film stocks, which have the same problem. Please reword my adjustments, as I am not a good technical writer. 66.32.95.85 12:07, August 25, 2005 (PST)
More Info Needed
I think this article needs more information on hybrid motion picture production: more information about mixed-media filming (both HD and film) and, more importantly, shooting on film stock but completing post-production digitally. 66.32.95.85 12:07, August 25, 2005 (PST)
Bias against Video
I have a concern about this passage:
"Film is in many ways more portable than its high quality digital counterparts. The chemical process initiated by exposing film to light give reliable results, that are well documented and understood by cinematographers. In contrast every digital camera has a unique response to light and it is very difficult to predict without viewing the results on a monitor or a waveform analyser, increasing the complexity of lighting. However, accurate calibration techniques are being developed which eliminate this as a practical problem, and the possibility of inexpensive post-production color grading can make digital cinematography more flexible than film in achieving artistic color effects."
It seems to be biased to the point that film is better than video. Each is its own medium. While "[t]he chemical process initiated by exposing film to light give reliable results, that are well documented and understood by cinematographers," is true, it is because most cinematographers have little practical experience with video. If a pro videographer were to suddenly try shooting film, I'd expect bad results as well. "In contrast every digital camera has a unique response to light," is true, but every film stock also has a different response to color, illumination, shadow, etc. Once a DP knows the digital camera, it shouldn't be any more of a task to get what you want. In fact, a camera like the Sony F-900 has a very predictable response to these conditions, like film, but has the advantage that the response can be tailored. Perhaps the difference in thought is that the visual nuances of film are due more to the medium, while in video they are due more to the camera. Also, the line "without viewing the results on a monitor or a waveform analyser, increasing the complexity of lighting," is fallacious. In fact, my DP can get amazing shots on his F-900 with nothing but his light meter and camera. And our interior shots were FAR EASIER to shoot than with film, allowing us to light a scene beautifully with about half the equipment and time invested. 66.32.95.85 12:07, August 25, 2005 (PST)
- Agree. I think this is what Wikipedia people call weasel terms or something like that. I'll put a note on it. Fitch 08:44, 8 November 2005 (UTC)
- Rubbish!
- "It seems to be biased to the point that film is better than video."
- Unfortunately, THE INESCAPABLE FACT is that the vast majority of film producers, makers of prime-time TV shows and larger budget commercials are adamant that film IS better than video, and they'll always use film if the budget allows it. They are not all morons; video cameras simply do not produce as good a picture as film, for a variety of technical reasons that most video enthusiasts seem incapable of either understanding or observing.
- This is not a "weasel term" either. These are facts: How many movies are released each year? How many were shot with Digital cameras? According to George Lucas in 1999, film should have been dead and buried by now. It hasn't happened. Nothing like it has happened. One of the problems is that many self-styled experts simply cannot tell the difference between film and video, whereas the people responsible for making the decision about what is to be used usually can!
- There is a weird "culture" that has grown up on the Internet that seems to be dedicated to denial of the status quo (a.k.a. wishful thinking). Somehow there seems to be a shared "adolescent" fantasy that "come the (video) revolution" THEY'RE the ones who will be called upon to make the next Jurassic Park, Indiana Jones etc. Which is arrogant nonsense.
- I've been hearing the same old statements for over 20 years, since the first Betacams came on the market, and film still remains the preferred medium. Video is just about good enough for low-budget films like "Wolf Creek". That is the situation now, and barring any massive technological breakthroughs in the next few years, that's the way it's going to stay. There are certain laws of Quantum Mechanics that would need to be repealed before a video camera could ever equal the performance of a film camera. Electronic sensors haven't gotten all that much better over the last ten years; it's more that camera manufacturers have gotten better at disguising their deficiencies!
- OK, if they want to fantasize about their career prospects, fine, but Wikipedia is supposed to be about verifiable facts. It has proven almost impossible to sort out the facts from the fantasies, because as soon as you do, some loudmouth dreamer comes along and re-edits the page. —the preceding unsigned comment is by 139.168.91.193 (talk • contribs) 03:32, November 12, 2005
-
- Agreed
-
- "It seems to be biased to the point that film is better than video. Each is its own medium."
-
- Why does it state in the article:
-
- "Given the constant year-on-year improvements in digital cinema technology, it appears that the long-term future of cinema is likely to be digital"
-
- You can't have it both ways. Either one is better than the other, because they are competing with each other for what will be the direction of the cinematic medium, or one will not replace the other, because they are separate mediums. Either take the latter out of the article and replace it with the former, or drop the complaint. —the preceding unsigned comment is by Kasbrakistan (talk • contribs) 14:58, December 23, 2005
-
- "There are certain laws of Quantum Mechanics that would need to be repealed before a video camera could ever equal the performance of a film camera." The quantum efficiency of photographic film is well under 10% [1]. The QE of a modern image sensor is about 30% (wavelength and filter dependent -- sensitivity is even higher if the CFAs are not engineered into the sensor). The current crop of high-resolution still cameras from Canon (5D, et al.) and Nikon (D200, et al.) are all basically photon-noise limited [2]. I would expect that the motion cameras being made today are probably equal to, if not better than, these still cameras. Which is all to say: digital stuff already is better than film stuff.
- This discussion -- and the rantish behaviour of the film freaks -- is old hat in the still photography arena. Just peruse USENET. Heck, barely 5 years ago people were still predicting a long life for still film photography, arguing that the image from a digital camera wasn't good enough ... and today we have announcements from Nikon that they are shit-canning most of their analog camera line, and the trivially observed truth that few professional photographers still use film on a large scale in the 35mm form (now even medium format is yielding). Even to digital aficionados the rate of development is startling, breathtaking. The claim that it will take "10 to 20 years" for silver-halide film to be replaced by a CMOS/CCD sensor in the motion picture realm is frankly silly given this history. mdf 04:35, 14 January 2006 (UTC)
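To put some numbers on the QE argument above, here is a back-of-the-envelope sketch in Python. The QE values are the rough figures quoted in the comment; the photon count is an arbitrary illustrative number, so treat this as a sketch, not measured data:

```python
import math

# Rough quantum efficiencies from the comment above (as fractions).
QE_FILM = 0.10    # photographic film: "well under 10%", so this is generous
QE_SENSOR = 0.30  # modern CMOS/CCD sensor (wavelength/filter dependent)

def photon_limited_snr(photons_arriving, qe):
    """SNR of an ideal photon-noise-limited detector.

    Detected photons follow Poisson statistics, so for N detected
    photons the signal-to-noise ratio is N / sqrt(N) = sqrt(N).
    """
    detected = photons_arriving * qe
    return math.sqrt(detected)

N = 10_000  # illustrative photon count per pixel/grain area per exposure
print(f"film   SNR: {photon_limited_snr(N, QE_FILM):5.1f}")    # ~31.6
print(f"sensor SNR: {photon_limited_snr(N, QE_SENSOR):5.1f}")  # ~54.8
# At equal exposure, the sensor's higher QE buys a sqrt(3) ~ 1.7x SNR
# advantage; nothing in quantum mechanics forbids this.
```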
Another point of view... 2K doesn't hack it
Digital projection will only be good when 4K is the norm. 2K digital is like a Xerox copy of film, not quite up to snuff. For the foreseeable future, films will be shot on film and composited with digital DI technology if the effects demand it. On non-effects-driven films, film will continue for production, and 2K digital will take over for all the cheaper theaters, running them after about 2012. The better grade of theaters will run either larger 6-perf film or 4K digital to get a superior look, especially with 3D in the mix.
HD in the home will raise the ante, so theater display will have to be superior to the present, or admission prices will have to be cut by at least 40% to literally save the theaters in the digital age. —the preceding unsigned comment is by Nativeborncal (talk • contribs) 21:05, December 25, 2005
Sorry, seems like there are facts and bias in dispute in this article
Regarding bias: The use of broadly pejorative terms is troubling. For instance, "stillborn digital revolution" and "at this point, no movie directors are seriously using HD cameras to make theatrical films." Huh? What about Spike Lee, Steven Soderbergh, Michael Mann, David Fincher, George Lucas, Bryan Singer, James Cameron, Lars von Trier, and Robert Rodriguez? One failure like "Shark Boy" does not a "stillborn" digital revolution make.
(The author's petulant response to the post alleging bias doesn't help his argument against his bias: denigrating phrases like "weird culture," "adolescent fantasy," "wishful thinking" and "loudmouth dreamer?" Whew! Let's talk rationally here. I doubt the many millions of dollars being invested by film studios, filmmakers and technology companies in evolving video into worthy competition for film involve wishful thinking, unless they're all financially incompetent.)
Conversely, some of the stated ADVANTAGES of digital cinema could also be disputed. The "digital cinematography" Wikipedia entry claims that the digital format actually does not, on the whole, save money in production, and cites concrete reasons. Which one is right? Which claim about digital cameras' superiority in low-light situations is accurate?
These two articles have been suggested for merger. I suggest a review of the facts and general attitude beforehand. —the preceding unsigned comment is by 65.42.107.58 (talk • contribs) 12:31, December 29, 2005
Film versus Digital
"There are certain laws of Quantum Mechanics that would need to be repealed before a video camera could ever equal the performance of a film camera. Electronic sensors haven't gotten all that much better over the last ten years, it's more that camera manufacturers have gotten better at disguising their deficiencies!"
What utter rubbish! What laws of Quantum Mechanics? Show us, please, this verifiable scientific data. That digital cinematography could eventually equal film in terms of resolution, dynamic range, latitude, and DOF is, I would say, undeniable and self-evident; it is only a matter of pixel count, bit depth, quantization and the physical dimensions of the imaging device. Furthermore, although 35mm film is often quoted as having an equivalent pixel resolution of somewhere between 4-6K in its original NEGATIVE form, it is also demonstrable that once a well-worn release print is shown in a provincial theatre (which is quite often), its resolution is dramatically reduced through print wear, a bad projection setup/lens and a dirty, impaired screen. Whilst HD skips this multi-generational degradation, the problem of dirty screens and bad projection can remain; the point is that any talk of 35mm's inherent superior resolution is subject to various qualifying conditions and is almost impossible to verify in common practice.
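As a concrete illustration of what "pixel count, bit depth, quantization" mean in practice, here is a quick sketch of the uncompressed data rates those parameters imply. The frame sizes are the DCI 2K/4K containers quoted elsewhere on this page; the 12-bit depth and 24 fps are assumed illustrative inputs:

```python
# Raw (uncompressed) data rate implied by pixel count and bit depth.
def raw_rate_gbit_per_s(width, height, bits_per_channel, channels=3, fps=24):
    bits_per_frame = width * height * channels * bits_per_channel
    return bits_per_frame * fps / 1e9

for name, (w, h) in {"2K": (2048, 1080), "4K": (4096, 2160)}.items():
    print(f"{name}: {raw_rate_gbit_per_s(w, h, 12):.1f} Gbit/s uncompressed")
# 2K: ~1.9 Gbit/s, 4K: ~7.6 Gbit/s -- which is why distribution relies
# on compression (JPEG2000 in the DCI specification).
```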
To replicate the particular 'qualities' that, in a modern aesthetic sense, make 'film' look like 'film' as opposed to video is, however, a matter of accurately simulating film's many peculiar 'artifacts', such as jitter, grain, speckles and floating image registration. Strictly speaking, these are not qualities but defects that were never intended in the first place, and manufacturers of film equipment and producers have striven to minimize their effects over many years. It is only recently, since the so-called digital revolution, that mostly young filmmakers, striving to emulate a 'glossy Hollywood' look on video, have come to perceive these defects as positive qualities.
I think, on the other hand, it is highly unlikely that the average public patron will even notice the difference between 2k and 4k digital projection since they have a hard time distinguishing between 16:9 SD and HD video. —the preceding unsigned comment is by 80.58.2.44 (talk • contribs) 11:04, January 1, 2006
Stop Acting as if There is a Bee in your Bonnet!!!
Jesus Christ!!! You digital people are amazing!!! You say that digital will replace film. You talk of this utopia where digital cinema produces better quality images. When we point out flaws in your argument (flaws we can prove), you jump up and down and say we are applying an unfair standard to digital. Right now, film is undeniably a better quality image than digital. All studies prove this. Many DPs have said digital will not have the same look as film. How an audience will respond (and which one they will embrace) is yet to be seen.
The complaints for digital are fair game. Name me one digital film that is on par with Battleship Potemkin or Citizen Kane in cinematic art. Sin City? Revenge of the Sith? Maybe Sharkboy and Lavagirl? If you give me this stupid "you're just being subjective" b#llsh*t, I swear I will break something!!! Film only advanced through criticism. You have to be able to prove digital will be able to create artistic masterpieces like Potemkin or Kane in order for us "film snobs" to take you seriously. Otherwise, shut up!!! —the preceding unsigned comment is by 71.139.61.28 (talk • contribs) 22:53, January 5, 2006 (PST)
- If you really wanted a digital film on par with Citizen Kane, take Citizen Kane's film and use a conversion device to change it to 10000x5405 digital, and there's a digital film. —Last Avenue [talk | contributions] 04:59, 16 February 2006 (UTC)
- The director creates the art, not the camera. Two (identical) great movies, one shot with a DV camera, and one with a film camera, will still be great. Last Avenue 00:42, 17 January 2006 (UTC)
- So it's the talent's fault, not the technology? Are you saying that filmmakers today lack the talent that people like Renoir and Welles had? —the preceding unsigned comment is by User:71.139.43.28 (talk • contribs) 19:22, 18 January 2006 (PST)
- No, I never said today's filmmakers lack talent, nor was I saying today's films aren't as good. If someone thinks today's films suck, it is pretty much the talent's fault, and not modern technology. Last Avenue 02:26, 20 January 2006 (UTC)
- This is where film and digital supporters differ. Film supporters believe digital causes a certain amount of laziness. It used to be you could only do so many takes because it would cost too much film stock and you would spend months editing. So you made the takes count. With digital, you can just keep shooting and shooting and never have to worry about stock supplies or editing. So filmmakers do not pay attention.
- It would hardly cost more in film to go from 10 takes to 20. It's like arguing about wasting money by losing a quarter when you're losing millions elsewhere (financial markets, etc.). If film directors started using 20 takes instead of 10, the cost would go up from around $2M to $3M; the distribution and other costs wouldn't increase. —Last Avenue [talk | contributions] 04:53, 16 February 2006 (UTC)
- The many-takes problem has already occurred in current motion pictures where people edit electronically: 20 takes are made for a shot (up from the usual 10), and scenes are just shot for the editing room, where the film is haphazardly assembled. If they switch to digital tape, it will get worse.
- Also, CGI just never seemed real. I know I am not alone in this. Yet people keep using it instead of using good effects or beautiful scenery or great acting. Not as much work goes into it. Just a bunch of 1s and 0s.
-
- Woot. Filmmakers can simply continue their old ways (ten takes, no CGI, etc.) and the result will still be the same '1s and 0s' as before. A simple conversion of an older film to 8192*3428 (2.39:1) or something similar will still yield '1s and 0s', yet still be the same film. A frame of film can easily be summarized as '1s and 0s' and still look exactly the same. —Last Avenue [talk | contributions] 04:53, 16 February 2006 (UTC)
- A lot was lost in the transition from silent to sound. While it proved beneficial in the long run, it took decades to get the visual style back, and we have never fully recovered the style once so innovative in silents. Also, since black and white has essentially become illegal in filmmaking, a lot has been lost in visual style as well. Any transition to digital would also lose style. None of you digital supporters have given me any reason to believe such a loss would be beneficial in the long run. —The preceding unsigned comment was added by 71.139.51.137 (talk • contribs) 05:33, 20 January 2006 (UTC).
- "Loss of style in the long run?" The tiny costs of extra takes (see above) is hardly preventing directors from taking 20+ times. "A lot has been lost as well in the visual style" due to black/white to color? Why not wear a pair of colorblind glasses? 04:53, 16 February 2006 (UTC)
This article needs a lot of work
This article is currently a mixture of useful information and material that ranges from mistaken to nonsensical. This article desperately needs contributions from experts. -- Karada 22:04, 27 January 2006 (UTC)
- What's worse is that people keep adding stuff about digital cinematography! I just don't have the time to merge it properly there, so I'm not deleting it, but it needs to go! And the above discussion is all about digital cinematography, too, not digital cinema. I don't think digital cinema should be associated with that kind of attitude. —Wikibarista 06:33, 3 February 2006 (UTC)
I made a very controversial statement (and can back it up)
I have made the statement that Ben-Hur's and Lawrence of Arabia's entire production budgets each cost less than the amount of money spent on Superman Returns's CGI effects.
Superman Returns spent $100 million on special effects:
http://www.cinematical.com/2005/10/31/superman-returns-hits-250-million-picks-up-investor/
According to IMDB the production budgets for the major epics are:
Ben-Hur = $15,000,000
Lawrence of Arabia = $15,000,000
Adjusted for inflation (according to http://www.westegg.com/inflation), they are:
Ben-Hur = $95,811,724.46
Lawrence of Arabia = $92,628,161.25
I will be the first to admit there is not much of a difference between $100 million and $95 million. However, this is a tremendous factor when we are talking about JUST THE SPECIAL EFFECTS!!! Also, Superman Returns was shot on digital. Where was the cost effectiveness there? They saved $2 million to spend over $200 million. This is before they spend well into the nine figures promoting and distributing the film. Is this the final word in the argument? No. But it should be considered when stating digital is more cost effective than film. —The preceding unsigned comment was added by 71.139.33.98 (talk • contribs) 04:15, 16 February 2006.
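For reference, the inflation adjustment is just a multiplication by a price-index factor. A minimal sketch, using the factors implied by the figures above (the westegg calculator presumably derives them from published CPI data):

```python
# adjusted = original * (price index now / price index then).
# These multipliers are simply back-calculated from the numbers above
# (e.g. 95,811,724.46 / 15,000,000), not independently sourced CPI data.
BUDGET = 15_000_000
IMPLIED_FACTOR = {
    "Ben-Hur (1959)": 95_811_724.46 / BUDGET,
    "Lawrence of Arabia (1962)": 92_628_161.25 / BUDGET,
}

for film, factor in IMPLIED_FACTOR.items():
    print(f"{film}: ${BUDGET:,} * {factor:.4f} = ${BUDGET * factor:,.2f}")
```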
- How exactly did they spend over $200 million due to digital? Either way, digital didn't cause the producers/director to spend $200M on CGI and whatnot. Either way, there would have been special effects, and the special effects would have been more expensive with film-->digital-->special effects-->film (digital intermediate). —Last Avenue [talk | contributions] 04:26, 16 February 2006 (UTC)
- It's all due to the directors wanting more special effects. It's not the fact that they're switching to digital, it's the fact that there are more big-budget and uber-special-effects movies. —Last Avenue [talk | contributions] 04:27, 16 February 2006 (UTC)
The person who posted the paragraph entitled 'Criticism' appears to be citing facts that have no actual relevance to the debate about the merits of digital filmmaking. He states that the cost of film production has risen incredibly, which is true, but what does that have to do with digital video, which has been a viable format since Spike Lee's 'Bamboozled', released in October 2000, a little over five years ago? Since then, only a few features have been shot on digital video, and all of them would have cost much, much more if shot on PanaVision 35mm or a comparable format. The writer then oddly suggests that because special effects, CGI in particular, are still high cost, any savings from shooting the feature on digital video are somehow negated by the CGI (!?). The same feature would still have those CGI effects on top of the already high costs involved with film. It is an illogical connect-the-dots that makes zero sense!
What really baffles me to no end is when someone stupidly makes a comment like "The complaints for digital are fair game. Name me one digital film that is on par with Battleship Potemkin or Citizen Kane in cinematic art. Sin City? Revenge of the Sith? Maybe Sharkboy and Lavagirl? If you give me this stupid 'you're just being subjective' b#llsh*t, I swear I will break something!!! Film only advanced through criticism. You have to be able to prove digital will be able to create artistic masterpieces like Potemkin or Kane in order for us 'film snobs' to take you seriously. Otherwise, shut up!!!" Ummm! OK??? (scratches head!) You can make any film with digital video! Are you saying 'Citizen Kane' could not have been made on digital video, had the technology been available?? "Name me one digital film on par with 'Citizen Kane'"? HUH??????? What does that have to do with anything at all? Creative minds create using the tools; the tools themselves do not create the film. The camera does not prohibit creativity, nor does it ensure it! This person seems to have a severe mental impediment, and I don't mean that as an insult; his comments seem to lack any real logic. Orson Welles could have made 'Citizen Kane' using HD DV, the script could have been developed in 'Final Draft' and it could have been edited digitally on a G5. It would still be the same film. You are not a 'film snob'; you are an illogical, none-too-bright quasi-Luddite! —The preceding unsigned comment was added by 24.34.179.235 (talk • contribs).
- What you don't seem to realize is that Orson Welles himself said "there has never been a good film made in color." He also denounced widescreen as a "bag of tricks." Now, I personally don't agree with these assumptions. However, he did have a point. Welles was not a big fan of cutting-edge technology. He only switched to color when he had no choice (on his last two films). I bet you can't even name those films off the top of your head. He was trying to make the point that black-and-white Academy film could still achieve greatness, and he made his film in black and white. Mind you, he went to RKO with a contract that gave him unlimited power for "Citizen Kane." On "The Lady from Shanghai," he had a big budget with Rita Hayworth as his star. She was in plenty of Technicolor productions, so if he had asked Harry Cohn (he was the head of Columbia, for all you digital know-it-alls), he could easily have obtained Technicolor. On "Touch of Evil," the studios were switching to color and he probably could have gotten Eastman stocks.
- In all of those cases, did he pursue color? NO!!! He realized black and white is different from color. Color would have changed his films entirely; they would have lacked the look of the older black-and-white format.
- If his films would have been totally different in color, imagine them in digital. There would be no flicker, no images moving through a projector, just a digital rendering of images from a computer program. No creative editing, no mise en scène. It would have been totally different.
- So please, all you digital supporters, SHUT THE F#$K UP!!!!!!!!!!!!!!!!!!!!!! Go back to the computer programming you do so well and stop acting as if you know about the art and craft of cinema!!!!!!!!!!!!! —The preceding unsigned comment was added by 71.139.36.29 (talk • contribs).
Off-the-cuff calculations
Some of the above comments are rather absurd, but also rather lacking in... detail.
Let's do some quick math for fun. Say we're shooting a two-hour feature film with a comfy 8:1 shooting ratio (for every 8 feet of film we expose, 1 foot ends up on screen). That's 3600*2*8 = 57,600 seconds of exposed film, which at 18 inches per second (24fps, 4-perf = 0.75in per frame) comes out to 86,400 feet. A quick check of FotoKem's web site doesn't show a price list, but a little Googling turns up a price list for their services to the USC cinema school [3], which should give us a decent start. $0.10/ft for basic negative, another $0.20/ft for one-light dailies... that's $25,920 to develop all our film and give us a complete set of daily prints to go over in preparation for picking what to edit with.
A little more Googling turns up a company selling film stock [4]; a 1000-foot reel of, say, the 250D costs $705.64, so let's say $60,967.30 to buy all the film negative in the first place.
Even if we give a huge amount of wiggle room on these figures, we're looking at something on the order of $100,000 for all that film. Double it and we're still looking at a measly $200,000. Now compare this to the costs of talent, labor, insurance, location fees, set construction, equipment rental, electrical, post, etc. These things add up, especially with the big names... ten million dollars, thirty million, a hundred million, two hundred million?
Now, if you're a no-budget amateur production, sure, that film cost is big, and since it's consumable you can't borrow it from a buddy for free. Going digital can be a big budget-saver for a small production. It can also be really convenient if you're a big-budget CGI-fest, as you can skip film scanning. But the cost of film over tape doesn't seem likely to cause a significant production to decide to run fewer takes; the cost of labor is going to be a much bigger factor.
Disclaimer: just an ex-film student, not a real producer. :) --Brion 08:58, 7 March 2006 (UTC)
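Here is the same arithmetic as a script, so anyone can plug in their own shooting ratio or prices. The per-foot figures are the ones quoted above from the USC/FotoKem list and the stock vendor:

```python
# Reproduces the footage-and-cost estimate above for a 2-hour feature.
RUNTIME_S = 2 * 3600   # screen time in seconds
SHOOTING_RATIO = 8     # feet exposed per foot that ends up on screen
FT_PER_S = 1.5         # 24 fps * 0.75 in/frame (4-perf 35mm) = 18 in/s

feet = RUNTIME_S * SHOOTING_RATIO * FT_PER_S    # 86,400 ft
develop_and_dailies = feet * (0.10 + 0.20)      # $/ft figures quoted above
stock = (feet / 1000) * 705.64                  # $705.64 per 1000-ft reel

print(f"exposed film:       {feet:,.0f} ft")
print(f"develop + dailies: ${develop_and_dailies:,.2f}")  # $25,920.00
print(f"raw stock:         ${stock:,.2f}")                # $60,967.30
print(f"total:             ${develop_and_dailies + stock:,.2f}")
```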
The reason for the extra takes is not the cost but rather the editing equipment. Before, with either a Moviola or a flatbed, you had to physically cut the film. This took a long time, and people tried to reduce editing time by shooting fewer takes. Now, with digital editing equipment, it takes less time to edit, so people shoot more. An example: the movie Con Air shot over one million feet of film. They then hired 8 editors to cut the movie.
This is a problem the DGA addressed. People no longer have to map everything out. You shoot first and ask questions later.
This is getting ridiculous
I posted some warnings at the top of this talk page, hopefully to curb the abuse. From WP:CIV
—Wikibarista 15:34, 10 March 2006 (UTC)
Quality of Digital Projection versus Film
OK, this is likely going to cause some controversy :D I would, however, like to see some discussion of the quality of digital projection.
Here's what NATO (the National Association of Theatre Owners) says:

C) High Quality Levels Capable of Exceeding both Film and the Home

Digital cinema is capable of achieving quality levels that exceed that of duplicated film, and capable of significantly exceeding that of the home. Image quality is associated with (generally in this order): color space, contrast, resolution. Many have been impressed with the current TI 2K projector. This technology comes close to matching the color space of film, and well exceeds the color space of conventional HDTV. Its contrast is not yet that of film, but has improved significantly over the years. In comparison to the home, it well exceeds that of conventional NTSC television, although this comparison may be less favorable as consumers switch to new high-contrast digital HDTV sets.

Resolution, however, is the one number that commands popular focus, and is easiest to market to consumers. In its studies, the ITU demonstrated that duplicated 35mm film has less resolution than HDTV (HDTV has an image resolution of 1920 x 1080 pixels). However, to differentiate the lower range of digital cinema from the consumer image format, DCI has specified a low-end “2K” resolution maximizing at 2048 vertical lines by 1080 horizontal lines. An “upper-end” 4K resolution is also specified, maximizing at 4096 vertical lines by 2160 horizontal lines. While 2048x1080 is only slightly larger than the consumer HD 1920x1080 format, the 4K 4096x2160 format offers 4 times the number of pixels found in consumer HDTV.

While 4K is the goal, the technology today is only proven for 2K. Sony is demonstrating a 4K projector at trade shows, but the demonstration has yet to match the color space or contrast of the TI projector, making the Sony projector an under-performer visually. The Sony projector has yet to be tested in a busy, metropolitan cinema that operates many shows daily. Even if a 4K projector were available, it would need a 4K server. As of this writing, no vendor has a 4K server on the market, or even in demonstration. Sony, notably, uses 4 servers to drive one projector in its trade show demonstrations. 4K technology is likely to be many years away from achieving theatre-level performance and operation.

To insure single-inventory content in a 2K / 4K world, DCI specified a standard compression technology capable of handling both sets of image resolutions in one data file. Using JPEG2000, a compliant server can play a 4K image to a 2K projector by extracting the 2K version of the image from the 4K image file. The specification of JPEG2000 and the specific application of it for single-inventory content distribution was a significant milestone in the DCI process.
- Tying it in with economics: a big part of the digital cinema specifications is ensuring that quality stays above that of home theatre, which is a competing market. Hence, instead of adopting the 1920x1080 frame size of HD, d-cinema is going with 2K.
Glennchan 01:14, 7 July 2006 (UTC)
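Since the resolution claims in the excerpt are easy to check, here is the pixel arithmetic on the figures it quotes:

```python
# Pixel counts for the formats named in the NATO excerpt above.
FORMATS = {
    "consumer HD": (1920, 1080),
    "DCI 2K":      (2048, 1080),
    "DCI 4K":      (4096, 2160),
}

hd = 1920 * 1080
for name, (w, h) in FORMATS.items():
    px = w * h
    print(f"{name}: {px:>9,} px ({px / hd:.2f}x consumer HD)")
# DCI 4K is exactly 4x DCI 2K and about 4.27x consumer HD; the
# "4 times" in the excerpt rounds against the HD baseline.
```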
- "As of this writing, no vendor has a 4K server on the market, or even in demonstration."
- Not true. QuVIS demonstrated its Digital Cinema System 4K at NAB in April, and at IBC in September. QuVIS offers a 4K JPEG2000 mastering system, 4K servers, and 2K servers that can do a realtime extraction from 4K material. 4K is now only hampered by the shortage of projectors, and that will change soon.
- PyroGuy 03:48, 26 September 2006 (UTC)
Digital Cinema Venues
Since the digital revolution has started, is it appropriate to start a category of theaters using digital projection technologies?