Last updated: January 2005
'The future has the bad habit of becoming history all too soon'
(Quoting myself :)
The problem with writing about the future is partly explained in my quote: what I call the future today will be history tomorrow. And when it comes to texts on a website, this is especially true. A text is easily forgotten for a couple of years, and suddenly you realize that half of what you speculated about has been proven wrong by history.
Nevertheless, here's what I'll do: I will divide my text into the 'near' and 'far' future. When the near future becomes history, I will make the necessary changes.
Past, present and near future: 2000-2005....
Prologue
Regardless of whether a person is religious or not, or of which religion one follows, the New Year is a good way to round things off in society or in one's life in general. When we learn history, it helps to divide it into centuries or decades. This very document is divided into decades, and that's how we relate to history: the 60's, the 90's and so on. As such, the shift to a new millennium holds a significant importance. It divides history!
Everything that happens after December 31st, 1999 belongs to the next decade, century and millennium. When the day comes that people talk about the 20th and even the 21st century in the past tense, they will tell you that the American Civil War was in the 19th century, the Pentium III was released in the 20th and the Voodoo 4 in the 21st.
So what comes in the near future?
Most of you who follow the graphics industry have a pretty good idea about what to expect in the coming years. On this page I report what is coming, and when the future becomes history I update the predictions with facts. Here's the story so far.
The year 2000 really was 'the year of nVidia'. In December, nVidia acquired the core assets of the once mighty 3DFX. This was a good reminder to all of us of how quickly things can change in the industry. ATI are still going strong and Matrox has announced new products, but overall, nVidia has become the clear and undisputed 'standard' for home computing. There are still some other manufacturers that compete on the professional market and still give nVidia a good match, but that too is probably only a temporary glitch for what now seems to be the 'unstoppable' nVidia.
But let's not forget how unstoppable 3DFX seemed only a few years ago...
2001 saw a continuation of nVidia's dominance of the computer graphics market with an occasional competing product from ATI.
Nintendo released the Gamecube in September 2001 (Japan). Click here for the specs.
The Gameboy Advance was released in the first half of 2001. The big event of 2001 was probably Microsoft's Xbox console. With an nVidia-developed graphics chip, a hard drive, a fast Intel CPU & more, it's designed to kick ass! Its main competitors will be the Playstation 2 and the Nintendo Gamecube. The once so influential SEGA has given up its hardware business and will now concentrate on software. The company's new aim is to become one of the major players in the software biz.
The movie scene had its share of limit-pushing movies, including Final Fantasy: The Spirits Within, maybe the first real attempt to create realistic humans in a completely computer-generated motion picture, while Pixar's Monsters Inc featured some pretty convincing fur.
Jurassic Park 3 did it again, of course, with dinosaurs so real that even a graphics artist can sit down and enjoy the movie without thinking about the special effects. The movie A.I. featured extremely well-produced special effects, but they were simply evolutionary works based on the same techniques created for the landmark movie Terminator 2. (Interestingly, it was the same crew, Dennis Muren/Stan Winston, that worked on the FX.) The biggest movie of the year award goes to Lord of the Rings, featuring some very ambitious scenes.
Those of you who watch the television series Star Trek will no doubt have asked yourselves why all the alien races look like humans with some minor cosmetic changes such as a different nose or some crap glued to the forehead. The answer is of course cost! Star Trek: Voyager actually features a race known as Species 8472, which is computer generated. However, the screening time of that species is sparse to say the least. Thanks to the lower prices of special effects, who knows, the latest Star Trek series, Enterprise, may feature lots more CG aliens, assuming it lives for the standard 7 seasons. (It didn't, ED note)
Q1 2002 saw the release of nVidia's next-gen GPU, the nv25 chip (GeForce 4 Ti). This is the chip that will make Xbox users understand how fast graphics technology is moving forward. A top-of-the-line PC in the first half of 2002 is already many times more powerful than the Xbox (but of course also more expensive).
ATI released the R300 chip (the R200 successor) in July: a powerful DirectX 9.0 chip that will hold the performance crown at least until nVidia releases its nv30 chip. So you can be sure that as soon as this Christmas, the consoles will be clearly inferior to a decent PC. Because of this, Sony are already releasing details about the Playstation 3 and Microsoft are already working on the Xbox 2.
Speaking of ATI, they were responsible for leaking an early version of id Software's Doom III, the brainchild of programming legend John Carmack. The leaked version spread around the world like wildfire, and people quickly realized two things. First of all, the game looked incredible; the atmosphere was more movie-like than in any other game in history. And secondly, they realized that this game was going to force a whole lot of hardware upgrades among its would-be players. This game was clearly going to need faster GFX chips than were available at the time.
On the movie scene, Star Wars: Episode 2 displayed a dazzling amount of incredible CGI shots. They weren't doing things that had never been done before, but they were perfecting what was seen in Episode 1... The visuals aren't perfect yet, but most of the time it's difficult to imagine if & how they could be improved. Perhaps one of the greatest advances was made in cloth simulation. Robert Bridson, Ronald Fedkiw (Stanford University) & John Anderson (Industrial Light and Magic) presented a paper on 'perfect' cloth simulation at SIGGRAPH 2002. In many scenes in SW2, the actors were actually replaced by digital stunt doubles, and all the clothing needed to be simulated perfectly to fool the eye.
At the end of 2002, fans of Lord of the Rings in particular and CGI fans in general had the opportunity to see just how far CGI has come. The Two Towers features a computer-generated main character (Gollum) which, while not 100% convincing, looks pretty damn photo-realistic anyway. The motion was captured from a live actor, and the interaction between the CG character and the physical environment was among the best we've seen so far.
At the end of 2002, everyone was waiting for nVidia to launch their latest graphics chip (GeForce FX, alias nv30). While it was announced earlier, actual shipments didn't start until January 2003, and even then it was very rare. Pretty soon it became clear that the chip wasn't exactly what people were hoping for. Even nVidia realized that and immediately started work on the slightly modified successor (nv35), which they finally announced in May 2003. As always, ATI were there to match their product line quite nicely. Competition is usually good for the customer, but I must say that 2003 also showed exactly what is wrong with this situation. By year's end, the graphics card market was absolutely flooded with different models released by nVidia & ATI. For someone not very familiar with the market, it's next to impossible to make out exactly which model may be best for them.
Still, graphics chips are rather useless unless there is some good software that puts them to good use. For a long while, 2003 seemed to be one of the most exciting years in a very long time for gamers and movie fans alike. Ultimately there were release postponements, so the year ended in disappointment, but let's take a look at what happened.
The E3 show was the main event where all the big game titles were revealed. Doom III was shown again, but the game everyone was talking about was definitely Half-Life 2, the sequel to the immensely popular Half-Life released in 1998. Carmack himself has said that gaming has reached a 'golden point' graphics-wise, because it's possible to do pretty much anything the artist can come up with. The characters in these games look extremely lifelike compared to previous generations of games, and they certainly set a new standard for computer game developers everywhere. Another thing that impressed the HL2 audiences was the incredibly sophisticated physics simulation within the game (physics engine provided by Havok). Add some very advanced AI to that, and you soon realize that HL2 offers a new kind of gameplay compared to older generations of games. (Doom III will be a similarly realistic experience.) As I mentioned, it turned out that neither Doom III nor HL2 was released in 2003, so they are now officially 2004 releases.
Even though the postponed releases disappointed the gaming community, 2003 was a quite extraordinary movie year. Like in the game biz, a lot of much-anticipated sequels were launched during 2003. X-Men 2 offered pretty much 'standard' special FX (and some lovely non-CG footage of Mystique ;-). Matrix 2 again managed to shock audiences with incredible and unique special effects that made everyone go 'how the hell did they do that??'. Terminator 3 was another blockbuster sequel, and considering that T2 was such an important landmark in movie production, it had much to live up to. At the end of the day, the effects in T3 were very nice and polished but in truth not revolutionary at all. Matrix Revolutions featured tons of special effects (in fact too many for some), but at this point we are being so spoiled that we hardly raise our eyebrows, although admittedly the quality of the effects was stunning. Certainly the big 2003 finale was the release of the last LOTR movie, The Return of the King. Most if not all special effects were almost perfect, but the most impressive thing was possibly the seamless blend of real and CGI footage. The visualization of the attack on the White City was quite remarkable even by today's standards, and I can honestly say that I have not seen anything quite like it before. One thing is certain: all the good stuff in 2003 will spoil the audience to a degree that it's going to be pretty much impossible to impress them in the future... Ah well, there's always Star Wars Episode 3 in 2005. They have their work cut out for them, that's for damn sure.
2004 was a great year for graphics in computer games. Many of the titles that were expected in 2003 actually shipped in 2004. And just as many of us knew, a couple of games in particular raised the bar for graphical quality we expect from video games.
The first positive surprise of the year was Far Cry, pretty much the first game to utilize next-generation graphics and make use of the latest advancements such as DirectX 9.0 shaders. The second big title was the eagerly anticipated Doom 3, the sequel to the legendary and revolutionary Doom series. Although the game itself might have left one or two players disappointed, no one could deny that the graphics were nothing short of brilliant, making use of dynamic lighting, shadows and very moody surround sound. It truly was more of an interactive horror movie than just a game. Then, towards the end of the year, possibly the most anticipated game of all time finally arrived: Half-Life 2. Having been in development for some 6 years, people were starting to wonder if it could ever live up to the hype, but luckily the answer was YES! Apart from the incredibly realistic graphics, the game also added a whole new dimension of gameplay through its cleverly implemented physics engine.
All in all, 2004 will be remembered by gamers as the year when computer graphics took a giant leap forward. All new games will inevitably be compared to the above-mentioned milestones. That is good news for the gamers and many sleepless nights for the game developers.
These new landmark titles of course demanded pretty fancy hardware to run as they were supposed to, causing many gamers (including me) to upgrade. nVidia were struggling with their FX (GF5) line, allowing ATI to gain market share. However, in 2004 nVidia made a glorious comeback with their nv40 hardware. Full Pixel Shader 3.0 support and a massively parallel architecture were the medicine that cured the FX plague. ATI of course released their own products (R420), but the gap from the previous generation of hardware was gone. As it turned out, nVidia and ATI signed special deals with the makers of Doom 3 and Half-Life 2 respectively, making sure the games would run optimally on their hardware. At the end of the day, top-of-the-line models from both makers were more than adequate to play all games perfectly well.
As 2004 introduced a new level of graphics quality in games, it is understandable that the level will stay there for the first year or so, as other developers release games based on licensed Doom 3 and HL2 engines. It takes a long time to write a revolutionary new 3D engine, and the only new engine on the horizon right now is the Unreal 3 engine, expected to arrive in 2006.
One more thing that deserves to be mentioned is the development of graphics power in handheld devices. Both nVidia and ATI now offer 3D graphics chips for mobile phones and PDAs. This is probably where graphics development will be most noticeable for the next few years. In late 2004, Sony also took a step into Nintendo-dominated territory by releasing its PSP console: a handheld device with a fairly high-resolution widescreen display and roughly the same graphical power as the Playstation 2. At the time of writing, the device is yet to be released outside Japan, so time will tell if it is a success or not.
As far as movies go, there was a slight feeling of anticlimax after the amazing movie year 2003. The Terminator, Matrix and Lord of the Rings trilogies seem to be concluded. Special effects have matured to a point where people barely care about them anymore; it seems as if it has all been done already. The purely computer-animated movies continued to make it big at the box office; Shrek 2 and Pixar's The Incredibles are two examples. Perhaps the most noticeable thing is that the development time for a computer-animated feature film has been cut down drastically. It used to take Pixar 3 or more years to complete such a movie, but now they seem to release a new one every year.
The next generation of video consoles, including the Playstation 3 and the successor to Nintendo's GameCube (probably going to be called Revolution), will be released in 2006. Microsoft, however, decided to steal this Christmas by releasing its successor to the Xbox (called the Xbox 360) this year. And not only did Microsoft have the only next-gen console this Xmas, they also decided on a simultaneous worldwide release rather than the classic sequential release schedule, which often leaves the Europeans waiting for a year or so (well, it actually was sequential, but there was only a week between the US and the Euro releases). Personally, I think the Playstation 3 hardware could have been rushed out in 2005, but probably not the software. You don't wanna launch your console with a meager library of buggy software. Most Xbox 360 launch titles are glorified current-generation PC games. It will take a couple of years before we really see games utilizing the full potential of these new super-consoles. To be honest, the only E3 demo that really impressed me was the (albeit pre-rendered) PS3 demo 'Killzone'. If that's what the games will look like, then sign me up for a PS3 ;-). If the new Xbox 360 is anything to go by, the games will look better, but play-wise they'll be no different from previous generations of games/consoles.
Needless to say, nVidia and ATI continue to battle each other. The really interesting battle has already been fought. As it turns out, the Playstation 3 is powered by nVidia graphics, which in combination with the Cell processor delivers a remarkable 2 TFLOPS of computational power. The other two consoles, the Xbox 360 and the Revolution, are both powered by GPUs from ATI and CPUs from IBM. However, rumour has it Nintendo will aim for a cheaper console, so even though it will be released after the Xbox 360 it may not be as powerful. On the other hand, Nintendo's next-gen offering is the only console that includes a dedicated physics chip. Nintendo also chose an untraditional controller for its console. Time will tell if that was a wise choice, but considering how little the gameplay experience has changed from the old Xbox to the new one, maybe it's time to start a Revolution (which is also the name of Nintendo's console).
The work done on consoles usually translates into PC products in the shape of new graphics chips. This time around, that particular transition went very fast, and nVidia already offers a PC solution that is far more powerful than even the yet-unreleased PS3 (2x SLI GF7800GTX 512). During 2005, both nVidia and ATI also brought forth solutions for combining multiple graphics cards to increase rendering performance. nVidia calls its solution SLI, a tribute to the technology developed by 3Dfx in the late 1990's (also called SLI back then, but the acronym now means something else: ScanLine Interleave vs Scalable Link Interface). ATI calls its multi-rendering solution Crossfire, and fancy names aside, this technology suddenly changes the rules a bit (again). It means that if you buy the fastest graphics card on the market, you can still get better performance by buying another one and running them in SLI or Crossfire. This is of course mostly for enthusiasts and hardcore gamers, but the point is that the next generation of graphics cards doesn't look so spectacular if it's only twice as fast as the old one, as it then only matches the old generation in SLI mode. It may be cheaper, but for enthusiasts, performance and status matter more than economy.
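The old 3dfx meaning of SLI is easy to illustrate: two chips split every frame by alternating scanlines, so each renders half the rows. Here is a minimal sketch of that row assignment; the function name and numbers are my own illustration, not how any real driver exposes it.

```python
# 3dfx-style ScanLine Interleave: with n GPUs, GPU g renders rows
# g, g+n, g+2n, ... of each frame. With perfect load balance each
# chip does 1/n of the per-frame work, which is where the (ideal)
# n-times speedup comes from.

def scanline_interleave(height, n_gpus):
    """Assign each screen row to a GPU, round-robin."""
    return {gpu: list(range(gpu, height, n_gpus)) for gpu in range(n_gpus)}

rows = scanline_interleave(height=8, n_gpus=2)
# GPU 0 gets the even rows [0, 2, 4, 6], GPU 1 the odd rows [1, 3, 5, 7]
```

Modern SLI/Crossfire setups typically split the work differently (whole alternating frames, or top/bottom halves of the screen), because today's rendering pipelines share a lot of per-frame state that makes row-by-row interleaving impractical, but the principle of dividing each frame's workload is the same.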
And guess what: these days the hardware is being developed much faster than the software. You can of course run out and buy two nVidia G70 or ATI R520 based cards, but there aren't really any games that will stress them at the moment. We will have to wait for Unreal 3 based games to see what these new chips can do. However, people who work with professional graphics will of course welcome the extra performance. 2005 may turn out to be another exciting year for computer graphics, but personally I'd still like to see better 3D graphics in handheld devices such as mobile phones. 2005 will probably be remembered as the year when the graphics companies went crazy and released more graphics power than 99.9% of the user base needs (it may be useful in workstations, but game development lags behind this crazy tempo of new releases).
The biggest movie of 2005 was probably Star Wars Episode III. It might be the very last movie in the Star Wars saga, so I'm guessing that Lucas wanted this one to be absolutely spectacular. As always, the amount of digital effects was staggering, and you can tell that the people working on the special effects have become better and better with each movie. For example, the computer-generated Yoda was very well done, and even when the camera zoomed in on the eyes he still looked very convincing. Perfection is not far away, although this is more true for some scenes than others. Another cool technique used frequently in Episode III was digital face mapping: all the actors were scanned, and the stunt doubles then had the faces of the actors they doubled for digitally mapped onto their own.
One other movie that relied heavily on special effects was Peter Jackson's King Kong. While the quality varied somewhat, the main character (King Kong, obviously) was absolutely spectacular. Although still not perfect, I think this is the highest-quality computer-generated main character to appear in a film so far. People these days seem to take perfect special effects for granted, but it is of course a major misconception that great visual effects 'make themselves' at the touch of a button. That kind of quality takes a lot of very talented people an enormous amount of time to get right.
The Future: The years ahead...
Photo-realism to the people!
'Difficult to see.. always in motion the future is..' (Yoda)
It is obvious that we are heading towards photo-realism. In the early 90's there were no true 3D games at all (Wolfenstein and Doom don't qualify as true 3D games), and 1999 saw the release of Quake 3, a game that featured all the latest tricks of the industry at the time. Some 5 years later, Doom 3 made its appearance, and yet again the distance to the previous generation is extremely noticeable, to say the least.
The path that the gaming industry has chosen will eventually bring it to the movie industry. In the future, artists will be using the same 3D models in games and in movies. As movie footage doesn't have to be rendered in real time, movies will always be able to do more complex things, but there will be a certain point in the future when the border between real-time and pre-rendered graphics is so blurry that only industry experts will be able to tell them apart.
The next big leap in visual quality will come with the introduction of the next generation consoles. I'm guessing that certain types of games such as car/racing games will look so realistic that it will fool more than one person into thinking that they are watching actual TV footage. The industry is learning more and more tricks about making things look realistic.
It's noteworthy, though, that the focus might not just be on pushing more and more polygons around the screen. According to nVidia's David Kirk, the GFX R&D teams might focus on solving age-old problems such as how to get real-time ray tracing and radiosity. The thing is, again according to Kirk, these things aren't far away at all: come 2006/7 we might be doing radiosity rendering in real time. When that happens, there will be another huge leap towards photo-realism in computer games. (Until now, most games have used pre-calculated radiosity-rendered shadow-maps applied as secondary textures.)
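That last trick deserves a tiny sketch. The expensive global-illumination solve is done offline and baked into a texture; at runtime, each texel of the surface texture is simply multiplied by the baked light value. This is only an illustration of the principle (real engines do this per-pixel on the GPU with multitexturing); the function name and values are made up.

```python
# Precomputed radiosity as a "secondary texture": all the costly light
# bouncing is solved at level-build time and stored as an intensity
# per texel. At runtime, shading is just a multiply per colour channel.

def apply_lightmap(base_rgb, baked_light):
    """Runtime shading: base colour texel * precomputed light intensity (0..1)."""
    return tuple(min(1.0, c * baked_light) for c in base_rgb)

# A red wall texel sitting in a half-lit corner:
pixel = apply_lightmap((1.0, 0.2, 0.2), baked_light=0.5)
# -> approximately (0.5, 0.1, 0.1)
```

The catch, of course, is that baked lighting is frozen: move a wall or a lamp and the lightmap is wrong, which is exactly why real-time radiosity would be such a leap.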
It's interesting to look back for a while at the rapid development in consumer graphics. As late as 1997, SGI's multimillion-dollar 'Onyx 2, Infinite Reality - Dual rack' (consuming some 14,000 watts of power) was among the best the graphics biz had to offer. In 2005, a GeForce 7800GTX graphics card can be bought for $300, and GFX performance-wise it completely outclasses the Onyx 2 setup.
So what about the movies?
The 90's saw many milestones in cinematographic special effects. This decade will be no different. The milestones will not be the same, but there will be new ones. What lies ahead can be summarized in three words: faster, better, cheaper! Long gone are the times when ILM was the only company that could deliver state-of-the-art special effects. Nowadays competition is fierce, and that results in those three words I mentioned above...
The times when people invented revolutionary new algorithms to render photo-realistic scenes are more or less behind us. Photo-realism IS possible today; what still prevents us from creating e.g. perfect humans lies in the creation process, not the rendering. The software used to model, animate and texture objects is still too clumsy. It's still nearly impossible for an artist to model and texture e.g. human skin that would pass as photo-realistic from close range. Other, less complicated objects, such as spaceships, can already be rendered photo-realistically. Be sure, though, that the first decade of the new millennium will see photo-realistic humans in a movie. After seeing Star Wars Episode III and the new King Kong, I'm confident that photo-realistic humans aren't far off. 100% realistically rendered humans may be the final milestone to achieve before computer graphics really can replace anything. (That doesn't mean it will be practical to do so, but Gollum in LOTR certainly made some real actors look mediocre.) Look forward to an amazing (remainder of the) decade of CGI!
All the brands and products mentioned in this article are copyrighted by their respective owners.