Beyond any doubt, the most interesting product for our readers at Comdex Scandinavia 2003 was the GeForce FX.
nVidia had some space in Network Technical’s booth, where visitors could view the Dawn demo in all its glory. Other than that things were quiet, so we decided to ask the nVidia representatives a couple of questions.
Without further delay, here’s our interview with nVidia Europe.
[nV]: There are quite a few manufacturers involved with the launch of the GeForce FX, but as far as exclusive deals go, that’s just the rumour mill. In Europe our main partners in the retail channel are Creative, Terratec and PNY, and there are others that will be added to that list. We of course have close contact with our other partners who are strong in other segments, but as the GeForce FX will be launched in retail first, this is where our primary focus will be.
[nV]: We’re looking at a ballpark figure of around 600 Euros; however, final pricing will be determined by the partners. The high end only makes up 2% of our sold products, and at launch the GeForce FX will only be available in limited numbers. For those who want the latest and greatest, the GeForce FX will be the product to go for. GeForce FX should pop up at retailers across Europe by the end of February.

[NH]: There have been some rumours about core clocks at 600 MHz and beyond being tested at nVidia. Is this just testing of a new product, or is there any chance of a last-minute update of the “current” GeForce FX?
[nV]: You should view the GeForce FX as part of the NV3x family, with the GeForce FX on top as the flagship model and other offspring below it to satisfy the wants and needs of different segments. The clock speeds of the FX will, as announced earlier, be 500/1000 MHz. As for launch dates regarding the upcoming members of the NV3x line, I cannot say more than that they can be expected around March or so.

[NH]: We’ve recently seen manufacturers like Hercules releasing products that are “water cooling ready”. Are you currently planning any similar extreme cooling solutions?
[nV]: When it comes to cooling, that’s entirely up to our third-party manufacturers. We give them a reference design that can be improved upon in many ways.
[nV]: There are several ways we can do this, and right now we have not finalized it; however, the web is full of speculation on this right now.

[NH]: What types of output and input connections will you be able to offer on the GeForce FX?
[nV]: Again, this is up to the preferences of the third-party manufacturers. One of the exciting new features of the GeForce FX in this regard is the inclusion of our own TV-out chip, which will ensure higher quality, better compatibility and increased flexibility.
[NH]: There’s been a lot of talk about “Cinematic Rendering”. When do you expect games in general to expose this kind of graphical quality?
[nV]: Since we launched Cg, C for graphics, we’ve made life much easier for game developers working with Pixel and Vertex Shaders. With Cg, developers can easily program shaders without assembly and then compile their code for OpenGL, DirectX and Xbox at once. Cg also works on all hardware platforms, so this isn’t an nVidia-exclusive “feature”, even though it’s optimized for our products. Thanks to Cg, I think we’ll see games with cinematic quality much sooner than anyone thinks today.
[nV]: It’s entirely up to them to write their own profiles for our compiler.

[NH]: That ends our questions about the GeForce FX. When do you expect that we can have one in our test lab?
[nV]: We’re doing our best to get as many as possible for you guys to play with – I’m sure we can work something out 🙂

[NH]: Let’s talk about your current products. You recently released the Ti4800SE and Ti4800; don’t you think these cards will confuse consumers? After all, in the American market you decided to give them the more “proper” names Ti4400 with AGP 8x and Ti4600 with AGP 8x.
[nV]: The new names were a response to what our partners requested. They will carry the same names in both Europe and the USA.

[NH]: Don’t you agree that especially the name Ti4800SE adds a lot of confusion? We recently saw a reseller returning a Ti4800SE instead of a Ti4600, for example.
[nV]: That certainly is an unwanted situation, though to our knowledge this is an isolated case. If you discover more cases like this, we hope that you contact us.

[NH]: One question that lots of gamers are asking themselves is: how well will Doom 3 run on my Ti4600? What can owners of current video cards expect?
[nV]: It will be possible to run the game, but for the best end-user experience, the GeForce FX will be the optimal product. John Carmack has developed this new engine with the FX in mind.

[NH]: If we look at driver updates, what type of performance increases can we expect from the current nVidia products? Have the drivers already reached the point of maximum theoretical performance, or is there still a bit more to squeeze out of the GeForce 4 Ti-series?
[nV]: nVidia will keep building on our Unified Driver Architecture, which means that all nVidia products across the board (from TNT and up) will receive substantial performance improvements in the future too. You will see the greatest increases when we release new series (such as Detonator 3, XP and 40). We test new drivers on a daily basis, optimized for different applications and features, and when we release a new driver we incorporate all those optimizations into our latest driver.

[NH]: According to recent reports, nVidia’s Personal Cinema 2 will be launched based on the GeForce 4 MX 440-series. Is there any possibility of the Ti or FX-series being equipped with Personal Cinema 2?
[nV]: We haven’t launched Personal Cinema 2 in Europe yet, so right now Personal Cinema 2 is an exclusive product for the American market. Currently Personal Cinema 2 is based around the MX-series, that’s correct. Right now we can’t comment on this further.

[NH]: When will the GeForce 4 Go 4200 be available?
[nV]: We’re still discussing this matter with manufacturers so right now we can’t answer that question.
[NH]: Now that nVidia has successfully entered the motherboard market, are there any plans of manufacturing chipsets for the Intel platform?
[nV]: At the moment we are still very new to the motherboard market, and we felt it important to focus on the AMD market to begin with, as this is how we can make the most impact in the shortest space of time. According to reviews, nForce2 is the fastest Socket A solution for AMD processors, outclassing some of the incumbent players in this field, so I think we’ve made great progress. As for motherboards for other markets, anything is possible, but we still have some work to do before this is feasible.
[nV]: There’s no doubt that the 9700 Pro is a fast video card. However, ATi will not stay on the throne for very long. We are confident that the GeForce FX will bring the performance leadership crown back to nVidia. Our competitors will all face the same challenges we have faced with the move to 0.13 micron technology*, so we feel we have made the right choice making this move now.
*(From what we at NordicHardware have gathered, the successor to the 9700 Pro, the R350, will be built on 0.15 micron technology.)

[NH]: One last thing; are the Maximum PC benchmarks authentic?
[nV]: Yes, they are authentic, although they are based on a board and drivers that are far from final and thus do not represent the performance of the retail product.

[NH]: We would like to thank you for answering our questions!
[nV]: No problem.
Note that this interview is based on our notes from Comdex, so it is not a word-for-word representation of what was said, though we have of course not altered any content, just re-phrased it. A big thank you goes out to Andrew Humber and Adam Foat of nVidia Europe.
We had some questions that nVidia weren’t able to answer on such short notice, but they asked us to e-mail them, so hopefully we’ll be able to update this interview within a few days.
Update: We have now received answers to some of the last questions we asked nVidia, so here goes:
[NH]: There were some rumors about the inclusion of a separate geometry processor on the GeForce FX (much like the one 3Dfx was planning before their demise). Are the rumors valid? And if so, what happened to it?

[nV]: Sorry, but I’m afraid we cannot comment on rumours.

[NH]: What is the maximum level of anisotropic filtering on the GeForce FX?
[nV]: The GeForce FX family has a special adaptive texture filtering mode that the end user can turn on or off using sliders in the control panel. Conservative settings will perform the traditional texture filtering that the application or the user requests. Moving the slider to the Aggressive side will engage the adaptive texture filtering algorithms that can deliver higher performance by analyzing the texture data and making intelligent trade-offs to increase performance while maintaining high image quality.*
[NH]: Is it true that the “FX” in GeForce FX is sort of a “tribute” to 3Dfx?
[nV]: We chose FX for two major reasons. Firstly, GeForce FX is all about rendering cinematic-quality special effects in real time on your PC. It is our goal to continue bridging the gap between what we see in digital Hollywood and what can be done in real time. Secondly, this is the first GPU that the 3Dfx engineers who came to work for NVIDIA were involved with, so we thought it was appropriate to acknowledge the significant contribution they made to getting this product off the ground.
[nV]: As has been announced with the first reviews – there will be a 5800 Ultra – we have not disclosed any other names as yet.
* NH was actually referring to the level, as in “4x” or “128-tap”. After reading the review/preview we see that it’s 8x/64-tap.