August 22, 2005
Futuremark has released a new patch for 3DMark03. The purpose of the patch is to make sure all graphics cards are measured on equal terms. To find out whether the graphics card manufacturers have behaved themselves, we test the new patch with one graphics card from ATi and one from nVidia.
So far we have only tested the Radeon 9800 Pro with the new patch; results from the GeForce FX 5950 Ultra will follow shortly.
Read the article here.
November 15, 2002
Marquzz wrote: But what exactly is the reason? Would you mind explaining in a bit more detail why the fps ends up lower with the new patch, UndaC? That would be great 🙂
November 29, 2001
Marquzz wrote: But what exactly is the reason? Would you mind explaining in a bit more detail why the fps ends up lower with the new patch, UndaC? That would be great 🙂
"Parts of the program code have been changed so that possible 3DMark03 specific optimizations in current drivers will not work."
I don't know any more than that.
November 29, 2001
boogey wrote: Did you use different processors in the two tests...???
No, nVidia performs better. Futuremark sent me this explanation:
Futuremark wrote: 3DMark03 is highly dependent on graphics card vertex and pixel shader performance. The CPU speed influences less the total 3DMark score, and therefore there is a separate CPU test. The CPU test measures above all DirectX software vertex shader performance. The result is somewhat influenced by the graphics card's pixel shader performance. Therefore a CPU performance comparison should be run using the same graphics card, graphics drivers and operating system on all systems compared.
So you should not compare the CPU score of two different graphics cards.
Always compare CPU scores with the same graphics card, this way the measurement reflects better the CPU and memory speed. Only a CPU test with an all black screen would not be affected by the graphics card.
Let's also look at the details about that test:
- Game Tests 1 and 3.
- Forced software vertex shaders, because software vertex shader speed is what this test is designed to measure.
- 640 x 480 resolution, in order to minimize the fill rate influence in the test result. If we would have selected an even smaller resolution, it might have been hard to see if everything really is rendered. Also, many TFT displays can't display smaller resolutions.
- Game Test 3 has the shadows disabled, because the dynamic stencil shadows are rather dependent on the fill rate of the graphics hardware, and that is an aspect we want to minimize in this test.
- Game Test 3 has also the 1.4 pixel shader code path disabled, because otherwise graphics hardware with pixel shader 1.4 support would get a higher score, and we do not want to credit that feature in this test.
So the difference between game tests 1 and 3 vs CPU test 1 and 2 are that:
- CPU tests do not stress the graphics card's vertex shader speed.
- The lower resolution decreases the influence of the fill rate on the score.
- The lack of shadows in CPU test 2 further decreases the influence of fill rate.
- Only pixel shaders 1.1 are used in CPU test 2.

So under these circumstances Nvidia obviously is faster.
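To make that configuration concrete, here is a minimal Direct3D 9 sketch of the two key settings Futuremark describes for the CPU tests: forced software vertex processing and a 640x480 back buffer. This is not 3DMark03's actual code; the function name and setup details are illustrative assumptions.

```cpp
// Minimal sketch, assuming a valid window handle and the DirectX 9 headers.
// Not 3DMark03 source code; it only illustrates the two settings described above.
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

IDirect3DDevice9* CreateCpuTestDevice(HWND hwnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return NULL;

    D3DPRESENT_PARAMETERS pp = {0};
    pp.BackBufferWidth  = 640;                  // low resolution minimizes fill-rate influence
    pp.BackBufferHeight = 480;
    pp.BackBufferFormat = D3DFMT_X8R8G8B8;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.Windowed         = TRUE;
    pp.hDeviceWindow    = hwnd;

    // D3DCREATE_SOFTWARE_VERTEXPROCESSING runs all vertex shader work on the CPU,
    // which is exactly what the CPU test is designed to measure.
    IDirect3DDevice9* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &device);
    d3d->Release();
    return device;
}
```

With vertex processing forced onto the CPU and very little pixel work, the score mostly tracks CPU and memory speed, while the graphics card's pixel shader performance still has some influence, as Futuremark notes.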
August 7, 2003
UndaC wrote: No, nVidia performs better.
Better?
Now the cheating accusations are flying, faster than ever at any rate :bgrin: up to 25 Megafuxels/second! 😮
Hope it's okay that I paste this in:
Nvidia wrote: With the introduction of the GeForce FX - we built a sophisticated real-time compiler called the Unified Compiler technology. This compiler does real-time optimizations of code in applications to take full advantage of the GeForce FX architecture.
Game developers LOVE this - they work with us to make sure their code is written in a way to fully exploit the compiler.
The end result - a better user experience.
One of the questions we always get is what does this compiler do? The unified compiler does things like instruction reordering and register allocation. The unified compiler is carefully architected so as to maintain perfect image quality while significantly increasing performance. The unified compiler is a collection of techniques that are not specific to any particular application but expose the full power of GeForce FX. These techniques are applied with a fingerprinting mechanism which evaluates shaders and, in some cases, substitutes hand-tuned shaders, but increasingly generates optimal code in real-time.
Futuremark does not consider their application a "game". They consider it a "synthetic benchmark". The problem is that the primary use of 3DMark03 is as a proxy for game play. A website or magazine will run it as a general predictor of graphics application performance. So it is vital that the benchmark reflect the true relative performance of our GPUs versus competitors.
And, while they admit that our unified compiler is behaving exactly the way it behaves in games and that it produces accurate image quality, they do not endorse the optimizations for synthetic use. Hence, Futuremark released a patch that intentionally handicapped our unified compiler.
So, we advocate that when reviewers are using 3DMark as a game proxy, they must run with the unified compiler fully enabled. All games run this way. That means running with the previous version of 3DMark, or running with a version of our drivers that behave properly.
Derek Perez
Director of Nvidia PR
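As a purely conceptual sketch of the fingerprinting mechanism Perez describes (this is not Nvidia driver code, and every name in it is hypothetical), shader substitution can be thought of as hashing the incoming shader and looking it up in a table of hand-tuned replacements:

```cpp
// Conceptual sketch only: hypothetical names, not Nvidia's implementation.
#include <cstddef>
#include <functional>
#include <string>
#include <unordered_map>

// Table mapping fingerprints of known application shaders to hand-tuned versions.
static const std::unordered_map<std::size_t, std::string> kTunedShaders = {
    { std::hash<std::string>{}("ps_2_0 ...original benchmark shader..."),
      "ps_2_0 ...hand-tuned replacement..." },
};

// Pick the shader that will actually be compiled for the GPU.
std::string SelectShader(const std::string& appShader)
{
    const std::size_t fingerprint = std::hash<std::string>{}(appShader);
    const auto it = kTunedShaders.find(fingerprint);
    if (it != kTunedShaders.end())
        return it->second;   // recognized shader: substitute the tuned version
    return appShader;        // unknown fingerprint: use the application's shader as-is
}
```

Under that assumption, the effect of Futuremark's patch is easy to see: once the benchmark's shader code is altered even slightly, its fingerprint no longer matches any table entry, the driver falls back to its generic path, and the score drops, which is consistent with the lower results discussed earlier in the thread.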
October 1, 2001
Oh my, I wonder what Futuremark has to say about that. If it's true, as they claim, that their drivers automatically rearrange shaders in all games without degrading quality, then it isn't a 3DMark03-specific optimization, and in that case I don't think it's cheating. The question is how well it works in real games compared to 3DMark03.
December 5, 2001
I don't think you should compare hardware using benchmark programs. Why not just stick to games, since that's where you actually want to see the differences?? What good does 3DMark really do, for example? Nah, it's just a program that every overclocker wants to set records in. Not really a performance meter, in my opinion.
November 29, 2001
Dead Glory wrote: Oh my, I wonder what Futuremark has to say about that. If it's true, as they claim, that their drivers automatically rearrange shaders in all games without degrading quality, then it isn't a 3DMark03-specific optimization, and in that case I don't think it's cheating. The question is how well it works in real games compared to 3DMark03.
There's a thread on swec about this. That is just a lie on nVidia's part; they have even come out and openly admitted afterwards that "they misspoke".