Time for our first thorough look at the Radeon X1000 series architecture, with loads of performance and image quality tests. We also explore the overclocking potential of ATi’s 90nm circuits.

It has been about a month since ATi launched their latest generation of graphics cards. We were at the European launch, where we had some time to get to know the cards and perform a handful of tests. Back home, we wanted to make a more thorough test in our own environment, using more games, more settings and, last but not least, overclocking the cards.

This article builds on our earlier article about the X1000 series, which you can find here; if you want the full technical background, we recommend reading it first. This time we take you through the more practical aspects of these graphics cards rather than the technical discussion covered earlier. The main priority is game tests, but other relevant examinations such as noise levels, image quality, power consumption and overclocking are included.

First we will present to you the cards we’ve chosen to include in this performance comparison.

As mentioned earlier, we chose to divide the cards into two groups so that each is compared in a suitable environment. We gain nothing by pushing the budget cards through high resolutions with heavy antialiasing, and in the same way the performance cards would be limited by the other components in the system at low resolutions.

The cards we have chosen to test in this class are ATi’s X1300Pro and X1600XT, as well as nVidia’s 6600GT. The X1300Pro costs about 1000 SEK and the 6600GT can be found at about 1300 SEK. The X1600XT has not yet reached retail stores and won’t be available until the first weeks of November.

Budget class
                         X1300Pro   X1600XT   6600GT
Manufacturing name:      RV515      RV530     NV43
Manufacturing process:   90nm       90nm      110nm
Transistors (million):   ~100       ~157      ~146
Pixel shaders:           4          12        8
Vertex shaders:          2          6         3
Frequency (core):        600MHz     590MHz    500MHz
Frequency (memory):      800MHz     1380MHz   1000MHz

Judging by the specifications, the X1300 card doesn’t look that strong, but we’ll see if it manages to keep pace with the other cards thanks to the new architecture. The X1600 card, which has the upper hand in every category, shouldn’t have any trouble dealing with the 6600GT.
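For readers who want to turn the raw specifications into comparable numbers, theoretical memory bandwidth and shader throughput can be sketched from the table above. This is only a rough illustration: the 128-bit memory bus width is our assumption (it is not stated in the table), and counting shader units ignores ROP and bandwidth limits, so treat the figures as ballpark only.

```python
# Rough theoretical throughput from the budget-class spec table.
# Bus widths (128-bit) are an assumption, and shader throughput
# ignores ROP limits; these are ballpark figures, not benchmarks.

CARDS = {
    # name:      (core_mhz, mem_mhz_effective, pixel_shaders, bus_bits)
    "X1300Pro": (600, 800, 4, 128),
    "X1600XT":  (590, 1380, 12, 128),
    "6600GT":   (500, 1000, 8, 128),
}

def bandwidth_gbps(mem_mhz_eff, bus_bits):
    """Effective memory clock (MHz) times bus width (bytes) -> GB/s."""
    return mem_mhz_eff * 1e6 * (bus_bits / 8) / 1e9

def shader_gops(core_mhz, pixel_shaders):
    """Core clock times pixel shader count -> billions of shader ops/s."""
    return core_mhz * 1e6 * pixel_shaders / 1e9

for name, (core, mem, ps, bus) in CARDS.items():
    print(f"{name:9s}  {bandwidth_gbps(mem, bus):5.1f} GB/s  "
          f"{shader_gops(core, ps):4.1f} Gops/s")
```

The numbers mirror what the game tests later show: the X1600XT leads clearly on paper, while the X1300Pro trails on both counts.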

We will mainly test these cards at the resolutions 800×600, 1024×768 and 1280×1024 (or 1280×960 if the game doesn’t support both). All of the cards theoretically support antialiasing and anisotropic filtering, but to really enjoy these games you have to be careful with those settings on cards in this class.

It’s time to look at the cards of the performance class.

Here we’ve gathered some of the fastest cards on the market, namely the X1800XT and the X1800XL from ATi and the Geforce 7800GT and Geforce 7800GTX from nVidia. As you know, nVidia’s cards can currently be run in SLI mode, but we won’t test any such configurations in this article. Instead, we hope to return to the subject once ATi’s counterpart is available.

The X1800XL card can be purchased today for about 4200 SEK while the X1800XT card, like the X1600, isn’t available in retail stores yet. nVidia’s 7800GTX is available from ~$480 and 7800GT from ~$370.

Performance class
                         X1800XL   X1800XT   6800Ultra   7800GT    7800GTX
Manufacturing name:      R520      R520      NV40        G70       G70
Manufacturing process:   90nm      90nm      130nm       110nm     110nm
Transistors (million):   ~320      ~320      ~222        ~302      ~302
Pixel shaders:           16        16        16          20        24
Vertex shaders:          8         8         6           7         8
Frequency (core):        500MHz    625MHz    400MHz      400MHz    430MHz
Frequency (memory):      1000MHz   1500MHz   1100MHz     1000MHz   1200MHz

Graphics cards differ more and more, and we can clearly see that ATi and nVidia have chosen different design paths. ATi’s cards have the upper hand in frequencies, while nVidia’s cards have more pipelines, and it remains to be seen which approach is the most effective today. As we said earlier, we’ll mainly test these cards at 1280×1024 and 1600×1200 to keep the rest of the system from becoming the limiting factor. Let us take a look at the test system before we start with the tests.

Mainboard        Abit Fatal1ty AN8 SLI, BIOS 17
Processor        AMD FX-57 (San Diego, 0512MPMW)
Memory           Kingston HyperX 2x512MB (BH-5)
Graphics cards   ATI Radeon X1300Pro
                 ATI Radeon X1600XT
                 ATI Radeon X1800XL
                 ATI Radeon X1800XT
                 nVidia 6800Ultra
                 nVidia 7800GT
                 nVidia 7800GTX

Power supply OCZ PowerStream 520W
Operating system Windows XP (SP2)
Drivers nForce 6.67
Catalyst 8.173
ForceWare 77.77
Monitoring program Fraps 2.5.0
Test program
3DMark2001 (3.3.0)
3DMark03 (3.6.0)
3DMark05 (1.2.0)
AquaMark 3
Battlefield 2
Colin McRae 2005
Call of Duty 2 – Demo
Doom 3 (1.0.1262)
Far Cry (1.32)
F.E.A.R – Demo
Half-Life 2
Need For Speed Underground 2
Splinter Cell – Chaos Theory
Unreal Tournament 2004 (v3399)
Warcraft 3 (1.17)

We used the standard image quality settings, which were "High quality" for both ATi’s and nVidia’s cards.

Now we’ll take a look at the different cooling devices on the cards.

Processors have long been in a league of their own when it comes to power consumption and, consequently, the need for noisy fans. Graphics cards, while not quite at the same level yet, are getting closer and closer. In the performance class, copper-based heatsinks with powerful fans have become the standard, and even cheap graphics cards often need an active cooling solution. We’ll begin by looking at how ATi has solved this on the X1300Pro.

The X1300 uses a copper heatsink that occupies only one PCI slot, and it does what it’s meant to do without any trouble. Our card hardly got lukewarm during our tests, which bodes well for manufacturers that choose to deliver the card with a passive heatsink. The noise level, though, really surprised us: the card is about as loud as the X1800XL at high fan speed, and this card has no active fan speed adjustment. Even so, we are fairly sure this will change for the retail cards and that the fan speed will be knocked down a notch.

Here you can see some photos of the heatsink of the X1600 and if someone wants to declare that it’s identical to the X1300 heatsink, we’re more than willing to concur. Unlike the X1300, the X1600 has the ability to adjust fan speeds, which was a very welcome feature. This card doesn’t get that hot either, only slightly warmer than the X1300. The fan didn’t spin up either during the full duration of our tests.

We’ll move on and look at the bigger cards.

When we enter the X1800 series of graphics cards, we suddenly find ourselves in a completely different league. The X1800XL has been equipped with a much more powerful cooling device than its younger brothers. As you can see, the card uses double heatpipes and very thin copper fins to expose as much surface area to the air as possible. In spite of this, the card is far from cool, and you can’t keep a finger on the heatsink for more than a few seconds when the card is being pushed. Even components in the voltage regulation circuitry have been given a heatsink, which sits directly in the hot air stream from the large heatsink.

The fan on this card also adjusts its speed, unfortunately with some flaws. It starts at a relatively low speed and is the quietest among the cards, but even in Windows it reaches a temperature threshold that causes the fan to spin up very fast. The temperature then drops quickly, and after about 20 seconds the fan resumes its low-speed mode. The problem is that this cycle repeats itself even when the card isn’t working on anything, which is very irritating. Once again, we are sure this is a phenomenon that will be adjusted on the retail models.

Here we can see what it takes to increase the core frequency from 500 MHz to 625 MHz: a bigger cooling device. Heatpipes are used on this card as well, as can be seen in pictures #4 and #5. The fan is certainly not a quiet one, and during boot you get flashbacks to the days of NV30. After a few seconds, though, it spins down to a much lower sound level. Even without any load, the fan makes a faint growling noise that gets louder as the temperature increases. Luckily, this card has more fan speed steps than the X1800XL and doesn’t show the same erratic behaviour at idle. As a reference, none of the ATi cards come close to the GTX card we compared them with: that card doesn’t even start its fan when the computer boots, but slowly begins to spin once the temperature has risen a bit. We didn’t manage to stress the GTX card enough to make it increase its fan speed either.

It is time to take a look at the image quality among the different cards.

Every now and then, discussions about the image quality of ATi’s and nVidia’s cards flare up, and far from every article gives any concrete results. One reason could be that it’s not easy to tell what looks good, as we will also see in our examinations. To make as impartial a comparison as possible, we used Far Cry, where you can take screenshots at specific frames in a demo. This way we get pretty much the same image every time. The tests were done with ATi’s X1800XT and nVidia’s 7800GTX at a resolution of 1280×1024. We’ll start off by looking at some antialiasing.

Click the pictures for a high-resolution version.

Here we see the different types of antialiasing supported by the new cards from ATi. New features are Adaptive AA and Temporal AA. As you can see, it’s not hard to tell when AA is activated, but noticing a difference between the various levels of antialiasing and filtering is harder. Performance-wise, it’s worth mentioning that activating both Temporal AA and Adaptive AA has a large negative impact on the cards. Let’s see how nVidia’s card behaves.

Click the pictures for a high-resolution version.

Even if the differences are small, we see a tendency for ATi’s cards to use a more effective antialiasing technique. In the water, ATi’s card succeeds better, while we found that the right stem is rendered more smoothly by nVidia’s card.
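Differences this small are easier to judge when quantified. As a minimal sketch of the idea, the snippet below compares two images pixel by pixel once they have been decoded to nested lists of RGB tuples; the 2×2 "images" and the tolerance parameter are made up for illustration, and the decoding step itself is left out.

```python
# Sketch: fraction of pixels that differ between two screenshots,
# given the images as nested lists of RGB tuples. The tiny 2x2
# example images below are invented purely for illustration.

def diff_fraction(img_a, img_b, tolerance=0):
    """Fraction of pixels whose RGB channels differ by more than `tolerance`."""
    total = differing = 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            total += 1
            if any(abs(a - b) > tolerance for a, b in zip(pa, pb)):
                differing += 1
    return differing / total

a = [[(0, 0, 0), (255, 255, 255)], [(10, 10, 10), (200, 200, 200)]]
b = [[(0, 0, 0), (255, 255, 255)], [(12, 10, 10), (200, 200, 200)]]
print(diff_fraction(a, b))               # 0.25 (one pixel of four differs)
print(diff_fraction(a, b, tolerance=4))  # 0.0  (difference within tolerance)
```

A small tolerance like this is useful because two cards can render "the same" frame with tiny rounding differences that the eye never sees.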

We’ll take another look at image quality in a new scene.

Click on the images for high-res version.

Once again we’re given an example of how antialiasing produces a much better looking gaming experience. The most noticeable example is the hang glider’s bracing wire: without antialiasing it’s just a few black dots, but with antialiasing turned on the line appears much more clearly. We also notice that the contours of the weapons look much better.

Click on the image for high-res version.

Here we have the same setup with nVidia’s card. As we mentioned before, there’s not a lot separating the two cards. If you’re interested in having a closer look at all the pictures in full size, we’ve made a pack that’s available here:

:: Scene 1 (6950kb) ::
:: Scene 2 (9230kb) ::


Now let’s begin with our game tests. First up is Half-Life 2.

A performance comparison is not complete without some games, and Half-Life 2 is one of them. We recorded a demo on the map d1_canals_9 that we then played back as a timedemo. The settings we used in the game are listed in the following table.

Setting: Value:
Model Details High
Texture Details High
Shader Details High
Water Details Reflect World
Shadow Details High
HDR None

Budget class

The X1600-card takes ATI’s first victory, but the two other cards are not far behind.

When we activate antialiasing and filtering the performance of all the cards drops considerably and at 1280×1024 it’s barely playable.

Performance class

The Performance cards don’t seem to have any difficulties and the X1800XT easily takes the victory in both resolutions.

ATI starts strong and the XT card takes home a victory without any real problems. The GTX card can’t make it all the way to the top but manages to leave a gap down to the XL card, which in turn is in a close battle with the GT card. It’s pretty obvious from the results that the 6800 card comes from an earlier generation.

Let’s move on to Battlefield 2.

Battlefield 2 is a relatively fresh FPS game that is growing in popularity among online players. We tested the game by recording a demo of a typical game scenario on the map Strike at Karkand and used FRAPS to register the frame rate once per second for 60 seconds. We used the following settings.

Setting: Value:
Overall Quality High
Terrain High
Effects High
Geometry High
Textures High
Lighting Medium
Dynamic Shadows High
Dynamic Light High
Viewing Distance 100%
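The per-second FRAPS readings described above boil down to the average, minimum and maximum figures shown in the charts. A minimal sketch of that reduction, with invented sample values:

```python
# Summarising 60 one-second FPS readings (as logged by Fraps) into
# the average/min/max figures used in the charts. The sample values
# below are invented for illustration.
import statistics

def summarise(samples):
    """Return (average, minimum, maximum) of per-second FPS readings."""
    return (statistics.mean(samples), min(samples), max(samples))

samples = [48, 52, 45, 61, 39, 55] * 10   # 60 one-second readings
avg, lo, hi = summarise(samples)
print(f"avg {avg:.1f}  min {lo}  max {hi}")   # avg 50.0  min 39  max 61
```

The minimum matters as much as the average here: as the F.E.A.R. and Call of Duty 2 sections show, a healthy average can hide dips that make a game unplayable.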

Budget class

Without any antialiasing or filtering we can see that even ATI’s smallest card keeps up with nVidia’s 6600GT. The X1600 has no difficulty taking control and is without doubt the winner of the first round.

Here we see that ATI handles antialiasing and anisotropic filtering better than nVidia’s cards. Despite relatively low frame rates, the game remained pretty much playable all the way down to below 30 FPS.

Performance class

The X1800-card likes Battlefield 2 and leaves all of nVidia’s cards behind.

The fact that ATI’s cards show very good performance with FSAA and AF doesn’t make things any better for nVidia’s cards that simply can’t keep up in this game.

Next up is Far Cry.

Next challenge: Far Cry. We played a demo of the map Volcano and registered the FPS with the help of a special benchmarking-program from HardwareOC. These are the settings we used.

Setting: Value:
Texture Quality Very High
Texture Filter Quality Trilinear
Particle Count Very High
Special Effects Quality Very High
Environment Quality Very High
Shadow Quality Very High
Water Quality Very High
Lighting Quality Very High

Budget class

Once again, nothing strange here, the cards perform as before.

With FSAA and AF, the 6600GT and the X1300 card have noticeable problems at 1280×1024. The 6600GT has a real fight against the X1300 but manages to win by a small margin. The X1600 card continues to dominate the budget class.

Performance class

For once the GTX card manages to snatch the win from the XT card, and the GT card gets past the XL card, leaving it second from the bottom. The 6800 card tries for all it’s worth but doesn’t have much to offer against the latest generation of cards.

With FSAA and AF, the fun was over for nVidia’s GTX card. The two ATI cards show more efficient antialiasing and filtering and climb the rankings.

And then we have Doom 3.

Doom 3 has been with us for a while, but that’s not a reason for us not to include it in the tests. ATI has had a tough time with OpenGL before, which Doom 3 utilizes. We ran "timedemo demo1" to compare the cards. First some settings, then we will see if ATI has made any progress on this front.

Setting: Value:
Special Effects Yes
Enable Shadows Yes
Enable Specular Yes
Enable Bumpmaps Yes

Budget class

For the first time the 6600GT gets the chance to show off, if only for a short while. Through its effective antialiasing algorithms, ATI takes back the lead as soon as antialiasing is activated.

Performance class

As we mentioned before, ATI has had a tough time with OpenGL, and Doom 3 is an OpenGL game. Neither FSAA nor AF can save the day for ATI, which falls behind in this game.

Update! ATI released a beta driver during testing to correct the performance problems in OpenGL-based games; we have tested this driver at the end of this review.

Next out is F.E.A.R.

Through rumours we had gathered that F.E.A.R. would be a heavy load to carry, which of course was something we wanted to test the graphics cards against. Once again we used FRAPS and played a minute of an intensive battle in the game. We should also add that it was incredibly difficult to get consistent numbers, because the game varies a lot graphically and the AI is very good.

Setting: Value:
Effect Details Max
Model Decals Max
Water Resolution Max
Reflections and Displays Max
Volumetric Lights On
Volumetric Light Density Max
Light Details Max
Enable Shadows On
Shadow Details Max
Soft Shadows On
Texture Resolution Max
Videos Max
Pixel Doubling OFF
DX8 Shaders Off
Shaders Max

Budget class

This game proved to be extremely demanding on the graphics cards. Even with an FX-57 whipping their backs, the budget cards struggled. The 6600GT with its 128MB simply did not want to play this game: despite sometimes reaching a playable 40 FPS, it could drop to well under 10 FPS, and overall it was unplayable. None of the budget cards gave acceptable performance at 1280×960, with or without FSAA and AF.

Performance class

The game finally began to be playable with these cards. We can see that the XT card pulls far ahead thanks to its 512MB of video RAM, while the 256MB cards perform as before. As we pointed out earlier, it was very difficult to get consistent results in this game, so the numbers should be interpreted more as a hint of how the cards perform.

Next game out is Unreal Tournament 2004.

Unreal is another game that has been with us for a while, but we chose to put it through a couple of tests anyway for any remaining enthusiasts. We used the heaviest settings the game offers, namely:

Setting: Value:
Texture Detail Highest
Character Detail Highest
World Detail High
Physics Detail High
Dynamic Mesh Load Highest
Decal Stay High
Character Shadows Full

Budget class

As we expected, Unreal Tournament poses no challenge to any of our cards. The X1300 is lagging behind a bit at 1600×1200 with the eye candy activated, though.

Performance class

The same scenario repeats itself here. The performance cards have no problems whatsoever regardless of settings, and despite the solid test system, we can see that the rest of the platform is the limiting factor here.

We close Unreal Tournament 2004 as a test title and move on to Call of Duty 2.

Call of Duty 2 definitely looks to be a promising game with a lot of interesting effects and environments. We played the first 60 seconds of the demo and again used Fraps for comparison. Here are the settings we used.

Setting: Value:
Number of Dynamic Lights High
Model Detail Normal
Z Feather Smoke Everything
Number of Corpses Insane

Budget class

The X1300 card has a tough time in this game and is really only playable at 800×600. The X1600XT and 6600GT get off a bit easier as long as you don’t enable any form of eye candy. Despite the relatively low frame rate, it is constant, and the game feels playable all the way down to the 20-30 FPS region.

Performance class

Like F.E.A.R., this game puts a lot of pressure on even the latest graphics cards, as we can see here. The performance pattern is the one we have gotten used to, except for the X1800XL card, which fell behind a bit at high resolutions with lots of eye candy. We did not manage to find the cause despite several tries. As with F.E.A.R., this game was tested by playing the same scenario again and again, which adds a human factor to the end result.

We proceed with another game demo, Splinter Cell – Chaos Theory.

We also chose to include Splinter Cell, which isn’t a full-blown ”First Person Shooter”, to see what it takes to make this kind of game playable. The settings were as follows.

Setting: Value:
Shader Model Type 3.0
Shadow Resolution High
Trilinear Filtering On
Specular Lighting On
High Quality Textures On
HDR Rendering Off
Parallax Mapping On
High Quality Soft Shadows On

For some unfathomable reason, we couldn’t activate FSAA or AF in the game, and even when we forced these settings in the drivers, they weren’t applied in the game.

Budget class

The cards keep performing as before. Since this type of game is slow-paced, it doesn’t need the same frame rate as an FPS game, so in spite of the low values, the game was playable on the whole on all cards, regardless of resolution, with some patience.

Performance class

In the performance class we see a strong effort from ATi’s card which wins in every different setting.

Now we’ll leave the current tests and try a whole different type of game – racing games.

We chose to include some games that aren’t of the FPS type. First out is Colin McRae 2005, one of the newest rally games on the market. We used Fraps here as well while driving a stage in the game. The settings we used were as follows.

Setting: Value:
Texture Quality High
Drawing Distance 10 (max)
Post Processing Effects Yes

Budget class

We noticed pretty quickly that the game wasn’t very demanding, which made us increase the resolutions even for the budget class. The game was on the whole playable at all settings except 1600×1200 on the X1300 and 6600GT. When the frame rate dropped below 30 FPS, the response became noticeably slow and the car was difficult to drive. The X1600 wins with ease once again, and as we’ve noted earlier, we once again see proof of ATi’s more effective handling of antialiasing and filtering.

Performance class

Here we see another even fight until we turn on FSAA and AF, which leaves all of nVidia’s cards at the bottom of the charts. To put the power of the performance cards in perspective against the budget class: the X1800XT at maximum resolution with FSAA and AF beats almost all the budget cards regardless of their resolution. In this context, it is also worth noting that the X1600XT closes in on the 6800 Ultra when FSAA and AF are activated, which isn’t bad either.

We continue within the same genre.

Yet another racing game has joined the range of tests – Need For Speed Underground 2. Like with most of the other games, we used Fraps when we ran a normal game scenario. We used the following settings in the game.

Setting: Value:
Level of Detail Max
Car Reflection Update Rate Max
Car Reflection Detail High
Car Shadow Max
Car Headlight On
Car Geometry Detail High
Crowd On
World Detail Max
Road Reflection Detail Max
Light Trails On
Light Glow On
Particle System On
Motion Blur On
Fog On
Depth of Field On
Tinting On
Horizon Fog On

Budget class

After keeping up for a long time, the X1300 finally manages to outrun the 6600. ATi’s FSAA and AF performance naturally plays a part here. Below 30 FPS we find the same tendency as in Colin McRae: the car becomes slow to react, making the game less playable than we would like.

Performance class

Here we receive another confirmation that ATi is making a great effort in this game. The XT card has more than double the frame rate compared to the GTX card in the two largest resolutions with FSAA and AF.

Let’s round up the game tests with a completely different type of game.

We also wanted to include a strategy game in our testing range. Warcraft 3 came to us as a popular candidate. The graphics settings we used were the following.

Setting: Value:
Model Detail High
Animation Quality High
Texture Quality High
Particles High
Lights High
Unit Shadows On
Occlusion On
Spell Detail High

Warcraft 3 was obviously not as demanding as we had expected, so we only ran the tests with FSAA and AF.

Budget class

It’s really only 1600×1200 that gives the budget cards any challenge. Even though we forced vsync off, the game refused to give us a frame rate higher than 64 FPS, hence the strange maximum value.

Performance class

A diagram tells more than a thousand words.

Now it’s time to look at some synthetic tests.

We have now reached the part which people are most frequently talking about when it comes to the performance of graphic cards, namely 3DMark.

We used all three versions of 3DMark with the default settings.

Budget class

As in the earlier tests, the X1600XT card takes home the victory without any major problems, followed by the 6600GT, with the X1300 last.

Performance class

We have to admit we were a bit surprised that the X1800XL card didn’t manage to squeeze in between the 7800GT and 7800GTX in 3DMark2001 and 3DMark03, which it did on several occasions in the game tests. The X1800XT card is king of the hill in 2001 and ’05, while ’03 appears to be a trump card for nVidia.

Before we move on to the overclocking we have a few comments about the power consumption of the cards.

During our tests we measured the power consumption of the entire system with each card installed. After getting to know the hot-running X1800 series, and knowing that high frequencies increase a component’s power draw, we estimated pretty early that these cards would end up at the top, which turned out to be correct. You should therefore make sure you have a good power supply before buying one of the X1800, 6800 or 7800 cards.

It should come as no surprise that graphics card processors, much like other processors, are binned within a certain clock speed bracket. This is due to the manufacturing process and the need for a temperature envelope in which the processor can operate. It’s always interesting to find samples that happen to be very similar to their bigger siblings, and the X1300 and X1600 cards are both reminiscent of their bigger brothers in the X1800 class. Let’s see if they’ve inherited any of the overclocking potential.

In order to determine a stable clock frequency, each graphics card had to successfully complete the 3DMark 03 test. The thermal grease was removed from all cards and replaced with Arctic Silver 5. The tests were conducted in an open system, with an ambient temperature of about 23°C.

The grey bars indicate stock speeds. Overclocked frequencies indicating core/memory respectively.

The X1300 card did really well and managed a core increase from 600MHz to 675MHz, a gain of 12.5%. This, along with a memory increase of 12%, resulted in nearly a 13% performance increase in 3DMark 03. The X1600 card possesses a much more complex core with more transistors, and thus has more obstacles hindering overclocking. It only managed 6% and 4% increases for the core and memory respectively.
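The gains quoted here are simple percentage increases over the stock frequency. As a worked example (the helper function is ours, purely for illustration; the overclocked memory frequencies themselves are not restated):

```python
# Percentage increase from stock to overclocked frequency,
# verifying the quoted X1300 core gain (600 MHz -> 675 MHz).

def gain_pct(stock_mhz, oc_mhz):
    """Frequency gain over stock, in percent."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(f"X1300 core: {gain_pct(600, 675):.1f}%")   # 12.5%
```

Note that the ~13% 3DMark 03 gain tracks the clock gains closely here, which suggests the X1300 was not badly bottlenecked on either core or memory at stock speeds.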

Time to overclock the performance class.

As with the budget-class cards, the performance cards were required to complete the 3DMark 03 test successfully with no visual artefacts.

The grey bars indicate stock speeds. Overclocked frequencies indicating core/memory respectively.

Technically, the only difference between these cards is the clock frequency. On the XT card, both the core and memory voltages are raised to enable the higher speeds. The software allowed us to turn the fan speed up to maximum, which made a great difference in overclocking headroom. The air temperature at the exhaust measured more than 55°C, while the heatsink on the power circuitry exceeded 67°C.

The incredible increase in clock speed on the XL card is worth mentioning. The engineering sample we were given had the same memory as the XT card, but we have learned that this won’t be the case when the cards hit retail; instead, the XL will ship with cheaper, slower memory. Thanks to the faster memory on our sample, we saw a tremendous leap in 3DMark 03 scores for the XL card.

After we had almost finished this review, we heard that ATI had released yet another beta driver claimed to improve OpenGL performance for these cards. We decided to try Doom 3 again to see if ATI had managed to address the OpenGL issues.

Setting: Value:
Special Effects Yes
Enable Shadows Yes
Enable Specular Yes
Enable Bumpmaps Yes

Budget class

After several attempts to install these drivers on the X1300 and X1600 cards, we discovered that the drivers don’t support them. Therefore, only X1800 results can be posted.

Performance class

No tangible improvements when neither FSAA nor AF are implemented. But activate these, and things get interesting. At 1600×1200 we see almost a 50% increase for the X1800-XT card! That’s quite a driver adjustment they pulled off over at ATI.

The time has come to conclude this test session.

Overall, the numbers speak for themselves, and we can easily conclude that ATI have really succeeded in producing a very competent generation of graphics cards. Here are some of the main points:

  • We see a significant leap in performance due to the new architecture, a design that also spans into the budget segment.
  • The algorithms for anti-aliasing and anisotropic filtering have improved, along with optimizations leading to a lower impact on performance compared to earlier generations.
  • The drivers have received new optimizations for OpenGL, and ATI have reduced the gap to NVidia quite a lot.
  • In Doom3, when playing without anti-aliasing and anisotropic filtering, the ATI cards still trail the NVidia GTX offerings. When these are activated, however, the situation changes, thanks to ATI’s solid anti-aliasing performance.

A few things are worth mentioning about today’s graphics card situation:

  • The drivers available today are still in the beta stage, and cannot be considered stable. There doesn’t have to be anything wrong with beta drivers per se. The ones we tested do have cosmetic problems, and need to be polished a bit more before they achieve the quality we’re used to seeing.
  • The noise levels for the tested cards, especially the budget cards, are noticeable.
  • These points are far from critical, but some of the cards have already shipped, and one would expect decent drivers. As for the noise levels, we’ve already seen pictures of passively cooled X1300 cards, and we’re sure that ATI’s partners will work on making the cards quiet.

    Notice: Keep in mind that this review was written before the launch of the GeForce 7800 GTX 512MB. However, we have an extensive review of that card coming to complete our coverage of this generation of graphics circuits.
    Also keep in mind that since the review was posted, nVidia has released new drivers and two new cards, which we unfortunately haven’t had time to test.

    ATI Radeon X1000 series

    + Best performance as of today
    + Supports new rendering techniques throughout the family
    + Good performance with anti-aliasing and anisotropic filtering

    – The drivers are still beta
    – Noise level

    Finally, we would like to thank ATI for sending us the ATI samples, Leadtek for the 7800GTX and Gigabyte for the GT card.

