ATI FireGL V7200 (Radeon X1800XT) Retro Review

An X1800XT in a FireGL suit? Can it play games?

  • RRP: £730
  • Release date: October 2005
  • Purchased in October 2025
  • Purchase Price: £5.60 Delivered

X1800XT cards seem to be rare on eBay right now; I can only see one in the ‘sold and completed’ category, and someone that isn’t me got a bargain at £20 on that one. There are some international listings closer to £200, which is a bit too much I think.

I was very happy to find this V7200 which is based on the exact same hardware. It looks pretty damn cool as well.

It was one of those auctions listed as ‘untested’. That’s usually a reason to steer clear, especially when the seller lists other tech items, but in this case it was a one-off graphics card.

I was feeling lucky, so I reached out to the seller and asked for the story behind it. Apparently he had just had it in a box for years; he was more into old Macs. He dropped the price to something crazy and said I owed him a box of biscuits if it worked – well, it fired up straight away, so I’d better get around to sorting that out.

Introduction – The ATI R500 Series

The Last of the Fixed-Function Giants

In autumn 2005, ATI unveiled the R500 family, known commercially as the Radeon X1000 series.
This generation was built on a brand-new architecture fundamentally different from its predecessors, but it remained rooted in the fixed-function pipeline model that had defined PC graphics cards since the 1990s.

R500 featured dedicated vertex and pixel shader hardware, supporting Shader Model 3.0, and brought significant improvements such as 90nm manufacturing, high precision FP32 shader support, and ATI’s new “Ultra-Threaded” dispatch engine, which increased efficiency for complex shading operations.

While unified shaders were still on the horizon, the R500 series gave ATI its first competitive edge over NVIDIA’s GeForce 6 and early GeForce 7 lines, especially in DirectX 9.0c and high dynamic range rendering tasks.

Notably, the technology and experience ATI gained with R500 contributed directly to their groundbreaking work on unified shaders for the Xbox 360’s Xenos GPU, which would later influence the design philosophies of the next generation of graphics cards.

The Radeon X1000 series marked a turning point, serving as both the pinnacle of fixed-function programmability and the launchpad for ATI’s leap into unified shaders in the years that followed.

ATI vs Nvidia: The Final Fixed-Function Showdown

The ATI R500 series, launched in 2005 as the Radeon X1000 lineup, competed directly with Nvidia’s GeForce 7 series—mainly the G70-based GeForce 7800 GTX and 7800 GT cards.

Architecturally, R500 introduced Shader Model 3.0 support, a 90nm manufacturing process, and ATI’s Ultra-Threaded dispatch engine, optimizing the use of its dedicated vertex and pixel shader hardware.

By contrast, Nvidia’s GeForce 7 series also used dedicated pipelines, featuring 24 pixel and 8 vertex shader units on top models, and implemented the CineFX 4.0 engine, boosting efficiency for lighting and shading effects as well as high-dynamic-range rendering.

Both lineups supported DirectX 9.0c, but while ATI invested in more efficient scheduling and precision FP32 execution, Nvidia’s design focused on maximizing pixel throughput and raw fill rates, often leading their cards to dominate benchmark scores in shader-heavy titles of the time.

ATI FireGL vs ATI Radeon

FireGL cards such as the V7200 were derived from the same core architectures as the gaming-oriented Radeon X1800 XT, but their histories and design philosophies diverged to serve distinct markets. ATI’s FireGL lineage began as a professional graphics solution, focused on CAD, 3D modeling, and digital content creation, benefitting from specialized drivers and hardware validation for stability and precision in workstation environments. Radeon cards, by contrast, prioritized high frame rates and visual effects for games, leading to a rapid and iterative development cycle driven by consumer demands and benchmarking trends.

During the mid-2000s, many gamers and enthusiasts noted the hardware similarities between FireGL and Radeon cards – some even attempted “softmodding” Radeons into FireGLs by flashing BIOSes and modifying drivers, hoping to unlock professional capabilities for lower cost. Nevertheless, ATI purposely differentiated FireGL cards with optimized OpenGL drivers, extended warranties, and, in some models, professional-grade memory, including ECC RAM.

ECC (Error-Correcting Code) RAM is a type of memory that detects and corrects single-bit errors in data, greatly reducing the risk of crashes and corruption in mission-critical applications. While not all FireGL cards featured ECC, its presence underscored their emphasis on reliability for long-duration tasks and complex 3D rendering pipelines—features that were irrelevant to gaming cards.
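To make the mechanism concrete, here’s a toy sketch in Python of how an error-correcting code recovers a flipped bit – a Hamming(7,4) code, which protects 4 data bits with 3 parity bits. (Illustrative only: real ECC memory uses wider SECDED codes implemented in hardware, but the principle is the same.)

```python
# Toy Hamming(7,4) code: 4 data bits protected by 3 parity bits.
# Any single flipped bit in the 7-bit codeword can be located and fixed.
# (Illustrative only - real ECC VRAM uses wider SECDED codes in hardware.)

def encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4              # parity over codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4              # parity over codeword positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4              # parity over codeword positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def correct(c):
    """Recompute the parities; the syndrome gives the 1-based error position."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    if pos:
        c[pos - 1] ^= 1            # flip the faulty bit back
    return [c[2], c[4], c[5], c[6]]  # extract the data bits

word = encode([1, 0, 1, 1])
word[4] ^= 1                       # simulate a single-bit memory error
assert correct(word) == [1, 0, 1, 1]  # data recovered intact
```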

In terms of gaming performance, FireGL cards often performed similarly to their Radeon counterparts in raw benchmarks, but the professional drivers could allegedly lead to lower frame rates or incompatibility with certain DirectX games. Games of the era, especially those relying on DirectX 9, were supposed to run best on Radeon cards with gaming-focused Catalyst drivers, while FireGL cards truly excelled in OpenGL-based and workstation applications.

Actual real-life copper right here.

R500 FireGL and Radeon Translator chart

The first chart shows each FireGL card and its Radeon equivalent – pretty useful stuff when hunting for a bargain.

As you can see, it’s horribly confusing… and this just covers the FireGLs of this one generation.

Model | Radeon Equivalent (Core) | ROPs/TMUs | VRAM Options | Core Clock | Memory Clock
FireGL V3300 | Radeon X1300 Pro (RV515) | 4 / 4 | 128MB DDR2 | 600 MHz | 400 MHz
FireGL V3350 | Radeon X1300 Pro (RV515) | 4 / 4 | 256MB DDR2 | 600 MHz | 400 MHz
FireGL V3400 | Radeon X1600 Pro (RV530) | 4 / 4 | 128MB GDDR3/DDR2 | 500 MHz | 500 MHz
FireGL V5200 | Radeon X1600 XT (RV530) | 4 / 4 | 256MB GDDR3/DDR2 | 600 MHz | 700 MHz
FireGL V7200 | Radeon X1800 XL (R520) | 16 / 16 | 256MB GDDR3 | 600 MHz | 650 MHz
FireGL V7300 | Radeon X1800 XT (R520) | 16 / 16 | 512MB GDDR3 | 600 MHz | 650 MHz
FireGL V7350 | Radeon X1800 XT (R520) | 16 / 16 | 1024MB GDDR3 | 600 MHz | 650 MHz

Top-end Radeon R500 series (and the V7200)

It’s a busy place at the top end of the R500 line. The internet says this is because the R580 was pin-compatible with the R520, meaning ATI could reuse PCBs across multiple product iterations, encouraging more SKUs with minor changes instead of designing entirely new boards.

I’d love to find an affordable X1950XT with all those TMUs, but I can’t imagine they show up on eBay often.

As you can see, the V7200 is a slightly underclocked X1800XT, so it sits towards the lower end of the high-end cards.

Model | Code | ROPs/TMUs | VRAM Options | Core Clock (MHz) | Memory Clock (MHz)
Radeon X1800 XL | R520 | 16/16 | 256/512 MB | 500 | 500 (1000 DDR3)
Radeon X1800 XT | R520 | 16/16 | 256/512 MB | 625 | 750 (1500 DDR3)
FireGL V7200 | R520 | 16/16 | 256 MB | 600 | 650 (1300 DDR3)
Radeon X1800 GTO | R520 | 16/12 | 256 MB | 500 | 500 (1000 DDR3)
Radeon X1900 GT | R580 | 12/36 | 256 MB | 575 | 600 (1200 DDR3)
Radeon X1900 XT | R580 | 16/48 | 256/512 MB | 625 | 725 (1450 DDR3)
Radeon X1900 XTX | R580 | 16/48 | 512 MB | 650 | 775 (1550 DDR3)
Radeon X1950 Pro | RV570 | 12/12 | 256/512 MB | 575 | 690 (1380 DDR3)
Radeon X1950 XT | R580 | 16/48 | 256/512 MB | 625 | 900 (1800 DDR3)
Radeon X1950 XTX | R580+ | 16/48 | 512 MB | 650 | 1000 (2000 DDR3)
Upside-down GPU.. outrageous
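As an aside, the bracketed memory speeds in the table are simply double-data-rate effective figures. Combined with R520’s 256-bit memory bus, they give the card’s theoretical memory bandwidth; a quick calculation for the V7200:

```python
# The bracketed memory speeds are double-data-rate effective figures.
# With R520's 256-bit memory bus this gives the theoretical bandwidth.

memory_clock_mhz = 650    # V7200 actual memory clock (from the table)
bus_width_bits = 256      # R520 memory interface width

effective_mts = memory_clock_mhz * 2                       # DDR: 2 transfers/cycle
bandwidth_gb_s = effective_mts * 1e6 * (bus_width_bits / 8) / 1e9

print(f"{effective_mts} MT/s effective, {bandwidth_gb_s:.1f} GB/s")
```

So the V7200 sits at 41.6 GB/s against the X1800 XT’s 48 GB/s at its 750 MHz memory clock.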

Turning FireGL into Radeon

The internet is full of rumours stating that the FireGL drivers are for professional use and not optimised to run games.

To test this rumour, I needed to get the Radeon Drivers running on my Windows XP System.

It was easy enough to do. You just run the Catalyst installer, which will fail but does unpack the drivers. You can then go into Device Manager, find the driverless video card, and manually point it at the .inf in the unpacked folder.

This is the lazy method and my starting point. As part of the tinkering, I did edit the .inf to properly persuade the system that the card is supported.

This is also simple if you want to do the same. First, find the hardware ID in Device Manager like this:

Then Ctrl+A to highlight, Ctrl+C to copy, and paste into Notepad.

You can then find the X1800 driver line in the .inf file. Copy this line, tweak the name, then paste in the hardware ID for the FireGL card.

Once done and saved, Windows accepts that the driver works for the hardware and it installs without the error message.
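For reference, the whole edit boils down to duplicating one line in the .inf. Here’s a hypothetical sketch of what that looks like – the install section name and device IDs below are placeholders, not real values; paste in the actual hardware ID copied from Device Manager:

```ini
; Hypothetical sketch of the .inf edit - DEV_XXXX / DEV_YYYY are placeholders.

; Existing X1800 entry found in the unpacked Catalyst .inf:
"Radeon X1800 Series" = ati2mtag_R520, PCI\VEN_1002&DEV_XXXX
; Duplicated entry, name tweaked, FireGL hardware ID pasted in:
"FireGL V7200 (modded)" = ati2mtag_R520, PCI\VEN_1002&DEV_YYYY
```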

It doesn’t take long, but is it worth it? Probably not – it didn’t affect performance at all.

Another option, if you were feeling adventurous, is to blindly flash a retail X1800XT BIOS onto this card.

It’s not hard to do, but seems a little too high risk for a card that I would really like to keep. Doing so would affect voltages and the fan curve and could easily brick the card with no chance of recovery.

The Card

The mounting bracket can be discarded easily, improving the looks somewhat.

Here’s a GPU-Z screenshot of my example:

And with the Radeon Drivers:

The Test System

I use the Phenom II X4 for XP testing. It’s not the most powerful XP-era processor, but its 3.2GHz clock speed should be more than enough to get the most out of mid-2000s games, even if all four of its cores are unlikely to be utilised.

The full details of the test system:

  • CPU: AMD Phenom II X4 955 Black Edition, 3.2GHz
  • 8GB of 1866MHz DDR3 memory (showing as 3.25GB on 32-bit Windows XP, and limited to 1600MHz by the platform)
  • Windows XP (build 2600, Service Pack 3)
  • Kingston 240GB SATA SSD as the primary drive; an AliExpress SATA SSD holds the Win7 installation and the games
  • ASRock 960GM-GS3 FX motherboard
  • Catalyst 13.4, the latest driver version with WinXP support

On to some benchmarks!

Return to Castle Wolfenstein (2001)

Game Overview:

Return to Castle Wolfenstein launched in November 2001, developed by Gray Matter Interactive and published by Activision.

Built on a heavily modified id Tech 3 engine, it’s a first-person shooter that blends WWII combat with occult horror, secret weapons programs, and Nazi super-soldier experiments.

The game features a linear campaign with stealth elements, fast-paced gunplay, and memorable enemy encounters.

The engine is OpenGL-based, with advanced lighting and skeletal animation.

The framerate is capped at 92fps.

Performance Notes:

Technology moved on a lot in the four years between this game and our card being released.

As the hardware is quite high-end for the game, it should be no surprise that the frame cap was hit and the 1% low was only a little lower. Clearly a smooth experience.

This would not run on the Radeon drivers however, oops, not exactly the performance comparison I was looking for!

OpenGL is apparently a problem.
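Average FPS and 1% low figures come up throughout these results, so here’s a quick sketch of how a 1% low is typically derived from a frame-time capture. (Illustrative – capture tools differ slightly in how they bucket the slowest frames.)

```python
# How the "1% low" figures quoted alongside average FPS are typically
# derived: keep the slowest 1% of frames from a capture and report the
# average frame rate over just those frames.

def average_fps(frame_times_ms):
    """Overall average FPS for a capture of per-frame times in ms."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def one_percent_low(frame_times_ms):
    """Average FPS over the slowest 1% of frames (at least one frame)."""
    worst_first = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst_first) // 100)
    return 1000.0 * n / sum(worst_first[:n])

# 99 smooth frames at ~10.9 ms plus one 50 ms stutter frame: the average
# barely moves, but the 1% low drops to 20 fps and exposes the hitch.
times = [10.9] * 99 + [50.0]
print(f"avg: {average_fps(times):.1f} fps, 1% low: {one_percent_low(times):.1f} fps")
```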

Unreal Tournament 2003 (2002)

Game Overview:
Released in October 2002, Unreal Tournament 2003 was built on the early version of Unreal Engine 2. It was a big leap forward from the original UT, with improved visuals, ragdoll physics, and faster-paced gameplay.

The engine used DirectX 8.1 and introduced support for pixel shaders, dynamic lighting, and high-res textures, all of which made it a solid test title for early-2000s hardware.

Still a great game and well worth going back to, even if you’re limited to bot matches these days. There’s even a single-player campaign of sorts, though it’s really just a ladder of bot battles.

The game holds up visually and mechanically, and it’s a good one to throw into the testing suite for older cards. The uncapped frames are pretty useful (and annoyingly rare) on these old titles.

Performance Notes

Pretty great really; a shame the consistency is not quite there. Still, who can complain when the settings are maxed out (at least on this monitor) and we’re hitting nearly 180fps?

The game performs the same under both sets of drivers, easily within the margin of error.

Need for Speed Underground (2003)

Game Overview:

Released in November 2003, Need for Speed: Underground marked a major shift for the series, diving headfirst into tuner culture and neon-lit street racing.

Built on the EAGL engine (version 1), it introduced full car customisation, drift events, and a career mode wrapped in early-2000s flair.

The game runs on DirectX 9 but carries over some quirks from earlier engine builds.

Apparently v1.0 of this game does have uncapped frames, but mine installs a later version right off the disc.

Performance Notes:

Capped at the monitor’s 75fps and running at that consistently.

The Radeon drivers did perform slightly worse; I retested and did a few more laps, and that definitely was the result. Not a whole lot in it to be fair, but still.

No complaints here either, then. With settings absolutely maxed out we were hitting the monitor’s 75fps – all good!

Doom 3 (2004)

Game Overview:
Released in August 2004, Doom 3 was built on id Tech 4 and took the series in a darker, slower direction. It’s more horror than run-and-gun, with tight corridors, dynamic shadows, and a heavy focus on atmosphere. The engine introduced unified lighting and per-pixel effects, which made it a demanding title for its time, and still a good one to test mid-2000s hardware.

The game engine is limited to 60 FPS, but it includes an in-game benchmark that can be used for testing that doesn’t have this limit.

Performance Notes:

Look at that framerate in Doom 3: triple digits on most settings, only falling under 100fps on High settings with 4x anti-aliasing.

There is a CNET article where Doom 3 was tested with the 512MB X1800XT, which scored 113fps at 1024×768 and 81fps at 1280×1024, showing that this FireGL card is right where it should be.

Also note the entire absence of Radeon driver results – the OpenGL issue again:

Far Cry (2004)

Game Overview:
Far Cry launched in March 2004, developed by Crytek and built on the original CryEngine. It was a technical marvel at the time, with massive outdoor environments, dynamic lighting, and advanced AI. The game leaned heavily on pixel shaders and draw distance, making it a solid stress test for mid-2000s GPUs. It also laid the groundwork for what would later become the Crysis legacy.

For these tests I utilised the HardwareOC Far Cry Benchmark – impressive-looking free software that does three runs at the chosen settings and gives the average result with no input required, letting me get on with other things in the meantime.

The 1% low figures are sadly missing, but in their place we have some very accurate average FPS results:

Performance Notes:

Amazing performance really, easily over 100fps with Ultra Settings and 4x Anti-Aliasing.

As you can see, taking the time to force through the Radeon drivers was very much not worth the effort, giving the same results, or worse by the smallest margin.

Adding 8x anti-aliasing and 16x anisotropic filtering, we are still hitting big framerates.

Very impressive stuff.

F.E.A.R. (2005)

Game Overview:
F.E.A.R. (First Encounter Assault Recon) launched on October 17, 2005 for Windows, developed by Monolith Productions. Built on the LithTech Jupiter EX engine, it was a technical showcase for dynamic lighting, volumetric effects, and intelligent enemy AI.

Performance Notes:

Maxed out and giving 40fps with only a slight drop when AA is enabled.

A pretty good result though in actual gameplay you’d probably be tempted to turn things down a notch.

The Radeon drivers gave comparable results. Since this is an actual in-game benchmark, each run is identical, but it’s still margin-of-error stuff here.

This is with soft-shadows on

Battlefield 2 (2005)

Game Overview:
Battlefield 2 launched on June 21, 2005, developed by DICE and published by EA. It was a major evolution for the franchise, introducing modern warfare, class-based combat, and large-scale multiplayer battles with up to 64 players. Built on the Refractor 2 engine, it featured dynamic lighting, physics-based ragdolls, and destructible environments that pushed mid-2000s hardware.

Performance Notes:

A game released in the same year as the card, and we’re still hitting triple-digit framerates here; AA continues to make little difference as well.

What a great result.

Need for Speed: Carbon (2006)

Game Overview:

Need for Speed: Carbon hit the streets on October 30, 2006, developed by EA Black Box and published by Electronic Arts. As a direct sequel to Most Wanted, it shifted the franchise into nighttime racing and canyon duels, introducing crew mechanics and territory control. Built on the same engine as its predecessor but enhanced for dynamic lighting and particle effects, Carbon pushed mid-2000s GPUs with dense urban environments, motion blur, and aggressive post-processing.

Performance Notes:

OK, so finally we’ve found the limit, which is a shame, but then these mid-2000s NFS games do seem to run poorly.

Earlier GPU testing was done on NFS: Most Wanted, and that was even worse; this game is based on the same engine but is actually better optimised.

Still, Medium settings give nice smooth gameplay; a shame to have to dial things back.

Medieval II: Total War (2006)

Game Overview:
Released on November 10, 2006, Medieval II: Total War was developed by Creative Assembly and published by Sega. It’s the fourth entry in the Total War series, built on the enhanced Total War engine with support for Shader Model 2.0 and 3.0. The game blends turn-based strategy with real-time battles, set during the High Middle Ages, and includes historical scenarios like Agincourt.

Performance Notes:

I worry that this game is CPU limited and these results don’t help with that concern.

This game really does hammer a single core and these old Phenom II’s were not known for their single core performance.

Still, at best settings we can get a somewhat playable framerate, though we seem a long way from the triple-digit heights of 2005 and earlier.

Test Drive Unlimited (2006)

Game Overview:
Released on September 5, 2006, Test Drive Unlimited was developed by Eden Games and published by Atari. It marked a major technical leap for the Test Drive franchise, built on the proprietary Twilight Engine, which supported streaming open-world assets, real-time weather, and Shader Model 3.0 effects. The game ran on DirectX 9, with enhanced support for HDR lighting and dynamic shadows, optimized for both PC and seventh-gen consoles.

At launch, TDU was praised for its ambitious scale, vehicle fidelity, and online integration, though some critics noted AI quirks, limited damage modeling, and performance bottlenecks on lower-end rigs. The PC version especially benefited from community mods and unofficial patches that expanded car libraries and improved stability.

Performance Notes:

A more modern game engine here and the FireGL struggles to keep up. 40FPS is fine but the 10fps 1% low highlights the stuttering that you can definitely feel.

The Radeon Drivers actually improved the 1% Low figure, albeit bringing the average framerate down at the same time.

I checked back at the mid-range cards from 2007 tested recently, and the Nvidia 8600GT was at a 61fps average and 51fps 1% low.

Definitely a game that likes its unified shaders then; sadly it’s medium settings or lower on the V7200.

Oblivion (2006)

Game Overview:
Oblivion launched on March 20, 2006, developed by Bethesda Game Studios. Built on the Gamebryo engine, it introduced a vast open world, dynamic weather, and real-time lighting. The game was a technical leap for RPGs, with detailed environments and extensive mod support that kept it alive well beyond its release window (it’s just had a re-release recently).

Known for its sprawling world, Oblivion remains a benchmark title for mid-2000s hardware. The game’s reliance on draw distance and lighting effects makes GPUs struggle.

Performance Notes:

I refused to give up on the card and aimed high again in Oblivion, starting testing at Ultra and getting some good results.

44fps average at 1280×1024 with 4xAA is no mean feat. The Radeon Drivers were comparable, but slightly worse, continuing with that pattern.

Crysis (2007)

Looking pretty, but at a cost in frames.
Showing options all set to Medium in this example.

Game Overview:
Crysis launched in November 2007 and quickly became the go-to benchmark title for PC gamers. Built on CryEngine 2, it pushed hardware to the limit with massive draw distances, dynamic lighting, destructible environments, and full DirectX 10 support.

Performance Notes:

68fps average at 1024×768 is not bad, even if it had to be on low settings.

Medium and High are out of reach for this old card; this is Crysis though, so it is to be expected.

The Radeon drivers give the same results when tested.

S.T.A.L.K.E.R. (2007)

Game Overview:
Released in March 2007, S.T.A.L.K.E.R.: Shadow of Chernobyl was developed by GSC Game World and runs on the X-Ray engine. It’s a gritty survival shooter set in the Chernobyl Exclusion Zone, blending open-world exploration with horror elements and tactical combat. The engine supports DirectX 8 and 9, with optional dynamic lighting and physics that can push older hardware to its limits.

Dynamic lighting – sure does make things look pretty, but at quite a significant cost in performance

Performance Notes:

A very good showing with static lighting; the Radeon drivers gave roughly the same framerate but with a significant reduction in the 1% low figure.

Getting full dynamic lighting running was a nightmare though: on the official drivers it would crash before I could get a result. With the Radeon drivers it actually gets a bit further, but with a huge pause every few seconds.

If you want to play S.T.A.L.K.E.R with dynamic lighting, get a better graphics card than this one.

The difference in behaviour (instant crash vs. running with big stalls) really points at driver code paths: FireGL builds hitting some unhandled state, Catalyst builds working but being overwhelmed by the workload and perhaps juggling VRAM aggressively.

Assassin’s Creed (2007)

Game Overview:
Assassin’s Creed launched in November 2007, developed by Ubisoft Montreal and built on the Anvil engine. It introduced open-world stealth gameplay, parkour movement, and historical settings wrapped in sci-fi framing. The first entry takes place during the Third Crusade, with cities like Damascus, Acre, and Jerusalem rendered in impressive detail for the time.

Performance Notes:

Jogging around the city in the first Assassin’s Creed game, things are looking pretty good really, with 55fps and a 38fps 1% low. This feels about as good as it gets at this resolution; I’ve never played this game as a silky smooth experience on any hardware tested so far.

Even when maxing out the settings at 1024×768 performance doesn’t drop too dramatically.

At these highest settings, I tested against the Radeon driver, which gave comparable results – actually a slight increase in average framerate, though not by any significant margin.

Synthetic Benchmarks

3DMark 2000

Result | ATi FireGL V7200 | ATi FireGL V7200 (Radeon Drivers)
Score | 35,097 | 35,226

3DMark 2001 SE

Result | ATi FireGL V7200 | ATi FireGL V7200 (Radeon Drivers)
Score | 36,053 | 35,998

3DMark 2003

Result | ATi FireGL V7200 | ATi FireGL V7200 (Radeon Drivers)
Score | 17,347 | 17,421
Fill Rate (Single Texturing) | 3,449.6 | 3,449.6
Fill Rate (Multi-Texturing) | 8,145.4 | 8,144.6
Pixel Shader 2.0 | 147.1 | 147.1
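As a quick sanity check, the multi-texturing figure lines up reasonably with theory, since peak texel fill rate is simply core clock times TMU count:

```python
# Peak texel fill rate = core clock x number of TMUs. Comparing that to the
# measured 3DMark 2003 multi-texturing result shows how close the card gets.

core_clock_mhz = 600          # V7200 core clock
tmus = 16                     # texture units on R520
measured_mtexels = 8145.4     # 3DMark 2003 multi-texturing result

theoretical_mtexels = core_clock_mhz * tmus
efficiency = measured_mtexels / theoretical_mtexels

print(f"theoretical {theoretical_mtexels} MTexels/s, "
      f"measured {measured_mtexels} ({efficiency:.0%} of peak)")
```

Roughly 85% of the theoretical 9,600 MTexels/s, which is about what you’d expect once memory bandwidth gets a say.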

3DMark 2005

Result | ATi FireGL V7200 | ATi FireGL V7200 (Radeon Drivers)
Score | 8,911 | 8,998

I did find an online comparison to the X1800XT:

https://www.cnet.com/reviews/ati-radeon-x1800-xt-512-mb-review/

This shows a score of 9,240 for the 512MB X1800XT at the same resolution in this benchmark.

3DMark 2006

Result | ATi FireGL V7200 | ATi FireGL V7200 (Radeon Drivers)
Score | 5,760 | 5,754
Shader Model 2.0 Score | 2,046 | 2,045
HDR/Shader Model 3.0 Score | 2,178 | 2,173

Unigine Heaven

Result | ATi FireGL V7200 | ATi FireGL V7200 (Radeon Drivers)
Score | 1,482 | 1,479
Average FPS | 34.9 | 34.9
Min FPS | 19.8 | 22.8
Max FPS | 46.6 | 45.1

Summary and Conclusions

It’s a great card, this, and I’ve really enjoyed playing about with it. I may just be biased because it looks cool, but when you own as many video cards as I do, it’s good when one stands out.

The actual copper used in the cooling solution shows proper quality – it does weigh a lot!

Performance-wise it is very impressive; at the time of release, this would have played most titles well.

Things do seem to fall off a cliff from 2006 onwards; the more advanced features of newer games, tailored towards unified-shader GPUs, do the card no favours.

I don’t have much to compare this to yet, but I will put an X1950 Pro in there soon. I’m also on the lookout for a good GeForce 7800/7900 card; these don’t seem to be quite so rare, but supply can hardly be called plentiful.

I’m reasonably certain that the FireGL V7200 is every bit as capable as the X1800XT would have been (with a slight handicap due to the slightly lower clock speed).

The drivers seem to make little difference either way and it wasn’t really worth the effort!

As with all articles, I aim to return to this one, perhaps with some more benchmarked games or other testing.
