A return to ATI’s TeraScale 1.0 midrange
- RRP: £90 (well, this exact model would have been in a pre-built PC)
- Release date: June 28, 2007
- Purchased in November 2024
- Purchase Price: £8
Continuing on with the testing of the first unified shader designs: I tested a 256MB version of the HD2600XT here, which may be worth reading; that card was then handily beaten by the Nvidia 8600GT here.
Would twice the VRAM even things up a little? Some of the later games tested did seem to sit close to that 256MB buffer. Let’s find out.

Introduction – The HD2000 Series
From Fixed-Function to Unified Shaders
In the summer of 2007, ATI unveiled the R600 family, known as the Radeon HD 2000 series.
These new GPUs were built on a new architecture called TeraScale 1 and marked a radical shift in GPU design, moving away from the fixed-function pipelines of previous generations, a lineage stretching back to the very first 3D accelerators.
Individual vertex and pixel shaders were replaced by a unified shader model in which a pool of stream processors could handle vertex, pixel, and geometry shading interchangeably.
This series of graphics cards was not ATI’s first foray into the world of unified shaders: the company had already developed the GPU inside the Xbox 360 console, a chip called Xenos.
The R600 family competed directly with Nvidia’s G80 architecture from the GeForce 8 series, which also used unified shaders and had been released half a year earlier, in November 2006.
ATI vs Nvidia: Shader Efficiency and Design
Unified shader architectures allowed both ATI and Nvidia to assign processing power wherever it was needed, a huge advance over earlier designs. The two companies approached the challenge differently:
- ATI TeraScale/HD2000: Counted its shaders as individual ALU lanes, grouped into wide five-way VLIW units. The HD 2600 variants had 120 stream processors (24 five-wide units); the HD 2900 XT had 320. When the driver’s compiler could keep all five lanes of a unit busy, the design delivered excellent throughput and performance-per-watt, and shader-heavy applications sometimes ran superbly despite modest clock speeds; when it couldn’t, much of that headline count sat idle.
- Nvidia Tesla/G80: Focused on fewer, simpler scalar shader units running at a much higher clock than the rest of the chip. Each unit was scheduled independently, so utilisation stayed high under real-world loads without careful instruction packing. The headline shader counts look small next to ATI’s, but the raw numbers aren’t directly comparable.
Think of ATI’s shader units as crew teams, working together to handle multiple pixels simultaneously. Nvidia’s shader units, by comparison, are individual workers with superhuman strength.
Comparing raw shader unit counts is like comparing hours worked by teams against hours worked by individuals: the numbers alone don’t tell the whole story without knowing the size and power of each worker or team.
In summary, the ATI cards sound better with their vast numbers of shader units, but this doesn’t always translate into better performance.
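As a back-of-the-envelope illustration of why the headline counts mislead, here’s a rough theoretical throughput comparison. The clocks and the two-FLOPs-per-ALU multiply-add assumption are simplifications based on public spec sheets, so treat the output as approximate:

```python
# Rough peak multiply-add throughput: ALUs x 2 FLOPs x clock.
# Spec-sheet figures; real utilisation (especially on the VLIW
# design) falls far below these peaks.
def madd_gflops(alus: int, clock_mhz: float) -> float:
    return alus * 2 * clock_mhz / 1000

# HD 2600 XT: 120 stream processors (24 five-wide VLIW units) at 800 MHz
print(f"HD 2600 XT: {madd_gflops(120, 800):.0f} GFLOPS")  # 192
# 8600 GT: 32 scalar processors at a 1188 MHz shader clock
print(f"8600 GT:    {madd_gflops(32, 1188):.0f} GFLOPS")  # 76
```

On paper the ATI card is well ahead, yet as the benchmarks below show, that gap rarely survives contact with real games.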
Shader Handling: Games and Compatibility
Unified shader cards transformed gaming visuals in DirectX 10 titles, DirectX 10 being the new API introduced with Windows Vista.
They allowed dynamic assignment of computing resources, boosting flexibility and performance in modern games that were built for these new capabilities.
Performance and image quality improved, especially in games leveraging Shader Model 4.0 features like geometry shaders and high dynamic range processing.
However, adoption was gradual, and new cards faced mixed results until the software caught up.
Many popular games during the HD 2000 era (and later) still targeted DirectX 9 and Windows XP. Here, performance wasn’t always optimal; in some cases, HD 2000 cards were outperformed by older hardware better tailored to legacy APIs, especially before drivers matured.
ATI’s HD2000 Lineup Overview

With codenames like RV610, RV630, and R600, the HD2000 series spanned three performance tiers:
- RV610 (HD 2400): Designed for budget systems, featuring efficient power use and enhanced media playback through its on-die Unified Video Decoder (UVD).
- RV630 (HD 2600): Struck a balance between gaming and multimedia, pairing solid 3D performance with tuning for video decoding.
- R600 (HD 2900): The flagship, built for raw power, equipped with a massive 512-bit memory bus and workstation-class ambitions, though it notably omitted dedicated hardware UVD.
Despite impressive specs, especially on the HD 2900 XT, the generation was a mixed bag for enthusiasts.
Driver immaturity and lack of hardware video decode (in the high-end model) tempered excitement in early months.
The Card
This HD2600XT is an OEM card, one of thousands made en masse on behalf of a large system integrator.
OEM cards don’t have a great reputation: they are expected to be made with cost savings as the priority – worse cooling solutions, cheaper memory, as much taken out as possible whilst retaining the same name.
As you can see on this version, CrossFire support is gone, and it’s also very functional-looking, with just a heatsink and fan covering the GPU itself.
The fan does look up to the task though and there does seem to be airflow over those memory chips to keep them cool which isn’t always the case.
I/O includes two DVI outputs and S-Video as standard, no problems there either.

The price of £8 can hardly be argued with either. I did pay more for the Sapphire version of this card tested, but then eBay pricing on these things is all over the place really.
The complete lack of branding on the card left me searching online using the serial numbers on the PCB. Those numbers led me to evidence suggesting this card was manufactured for Lenovo, likely by MSI.
Here’s where our card sits in the generation:

This model has the slower GDDR3 memory but 512MB of it at least – I’ve never seen a GDDR4 version advertised anywhere before.
This data mainly comes from Wikipedia though, which has definitely let me down before!
Here’s a GPU-Z screenshot of my example:

Note the subvendor of MSI, the likely manufacturer of the card for Lenovo.
The actual MSI version of the 2600XT is below, showing the difference between MSI made-for-OEM and true MSI:

Lenovo IdeaCentre K210
I was interested to see the sort of system this card would have been sold in; some research suggests that it could be an IdeaCentre K210.

This was a system designed for the consumer desktop market and came with an Intel Core processor (up to a Core 2 Quad) and up to 3GB of DDR2 RAM.
The only GPU upgrade available was this HD2600XT, a definite improvement over the standard Intel GMA 3100 integrated graphics.
Prices for the system started at $379 after a mail-in rebate; I assume the HD2600XT option would have cost a fair bit more, but I could find no details on the subject.
Here’s a semi-interesting fact for you – this was the first Lenovo desktop computer sold outside of China.
The Test System
I use the Phenom II X4 for XP testing; it’s not the most powerful XP-capable processor, but its 3.2GHz clock speed should be more than enough to get the most out of mid-2000s games, even if all four of its cores are unlikely to be utilised.
The full details of the test system:
- CPU: AMD Phenom II X4 955 3.2GHz Black Edition
- 8GB of 1866MHz DDR3 memory (showing as 3.25GB on 32-bit Windows XP, and limited to 1600MHz by the platform)
- Windows XP (build 2600, Service Pack 3)
- Kingston SATA 240GB SSD as the primary drive; an AliExpress SATA SSD holds the Win7 installation and games
- ASRock 960GM-GS3 FX
- The latest WinXP-supported driver version, Catalyst 13.4
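The 3.25GB figure above is just 32-bit address-space arithmetic: devices, including this card’s 512MB of VRAM, are mapped into the 4GB address space, and whatever they claim is hidden from the operating system. A sketch, with the reserved total an assumption chosen to match what XP reports here:

```python
# Why 32-bit XP shows ~3.25 GB of an 8 GB kit: only 4 GB is
# addressable, and memory-mapped devices eat into that.
total_mb = 4096                  # 32-bit addressable space
reserved_mb = 512 + 256          # GPU aperture + other MMIO (assumed split)
print(f"visible: {(total_mb - reserved_mb) / 1024:.2f} GB")  # visible: 3.25 GB
```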
I will be testing this card against the Sapphire version of the HD2600XT and Nvidia’s competition, the 8600GT.
How the three stack up:

On to some benchmarks!
Return to Castle Wolfenstein (2001)

Game Overview:
Return to Castle Wolfenstein launched in November 2001, developed by Gray Matter Interactive and published by Activision.
Built on a heavily modified id Tech 3 engine, it’s a first-person shooter that blends WWII combat with occult horror, secret weapons programs, and Nazi super-soldier experiments.
The game features a linear campaign with stealth elements, fast-paced gunplay, and memorable enemy encounters.
The engine runs on OpenGL, with advanced lighting and skeletal animation.
The framerate is capped at 92fps.

Performance Summary
Just a test of compatibility really; the Lenovo card gave a slightly higher average framerate, despite the lower clock speed.
I would discount this as margin of error though: Afterburner showed that VRAM use at this resolution never reaches 130MB, so the larger buffer on the Lenovo card is unlikely to have contributed to the result.
Whilst the HD2600XT and 8600GT both give a good showing in this old title, it’s clear that the Nvidia architecture works best here, with getting on for double the 1% low result.
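For reference, the “1% low” figures quoted throughout these results can be derived from a frametime log, such as an MSI Afterburner CSV export. This is only a sketch: tools differ in the exact definition, and this version averages the slowest 1% of frames:

```python
# Average FPS and "1% low" from a list of frametimes in milliseconds.
def fps_stats(frametimes_ms: list[float]) -> tuple[float, float]:
    n = len(frametimes_ms)
    avg_fps = 1000 * n / sum(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)[:max(1, n // 100)]
    low_1pct = 1000 * len(worst) / sum(worst)
    return avg_fps, low_1pct

# 99 smooth frames plus one long stall: the average looks fine,
# but the 1% low exposes the hitch.
avg, low = fps_stats([10.0] * 99 + [50.0])
print(f"avg {avg:.0f} fps, 1% low {low:.0f} fps")  # avg 96 fps, 1% low 20 fps
```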
Mafia (2002)
Game Overview:
Mafia launched in August 2002, developed by Illusion Softworks and built on the LS3D engine. It’s a story-driven third-person shooter set in the 1930s, with a focus on cinematic presentation, period-authentic vehicles, and a surprisingly detailed open world for its time. The engine uses DirectX 8.0 and was known for its lighting effects and physics.


It’s more of a compatibility test these days than a performance one. The game has a hard frame cap of 62 FPS built into the engine, which makes benchmarking a little odd. Still, it’s a good title to check how older GPUs handle legacy rendering paths and driver quirks.

Performance Summary
Sitting on the cap, with the exact same result as the other cards, as expected.
Let’s move on quickly… this doesn’t make for interesting reading.

Unreal Tournament 2003 (2002)
Game Overview:
Released in October 2002, Unreal Tournament 2003 was built on the early version of Unreal Engine 2. It was a big leap forward from the original UT, with improved visuals, ragdoll physics, and faster-paced gameplay.
The engine used DirectX 8.1 and introduced support for pixel shaders, dynamic lighting, and high-res textures, all of which made it a solid test title for early-2000s hardware.

Still a great game and well worth going back to, even if you’re mostly limited to bot matches these days. There’s even a single-player campaign of sorts, though it’s really just a ladder of bot battles.
The game holds up visually and mechanically, and it’s a good one to throw into the testing suite for older cards. The uncapped frames are pretty useful (and annoyingly rare) on these old titles.

The Lenovo 2600XT falls slightly behind the Sapphire example in average FPS and slightly above in 1% low.
VRAM is not an issue, staying well under 150MB at all times, so this is margin-of-error stuff again.
Need for Speed Underground (2003)
Game Overview:
Released in November 2003, Need for Speed: Underground marked a major shift for the series, diving headfirst into tuner culture and neon-lit street racing.
Built on the EAGL engine (version 1), it introduced full car customisation, drift events, and a career mode wrapped in early-2000s flair.
The game runs on DirectX 9 but carries over some quirks from earlier engine builds.
Apparently v1.0 of this game does have uncapped frames, but mine installs a later version right off the disc.


Performance Notes:
It’s a constant pain having performance limited in these earlier titles. It would be great to know how many frames these later cards could throw out, but there are game-engine reasons why they don’t.
I really don’t know why the Lenovo card stutters so hard in this game. It should perform close to on par with the Sapphire version; it certainly shouldn’t be so far adrift on the 1% lows.
I played the same race at the same settings using the same car, and repeated the test when the result seemed unlikely. Its poor performance is something of a mystery!
Doom 3 (2004)
Game Overview:
Released in August 2004, Doom 3 was built on id Tech 4 and took the series in a darker, slower direction. It’s more horror than run-and-gun, with tight corridors, dynamic shadows, and a heavy focus on atmosphere. The engine introduced unified lighting and per-pixel effects, which made it a demanding title for its time, and still a good one to test mid-2000s hardware.
The game engine is limited to 60 FPS, but it includes an in-game benchmark that can be used for testing that doesn’t have this limit.

Performance Notes:
At 1024×768, the Nvidia 8600GT continues its winning streak in raw performance, leading the pack in both 0xAA and 4xAA tests. However, the Lenovo HD2600XT shows a surprising edge over the Sapphire variant, despite having a lower core clock.
The key difference is VRAM: Lenovo’s 512MB allows it to handle texture loads more efficiently, especially when anti-aliasing is applied. This is the first benchmark where memory capacity begins to visibly influence results, hinting at a shift in how Doom 3 stresses the hardware.

As resolution increases, the impact of VRAM becomes even more pronounced.
The Lenovo HD2600XT maintains its lead over the Sapphire card, and even overtakes the Nvidia 8600GT in 4xAA performance.
The extra memory helps the Lenovo card avoid bottlenecks that affect its 256MB rivals, especially when higher resolution and anti-aliasing combine to push texture demands.
This marks a turning point: clock speed alone is no longer enough, and memory capacity starts to define the upper limits of smooth gameplay.

Doom 3’s Ultra setting requires 512MB of VRAM, effectively locking out both the Sapphire HD2600XT and the Nvidia 8600GT.
The Lenovo HD2600XT stands alone here, not just running Ultra mode but delivering playable frame rates at both resolutions—even with 4xAA enabled.
This is the clearest demonstration yet of how VRAM can unlock higher fidelity settings and maintain performance.
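As a rough illustration of why resolution and anti-aliasing push memory demands up together, here’s a simple model of multisampled framebuffer size. This is my own approximation; it ignores compression and the large texture pool that actually dominates Doom 3’s Ultra requirement:

```python
# Multisampled colour + depth buffer: one sample per MSAA tap,
# assuming 4 bytes colour + 4 bytes depth/stencil per sample.
def framebuffer_mb(width: int, height: int, msaa: int = 1,
                   bytes_per_sample: int = 8) -> float:
    return width * height * msaa * bytes_per_sample / (1024 ** 2)

for width, height in [(1024, 768), (1280, 1024)]:
    for aa in (1, 4):
        print(f"{width}x{height} {aa}xAA: "
              f"{framebuffer_mb(width, height, aa):.0f} MB")
```

Even at 1280×1024 with 4xAA this model only accounts for tens of megabytes, which is why it’s the texture pool, not the framebuffer, that actually breaches the 256MB cards’ limit.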


Annoyingly, this game doesn’t seem to report back used memory, which would have been very useful.
I also checked GPU-Z to see if I could record things from there, but no luck; this was the same for the other cards.
FarCry (2004)
For these tests I ditched the normal benchmarking gameplay run and instead used the HardwareOC Far Cry Benchmark. This impressive-looking (and free) software lets me get on with other things whilst it does three runs at chosen settings, then gives the average result with no input required.
This did require me to retest the other two cards, so it caused a little messing about.
Game Overview:
Far Cry launched in March 2004, developed by Crytek and built on the original CryEngine. It was a technical marvel at the time, with massive outdoor environments, dynamic lighting, and advanced AI. The game leaned heavily on pixel shaders and draw distance, making it a solid stress test for mid-2000s GPUs. It also laid the groundwork for what would later become the Crysis legacy.


At minimum settings, the Lenovo HD2600XT shines. It edges out the Sapphire card at 800×600 and remains close at higher resolutions, with only a small drop at 1280×1024.
The 8600GT leads overall, but the Lenovo’s performance is more than sufficient for ultra-high framerates, even at 1280×1024. This chart confirms that the Lenovo card is not only capable of handling legacy titles with ease, but also benefits from its larger VRAM pool when pushing frame rates into the hundreds.
It may be that the CPU is limiting things here; results are suspiciously similar at 800×600 and only separate as the resolution increases.

The Lenovo HD2600XT performs admirably under Ultra settings without anti-aliasing, staying within 5% of the Sapphire card at all tested resolutions. At 800×600 and 1024×768, it delivers smooth framerates well above 60 FPS, with only a modest drop at 1280×1024.
The 8600GT leads at higher resolutions, likely due to architectural advantages in raw fillrate, but the Lenovo card remains competitive. This chart shows the Lenovo’s ability to handle demanding visuals without the added strain of AA, making it a solid choice for high-detail play at lower resolutions.

Enabling 6xAA introduces a steep performance penalty across the board, and the Lenovo HD2600XT is no exception. While it trails slightly behind the Sapphire card at all resolutions, the gap remains consistent and modest.
The 8600GT, however, pulls far ahead, suggesting stronger multisample AA performance on Nvidia’s architecture. Despite this, the Lenovo card still delivers playable framerates at 800×600 and 1024×768, and its 512MB VRAM helps prevent any major stutter or asset streaming issues.
This test highlights the limits of the HD2600XT’s AA capabilities, but also its stability under pressure.
F.E.A.R. (2005)

Game Overview:
F.E.A.R. (First Encounter Assault Recon) launched on October 17, 2005 for Windows, developed by Monolith Productions. Built on the LithTech Jupiter EX engine, it was a technical showcase for dynamic lighting, volumetric effects, and intelligent enemy AI. The game blended slow-motion gunplay with supernatural horror.

Performance Notes:
At standard resolution with maximum settings, the Lenovo HD2600XT delivers a mixed showing. With 4xAA and 16xAF enabled, its average frame rate lands behind the Sapphire variant by a small margin, and well below the Nvidia 8600GT. However, when anti-aliasing is disabled, the Lenovo card matches the Sapphire frame-for-frame, both achieving a solid 69 fps average. This suggests that while the Lenovo’s lower core clock may limit its peak performance under heavy load, it holds its own when AA is removed.
Interestingly, Afterburner reports that VRAM usage never approached the 512MB ceiling, even with all settings maxed. This implies that F.E.A.R.’s memory demands are modest, and the Lenovo card’s extra capacity doesn’t translate into a performance advantage here. Unlike Doom 3, where VRAM was a decisive factor, F.E.A.R. seems more sensitive to raw GPU throughput and shader efficiency.

At the highest resolution supported by this monitor, performance dips across the board, but the Lenovo HD2600XT remains consistent. Whether soft shadows are enabled or disabled, its average frame rate stays locked at 22 fps. The warning message about soft shadows being a high-performance feature proves misleading—there’s no measurable impact on performance when toggling the setting.
Compared to the Sapphire card, the Lenovo trails slightly in minimum and average frame rates, but the gap is narrow. Against the Nvidia 8600GT, however, the difference is stark. Nvidia’s card maintains nearly double the average frame rate, reinforcing its dominance in shader-heavy titles like F.E.A.R.
Battlefield 2 (2005)
Game Overview:
Battlefield 2 launched on June 21, 2005, developed by DICE and published by EA. It was a major evolution for the franchise, introducing modern warfare, class-based combat, and large-scale multiplayer battles with up to 64 players. Built on the Refractor 2 engine, it featured dynamic lighting, physics-based ragdolls, and destructible environments that pushed mid-2000s hardware.

Performance Notes

The Lenovo HD2600XT delivers a respectable 92 FPS average, but its 1% low of 40 FPS reveals instability during combat-heavy sequences. Interestingly, MSI Afterburner logged VRAM usage peaking at 325MB—well above the 256MB threshold of its Sapphire sibling—yet this surplus didn’t translate into smoother lows or tighter frame pacing. The 52 FPS delta suggests frequent dips, likely tied to its lower core clock and OEM tuning, which seem to bottleneck throughput despite the available memory headroom.

With 4xAA enabled, the Lenovo card’s performance drops sharply to a 43 FPS average and a 1% low of just 26 FPS, making it the weakest of the trio. Again, VRAM usage peaked at 325MB, but the extra buffer failed to offset the AA load, reinforcing that memory capacity alone isn’t enough without matching bandwidth and core strength. Compared to the Sapphire’s slightly better frame floor and the 8600GT’s commanding lead, the Lenovo’s 17 FPS delta underscores its struggle to maintain consistency under visual stress.

Need for Speed: Carbon (2006)

Game Overview:
Need for Speed: Carbon hit the streets on October 30, 2006, developed by EA Black Box and published by Electronic Arts. As a direct sequel to Most Wanted, it shifted the franchise into nighttime racing and canyon duels, introducing crew mechanics and territory control. Built on the same engine as its predecessor but enhanced for dynamic lighting and particle effects, Carbon pushed mid-2000s GPUs with dense urban environments, motion blur, and aggressive post-processing.
I skipped the open world for these benchmarks – too much variation. Instead, a single sprint race up Agostini Avenue with six opponents.

At the lowest resolution tested, the Lenovo card’s performance ceiling becomes clear: despite the reduced load, it peaks at just 68 FPS and bottoms out at 28, far behind both the Sapphire and the 8600GT. This consistent underperformance across all settings implies either thermal throttling or firmware-level constraints typical of OEM variants, making it a less ideal candidate for archival benchmarking or enthusiast use.

Under medium settings, the Lenovo HD2600XT delivers a surprisingly strong minimum frame rate of 44 FPS—equal to the Sapphire—but its peak and average remain subdued. This suggests that while the card can maintain baseline fluidity, it lacks the burst performance seen in its higher-clocked sibling, reinforcing the idea that its OEM tuning favors stability over speed.

The Lenovo card shows modest gains at reduced resolution, nearly matching the Sapphire in peak performance without AA, but still trailing in minimums. When 4xAA is enabled, its frame rate collapses to a mere 13 FPS peak and 3 FPS minimum, confirming that its extra VRAM offers no meaningful buffer against the AA-induced load. Compared to the 8600GT’s resilience under AA, the Lenovo’s limitations are stark and consistent.

At high resolution with anisotropic filtering and no anti-aliasing, the Lenovo HD2600XT struggles to keep pace with its Sapphire sibling and the 8600GT, posting the lowest minimum and average frame rates. Despite its 512MB VRAM advantage, the 100MHz core clock deficit appears to bottleneck performance, especially during heavy post-processing scenes, suggesting that memory alone can’t offset architectural limitations at this tier.
Medieval II: Total War (2006)
Game Overview:
Released on November 10, 2006, Medieval II: Total War was developed by Creative Assembly and published by Sega. It’s the fourth entry in the Total War series, built on the enhanced Total War engine with support for Shader Model 2.0 and 3.0. The game blends turn-based strategy with real-time battles, set during the High Middle Ages, and includes historical scenarios like Agincourt.


Performance Notes:
Medieval II: Total War shows the Lenovo HD2600XT holding steady at medium settings with Shader Model 1.0, trailing the Sapphire card slightly and falling behind the Nvidia 8600GT in average frame rate. The gap widens when anti-aliasing is added, though the difference remains modest.
At “Best” settings with Shader Model 2.0, the Lenovo card struggles more noticeably; its lower core clock must be a limiting factor compared to the Sapphire variant. The retest confirms this drop isn’t a fluke – performance consistently lags behind the Sapphire card at these settings.
Interestingly, at the highest tested settings with 4xAA and 2xAF, all three cards converge. The Lenovo matches the Sapphire exactly, while Nvidia edges ahead slightly. This suggests a shared architectural ceiling across the mid-range lineup, where memory and clock speed give way to broader GPU limits.
Test Drive Unlimited (2006)
Game Overview:
Released on September 5, 2006, Test Drive Unlimited was developed by Eden Games and published by Atari. It marked a major technical leap for the Test Drive franchise, built on the proprietary Twilight Engine, which supported streaming open-world assets, real-time weather, and Shader Model 3.0 effects. The game ran on DirectX 9, with enhanced support for HDR lighting and dynamic shadows, optimized for both PC and seventh-gen consoles.
At launch, TDU was praised for its ambitious scale, vehicle fidelity, and online integration, though some critics noted AI quirks, limited damage modeling, and performance bottlenecks on lower-end rigs. The PC version especially benefited from community mods and unofficial patches that expanded car libraries and improved stability.

Performance Notes
At 1024×768 with high settings, the Lenovo HD2600XT delivers a respectable 51 FPS average and a 1% low of 31. It trails the Sapphire version slightly, which benefits from a 100MHz clock bump, but the gap is modest. When 4xAA is enabled, both HD2600XT cards drop sharply, Lenovo lands at 25 FPS average and 13 FPS 1% low, showing that anti-aliasing is a major stress point.
The extra VRAM doesn’t offer much relief here, and the Nvidia 8600GT clearly leads with smoother performance under load.

At 1280×1024, the Lenovo card matches the Sapphire on average FPS (and edges ahead on 1% lows), hinting that its 512MB VRAM may help buffer heavier scenes when AA is off.
But once 4xAA is applied, both HD2600XT variants stall at around 17–18 FPS, with 1% lows stuck at 9. The 8600GT again maintains a clear lead, suggesting that fill rate and bandwidth matter more than memory size in this title.
Overall, Lenovo’s HD2600XT is consistent and capable, especially in non-AA scenarios, but it doesn’t quite break away from its siblings.
Oblivion (2006)
Game Overview:
Oblivion launched on March 20, 2006, developed by Bethesda Game Studios. Built on the Gamebryo engine, it introduced a vast open world, dynamic weather, and real-time lighting. The game was a technical leap for RPGs, with detailed environments and extensive mod support that kept it alive well beyond its release window (it’s just had a re-release recently).
Known for its sprawling world, Oblivion remains a benchmark title for mid-2000s hardware; the game’s reliance on draw distance and lighting effects makes GPUs struggle.

At 800×600, the Lenovo HD2600XT performs well across all settings, with an 82 FPS average on medium and 58 FPS on ultra high. It stays close to the Sapphire variant, which benefits from a slight clock bump, and trails the 8600GT by a modest margin. With 4xAA enabled, Lenovo drops to 41 FPS average and 12 FPS 1% low which is nearly identical to Sapphire while the 8600GT maintains a strong lead. Despite Lenovo’s 512MB VRAM, Afterburner confirms the buffer wasn’t saturated, suggesting memory wasn’t the limiting factor.

At 1024×768, Lenovo continues to hold its own. It averages 69 FPS on medium settings and 49 FPS on ultra high, again just behind Sapphire and the 8600GT. With 4xAA, performance dips to 31 FPS average and 9 FPS 1% low, matching Sapphire and falling well below the 8600GT’s 54 FPS average. The consistent results between the two HD2600XT cards hint that the extra VRAM on the Lenovo model doesn’t offer a clear advantage, and minor differences may simply reflect run-to-run variance.

At 1280×1024, Lenovo averages 38 FPS on ultra high and 22 FPS with 4xAA, identical to Sapphire and well behind the 8600GT. The 1% lows drop to 13 and 6 respectively, showing that this resolution with AA pushes both HD2600XT cards to their limits. Again, VRAM usage stayed below the 512MB ceiling, reinforcing that memory size wasn’t a bottleneck. Lenovo’s performance remains consistent, but not exceptional, making it a reliable if unremarkable entry in the mid-range collector’s lineup.
Call of Juarez DX10 (2006)
Call of Juarez launched in June 2006, developed by Techland and powered by the Chrome Engine 3. It’s a first-person shooter set in the American Wild West, blending gritty gunfights with biblical overtones and dual protagonists: Reverend Ray, a gun-slinging preacher, and Billy Candle, a fugitive accused of murder. The game alternates between stealth and action, with period-authentic weapons, horseback riding, and stylized sepia-toned visuals.
Built for DirectX 9.0c and optionally DX10, the Chrome Engine delivers dynamic lighting, HDR effects, and physics-driven interactions.
I only have the GOG version of the game, which will only run in DX10 (the DX9 executable seems to be missing).
The game comes with DX10 benchmarking software, so I gave it a run using the Windows 7 partition – thereby taking advantage of the 8GB of system RAM, half of which remains hidden to WinXP.

| Preset | Settings |
| --- | --- |
| Low | Shadows quality 0, Super Sampling off, Multi-Sampling off |
| Custom | Shadows quality 1, Super Sampling off, Multi-Sampling off |
| Balanced | Shadows quality 1, Super Sampling off, Multi-Sampling x2 |

The Lenovo HD2600XT’s 512MB of GDDR3 initially seemed like its trump card, especially in Call of Juarez where VRAM usage pushed past 400MB.
Yet in stock form, it lagged behind the Sapphire variant across all presets, averaging 31.7 FPS on Low versus the Sapphire’s 34.4.
I don’t fully trust this OEM card and suspected something more was going on, so I knocked the clock speed up to match the Sapphire’s 800MHz core, and sure enough: at the Balanced preset, its average jumped from 15.3 to 16.6 FPS, overtaking the Sapphire’s 16.1 and even widening the minimum-to-average delta.
What a difference 100MHz makes; obvious when you think about it.
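Running the numbers shows the scaling isn’t even linear. The 700MHz stock clock below is my own inference from the quoted 100MHz deficit against the Sapphire’s 800MHz:

```python
# Clock bump vs observed FPS gain at the Balanced preset.
clock_gain = (800 - 700) / 700 * 100     # core clock increase, %
fps_gain = (16.6 - 15.3) / 15.3 * 100    # average FPS increase, %
print(f"clock +{clock_gain:.1f}%, fps +{fps_gain:.1f}%")  # clock +14.3%, fps +8.5%
```

A 14% clock increase buying back roughly 8.5% of framerate suggests the card is only partly core-limited at these settings.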
Crysis (2007)


Game Overview:
Crysis launched in November 2007 and quickly became the go-to benchmark title for PC gamers. Built on CryEngine 2, it pushed hardware to the limit with massive draw distances, dynamic lighting, destructible environments, and full DirectX 10 support.

At 800×600, the Sapphire HD2600XT and the 8600GT deliver a smoother experience than the Lenovo card, especially under medium settings where the Lenovo’s 1 percent lows suggest more frequent stutter. High settings widen the gap further, with the Lenovo card struggling to maintain consistency. The Sapphire and 8600GT remain neck and neck, showing that VRAM size alone doesn’t guarantee better performance.

At 1024×768, the Lenovo card shows a strange split personality. On low settings, it posts the highest average of the trio, yet its 1 percent lows trail behind, hinting at instability. Medium and high settings expose its weakness more clearly, with the Sapphire and 8600GT offering steadier frame pacing. The Lenovo’s drop in minimums is especially sharp at high settings, suggesting a bottleneck that isn’t just about memory bandwidth.

The 1024×768 with 4xAA test is where the Lenovo card truly falters. Despite having more VRAM, it posts the lowest figures across the board. The Sapphire card also struggles, but its frame pacing remains tighter. The 8600GT edges ahead, showing that its architecture handles AA more gracefully in this title. The Lenovo’s poor showing here reinforces the idea that something deeper is amiss, possibly driver-level or a quirk in the OEM BIOS.

At 1280×1024, the Lenovo card is overwhelmed. Its performance nosedives, with both minimum and average figures well below the others. The Sapphire card holds up surprisingly well, maintaining playable averages and avoiding the severe dips seen in the Lenovo. The 8600GT continues its trend of consistent pacing, proving that its balance of memory and core efficiency still holds up in this resolution bracket.

Crysis 64Bit (Win7)

All you need to do to play DX10 Crysis is run it on a 64-bit operating system, and the DX10-enabled version of the game begins.
Research suggests that everything needs to be switched to ‘Very High’ to get the actual DX10 experience though, posing something of a problem, as we saw these cards struggle even at lower settings.
Still, it’ll be fun finding out:

VRAM usage peaked at 500MB according to Afterburner; you would think that would mean this Lenovo card would trounce the competition, but… again, no.
It does seem to have overcome the 100MHz clock speed deficit at least, with the 13fps average matched and a 1% low improvement. A quick overclock to 800MHz sees us gain a whole extra frame (well, I always round up a .5, so… less than that).
Both HD2600XTs trounce the 8600GT at these settings, but none are playable.
S.T.A.L.K.E.R. (2007)
Game Overview:
Released in March 2007, S.T.A.L.K.E.R.: Shadow of Chernobyl was developed by GSC Game World and runs on the X-Ray engine. It’s a gritty survival shooter set in the Chernobyl Exclusion Zone, blending open-world exploration with horror elements and tactical combat. The engine supports DirectX 8 and 9, with optional dynamic lighting and physics that can push older hardware to its limits.


At 1024×768 with medium settings and static lighting, all three cards hit the GOG version’s 75fps ceiling, but the 1 percent lows reveal subtle differences in frame pacing. The Lenovo card fills its buffer aggressively, reaching over 400MB of textures, yet this doesn’t translate into smoother delivery. The 8600GT and Sapphire card both maintain tighter frame consistency, suggesting better memory management or driver behavior under static lighting.
When pushed to maximum settings with low AA and static lighting, the Lenovo card shows a clear drop in minimums. Despite its larger VRAM buffer, it struggles to keep pace with the 8600GT, which remains locked at 75fps with minimal dips. The Sapphire card performs slightly better than the Lenovo in terms of average framerate, but both HD2600XTs suffer from noticeable stutter compared to the Nvidia card’s near-perfect pacing.
Dynamic lighting at 1024×768 exposes the architectural limits of all three cards, but especially the HD2600XTs. The Lenovo card’s minimums drop sharply, and the Sapphire card fares even worse. The 8600GT, while not immune to dips, manages a much higher average and a wider margin between lows and highs. This suggests its pipeline handles dynamic lighting more efficiently, even if its raw memory footprint is smaller.
At 1280×1024 with max settings and static lighting, the Lenovo card continues to show signs of strain. Its average framerate is playable but far behind the 8600GT, which again posts a strong showing. The Lenovo’s higher VRAM usage doesn’t appear to offer any advantage here, reinforcing the idea that its bottleneck lies elsewhere—possibly in bandwidth or shader throughput.
Pretty disappointing; I was excited to see Afterburner reporting the big increase in VRAM usage, but it didn’t seem to actually help this OEM card.
Assassin’s Creed (2007)
Game Overview:
Assassin’s Creed launched in November 2007, developed by Ubisoft Montreal and built on the Anvil engine. It introduced open-world stealth gameplay, parkour movement, and historical settings wrapped in sci-fi framing. The first entry takes place during the Third Crusade, with cities like Damascus, Acre, and Jerusalem rendered in impressive detail for the time.


At 1024×768 with mid-range settings, the Sapphire HD2600XT pulls ahead with a smoother experience, while the Lenovo card trails despite similar specs. VRAM usage stayed modest across all cards, never breaching 140MB, which suggests the bottleneck lies elsewhere. The 8600GT lands between the two Radeon cards, offering a balanced mix of frame pacing and average throughput. The Lenovo’s higher lows hint at decent consistency, but it lacks the punch of the Sapphire’s delivery.
Cranking up to full graphical quality and detail while keeping multisampling light, the gap narrows. The 8600GT maintains its lead in average performance, while the Sapphire card continues to edge out the Lenovo in both minimums and overall smoothness. The Lenovo card’s drop in lows is noticeable, suggesting it’s less comfortable when shader complexity ramps up, even if memory usage remains tame.
With post-processing enabled and multisampling maxed, the Lenovo card unexpectedly rebounds in minimums, outperforming the Sapphire in that metric. However, the Sapphire card still holds a slight edge in average framerate. The 8600GT once again proves its consistency, delivering the most stable experience of the trio. The Lenovo’s performance here is curious—perhaps its driver handles post effects more gracefully than expected, or the extra VRAM helps buffer the load just enough to avoid stutter.
A strange game, where the different settings don’t seem to have much of an impact on the results.
Far Cry 2 (2008)

Far Cry 2 launched in October 2008, developed by Ubisoft Montreal using the Dunia Engine, a heavily modified offshoot of CryEngine. Set in a war-torn African nation, it’s a gritty open-world shooter where players take on mercenary contracts amid factional conflict, malaria, and weapon degradation. The game emphasizes realism, emergent gameplay, and environmental storytelling.
Unlike its predecessor, which leaned on scripted spectacle and DirectX 9, Far Cry 2 was built with full DirectX 10 support, showcasing dynamic fire propagation, real-time weather, and destructible foliage.
The Dunia Engine pushed GPU boundaries with long draw distances, volumetric lighting, and physics-driven terrain deformation, making it a benchmark staple for late-2000s hardware. Its commitment to immersive systems, like jamming weapons and dynamic fire, made it a demanding test for the GPUs of its day.

The Lenovo HD2600XT delivers the highest average framerate in this test, outperforming both the Sapphire and 8600GT despite its lower clock speed. This suggests that Far Cry 2’s texture streaming benefits from the Lenovo’s 512MB VRAM, allowing it to maintain smoother performance under load. When overclocked to 800MHz, the Lenovo card extends its lead further, confirming that the combination of extra memory and modest clock tuning yields the best overall experience. The 8600GT holds its own in minimums but can’t match the Lenovo’s consistency.

At higher settings and under DX10, the Lenovo HD2600XT continues to show its strength. While the Sapphire and 8600GT see only modest gains, the Lenovo card maintains a strong average and a wider gap between minimum and average framerates. The overclocked variant again leads, showing that DX10’s heavier memory demands reward cards with larger buffers. The 8600GT’s performance flattens out here, likely due to architectural limitations and memory pressure.

This is the most demanding scenario, and all cards dip into sub-20 FPS territory. The Lenovo HD2600XT at 800MHz still manages to edge out the competition in average framerate, despite a slightly lower minimum than the 8600GT. The VRAM ceiling is clearly being tested, and the Lenovo’s 512MB allows it to sustain higher throughput even when anti-aliasing and HDR are combined. The Sapphire card, constrained by 256MB, falls behind in both minimums and averages. This chart highlights the Lenovo’s resilience under memory-heavy conditions, especially when overclocked.
Dragon Age: Origins (2009)
Dragon Age: Origins debuted in November 2009, developed by BioWare using the Eclipse Engine. Set in the dark fantasy realm of Ferelden, it’s a story-driven RPG where players assume the role of a Grey Warden, tasked with uniting fractured factions to combat a monstrous Blight. Tactical combat, branching dialogue, and moral choices define its rich narrative experience.
While BioWare’s previous RPGs leaned on older engines and simplified rendering, Dragon Age: Origins embraced DirectX 9 with optional DirectX 10 enhancements, including improved lighting and ambient effects.
Built for scalability across a wide range of systems, the game features detailed character models, dynamic shadows, and large-scale battles, balancing cinematic storytelling with strategic depth. It pushed GPUs with high-resolution textures and shader effects while remaining accessible to legacy hardware


The Lenovo HD2600XT holds its own against the faster-clocked Sapphire variant, matching its 1% lows and trailing only slightly in average FPS. Despite the clock disadvantage, the 512MB VRAM ensures smoother asset handling, especially with high texture detail enabled. Compared to the 8600GT, the Lenovo card delivers a far more stable experience, with double the 1% lows and a 26% higher average. This resolution showcases the Lenovo’s balance: not the fastest, but consistently fluid in this game.

Anti-aliasing introduces a memory bottleneck that the Lenovo card navigates better than expected. While both HD2600XTs drop into the teens, the Lenovo’s extra VRAM helps it maintain parity with the Sapphire despite the lower core clock. The 8600GT, meanwhile, suffers a sharp drop in 1% lows, highlighting its limited headroom. Afterburner confirms VRAM usage exceeding 256MB, suggesting fallback to system RAM on the smaller cards, a penalty the Lenovo avoids.

At this resolution, the Lenovo HD2600XT pulls ahead decisively. Its 512MB buffer allows it to absorb the increased texture load without stutter, nearly doubling the average FPS of the Sapphire and outperforming the 8600GT by a wide margin. The 1% lows remain stable, confirming that the card isn’t just faster — it’s smoother. This chart is the clearest demonstration of the Lenovo’s strength: when VRAM limits are breached, it becomes the most capable card in the lineup.
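A rough back-of-the-envelope calculation shows why anti-aliasing eats into a 256MB buffer so quickly. This sketch only counts the multisampled colour and depth buffers; real drivers add resolve targets, padding, and tiling overhead, and all the game's textures still have to fit on top:

```python
def msaa_framebuffer_mb(width, height, samples, bytes_color=4, bytes_depth=4):
    """Rough colour + depth footprint (MB) of a multisampled render target.

    Treat this as a lower bound: resolve buffers and driver overhead
    are not counted.
    """
    pixels = width * height
    color = pixels * samples * bytes_color  # e.g. 32-bit RGBA per sample
    depth = pixels * samples * bytes_depth  # 24-bit depth + 8-bit stencil
    return (color + depth) / (1024 * 1024)

# 4x MSAA at 1280x1024 needs roughly 40MB before a single texture loads,
# versus about 10MB without MSAA - a meaningful chunk of a 256MB card.
fb_4x = msaa_framebuffer_mb(1280, 1024, samples=4)
fb_1x = msaa_framebuffer_mb(1280, 1024, samples=1)
```

Add a few hundred megabytes of texture data on top and the 256MB cards are forced to spill into system RAM, while the Lenovo's 512MB buffer keeps everything local.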
Just Cause 2 (2010)

Just Cause 2 launched in March 2010, developed by Avalanche Studios using the Avalanche Engine 2.0. Set in the tropical nation of Panau, it’s an open-world action game starring Rico Rodriguez, whose grappling hook and parachute enable chaotic stunts and explosive sabotage.
The games tested so far have mostly been DirectX 9 titles with optional bits of DirectX 10.
Just Cause 2, by contrast, was built for DirectX 10: the game features vast draw distances, dynamic weather, and physics-driven destruction, pushing GPU limits with cinematic flair and sandbox freedom.


Performance Notes
High Settings
At high detail, the Lenovo HD2600XT shows minimal gain from overclocking, suggesting the bottleneck lies beyond core speed, likely in shader throughput or memory bandwidth. The extra VRAM doesn’t seem to get tapped, with Afterburner reporting usage under 256MB on all cards.
Medium detail unlocks a clearer benefit from the 800MHz overclock, with the Lenovo card outperforming both the Sapphire and 8600GT. The reduced shader load likely allows the higher clock to stretch its legs, though VRAM usage remains modest despite the 512MB capacity.


Tomb Raider (2013)

Game Overview:
Tomb Raider launched in March 2013, developed by Crystal Dynamics using the Foundation Engine. It reboots the franchise with a gritty origin story for Lara Croft, stranded on the mysterious island of Yamatai. Blending survival mechanics, cinematic storytelling, and environmental puzzles, it redefined the series with emotional depth and visceral action.
Unlike earlier entries that relied on fixed camera angles and simpler rendering, Tomb Raider 2013 was built with full DirectX 11 support, showcasing advanced tessellation, dynamic lighting, and realistic physics.
The game pushed GPU boundaries with TressFX hair simulation, high-resolution textures, and volumetric effects, making it a favorite for benchmarking mid-2010s hardware. Its scalable engine allowed for a DirectX 9 fallback, but the full experience, especially Lara’s reactive animations and weather effects, shines best under DX11; modest improvements can still be found when using DirectX 10 capable graphics cards.

At normal settings with FXAA and anisotropic filtering enabled, the Lenovo HD2600XT trails slightly behind the Sapphire and 8600GT in average framerate. However, its 1% lows remain competitive, suggesting stable frame pacing even under heavier rendering loads. The extra VRAM doesn’t offer a major advantage here, as the resolution and settings stay within the comfort zone of all three cards. This chart shows the Lenovo card holding its ground, but not yet flexing its memory advantage.

Dropping to low settings unlocks more of the Lenovo HD2600XT’s potential. The average framerate climbs to 29 FPS, and the 1% lows improve significantly. While the Sapphire and 8600GT still lead in raw numbers, the Lenovo card narrows the gap and maintains consistent performance. The 512MB VRAM helps absorb texture loads without fallback stutter, and the card’s behaviour here suggests it’s well-suited for low-detail play at this resolution, especially in longer sessions.

At 800×600, the Lenovo HD2600XT finally stretches its legs. With an average of 45 FPS and a 1% low of 35, it delivers a smooth and responsive experience. While the Sapphire and 8600GT edge ahead in raw framerate, the Lenovo card’s performance is close enough to be indistinguishable in practice. The extra VRAM ensures no sudden dips or asset streaming hiccups, making this resolution the sweet spot for the Lenovo card in Tomb Raider (2013). It’s a reminder that OEM doesn’t mean underpowered, just tuned for a different kind of resilience.

Synthetic Benchmarks
3d Mark 2000


| | Nvidia 8600GT | Lenovo HD2600XT | Sapphire HD2600XT |
| --- | --- | --- | --- |
| Score | 29,282 | 32,373 | 33,245 |
3d Mark 2001 SE

| | Nvidia 8600GT | Lenovo HD2600XT | Sapphire HD2600XT |
| --- | --- | --- | --- |
| Score | 35,362 | 30,827 | 30,145 |
| Fill Rate (Single Texturing, MTexels/s) | 2,445.8 | 2,271.4 | 2,264.9 |
| Fill Rate (Multi-Texturing, MTexels/s) | 8,120.6 | 5,325.1 | 6,115.3 |
3d Mark 2003



| | Nvidia 8600GT | Lenovo HD2600XT | Sapphire HD2600XT |
| --- | --- | --- | --- |
| Score | 16,848 | 13,334 | 12,896 |
| Fill Rate (Single Texturing, MTexels/s) | 2,125.4 | 2,020.5 | 1,997.2 |
| Fill Rate (Multi-Texturing, MTexels/s) | 7,752.2 | 5,317.4 | 5,999.8 |
| Pixel Shader 2.0 (fps) | 220.1 | 124.1 | 112.1 |
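Those multi-texturing fill rates line up neatly with the cards’ theoretical limits. Assuming the commonly published specs (16 TMUs at 540MHz for the 8600GT, 8 TMUs for the HD2600XT at 800MHz stock and 700MHz on the Lenovo), a quick sanity check:

```python
def theoretical_fill_mtexels(core_mhz, tmus):
    """Theoretical multi-texturing fill rate in MTexels/s (clock x TMUs)."""
    return core_mhz * tmus

# Assumed specs from public datasheets; the Lenovo runs a 100MHz lower core.
cards = {
    "8600GT":            theoretical_fill_mtexels(540, 16),  # 8640 MTexels/s
    "Sapphire HD2600XT": theoretical_fill_mtexels(800, 8),   # 6400 MTexels/s
    "Lenovo HD2600XT":   theoretical_fill_mtexels(700, 8),   # 5600 MTexels/s
}
```

All three measured figures land at roughly 94–95% of theory, so the 8600GT’s fill-rate win here is baked into the silicon (twice the TMUs), not a driver quirk.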
3d Mark 2006



| | Nvidia 8600GT | Lenovo HD2600XT | Sapphire HD2600XT |
| --- | --- | --- | --- |
| Score | 7261 | 6000 | 6072 |
| Shader Model 2.0 Score | 2733 | 1856 | 1831 |
| HDR/Shader Model 3.0 Score | 2685 | 2551 | 2635 |
Unigine Heaven

| | Nvidia 8600GT | Lenovo HD2600XT | Sapphire HD2600XT |
| --- | --- | --- | --- |
| Score | 2194 | 1505 | 1615 |
| Average FPS | 51.7 | 35.5 | 38.1 |
| Min FPS | 34.5 | 26.9 | 29 |
| Max FPS | 66 | 47.3 | 51 |
3d Mark Vantage

The 3d Mark version that was made for DX10 and Vista. I ran this on the Entry preset and disabled the CPU tests.



| | Nvidia GeForce 8600GT | Sapphire HD2600XT | Lenovo HD2600XT | Lenovo HD2600XT (800MHz) |
| --- | --- | --- | --- | --- |
| GPU Score | 8201 | 7900 | 7226 | 8023 |
The results show a win for the 8600GT, but by a reasonably small margin, at least compared to some of the earlier tests in the suite.
The 100MHz clock speed difference makes a big difference to the Lenovo card’s score; with the slight overclock it beats the Sapphire competition and is only a little under the GeForce score of 8201.
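Putting numbers on that: the 700MHz to 800MHz overclock is about a 14.3% clock bump, and the Vantage GPU score rises from 7226 to 8023, roughly 11%. A quick sketch of the arithmetic:

```python
def pct_gain(before, after):
    """Percentage gain going from before to after."""
    return 100.0 * (after - before) / before

clock_gain = pct_gain(700, 800)     # core clock increase, ~14.3%
score_gain = pct_gain(7226, 8023)   # Vantage GPU score increase, ~11%
scaling = score_gain / clock_gain   # fraction of the clock bump realised
```

Roughly three quarters of the extra clock turns into score, which suggests the Vantage run is largely core-bound, with memory bandwidth soaking up the rest.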
Summary and Conclusions
So that’s the Lenovo HD2600XT, perhaps the only deep-dive this card will ever receive online.
Now, I would have expected a 512MB variant to outperform its 256MB sibling in most titles, even with a 100MHz core clock deficit. The evidence suggests this is not often the case.
The extra VRAM rarely sees full utilisation in earlier games, with Doom 3 standing out as a rare exception, and even then it has its own special mode.
Similar to how we’ve seen 8GB VRAM stagnation in more modern times, games seem to have been designed to top out at 256MB back in the 2000s.
By the time DX10 titles arrive later in the decade, 512MB becomes more relevant, yet the bottleneck simply shifts: the GPU lacks the raw horsepower to turn that memory into playable performance.
Being an OEM card didn’t hold it back dramatically: cooling is sufficient, and thermals remain tame.
Still, the absence of a CrossFire connector is a missed opportunity; it would’ve been fascinating to see how two HD2600XTs scaled together.
As it stands, the Lenovo’s modest performance and understated design likely consign it to a quiet retirement in a collector’s box. A little harsh, perhaps… now I wonder what a 512MB GDDR3 8600GT can do? Hmmm…