So I bought a Titan... A770 (an Arc OC thread!)

NattyKathy

[H]ard|Gawd
Joined Jan 20, 2019 · Messages 1,483
Yep, an Intel Titan! GG Sparkle on the dubious naming ;-)

IMG_20231117_190530937~2.jpg

I've been wanting an A770 since before it was released and with this model bringing the 16GB version below $300 it is time to give Arc a try. Why this potato when I already have a 3090Ti and a 7900XT in my main rig plus a 3070Ti sitting unused?

abde616a85e9ce1640f88d20e25f3414.jpg


Seriously, I just like GPUs! Each different architecture is a new overclocking and optimization adventure. I have some ideas about power limit hardmodding I want to test, that would also be applicable to Radeon GPUs. And of course this thing is gonna get a DIY hybrid mod since I apparently can't get enough of those.

Here's to silly hardware purchases! I feel like one of those Car People who always has a new project despite owning a perfectly good daily driver :-P
 
Took a little while to get everything to play nice with this terrifying temporary setup involving a PCIe riser cable, a stack of boxes, and oh yeah, three GPUs.

IMG_20231118_033519405~2.jpg

Some UEFI and Windows settings needed changing, but hooboy, it's going now! I've said it before, but Win11 has been good IME with multiple GPUs- pinning actually works.

Power consumption on the 8-pins is capping out around 300W with maxed OC settings in Arc Control. I'm assuming it's not drawing some heinous amount of current from the PCIe slot, so I'd guess that's close to the TBP.

IMG_20231118_051045735~2.jpg

Needs more power, there's more voltage and clock headroom for sure. Max OC I got so far is ~2800MHz at ~1.18V (edit- this seems dubious now based on some further tweaking, I think there was some sneaky power throttling going on there) but it wants to go higher.

Despite the card hanging off chipset lanes at 4x4, I already managed to break 15K GFX in Time Spy, not bad.

I did some actual gaming too- Cyberpunk with LOTS of texture/LOD/script mods that hit GPU/VRAM/CPU. At 3440x1440, XeSS Quality, Ultra settings, the OC'd A770 is doing 60-70fps when GPU-bound. The game is CPU-bound a lot of the time because of the mods, & any drops below 60fps weren't the GPU's fault.

VRAM usage in Cyberpunk with all the texture mods is around 12-13GB most of the time which kinda vindicates the 16GB memory, and makes me curious to do a head-to-head vs my 3070Ti with its 8GB.

Damn fun GPU so far, glad I finally got one to try.
 
Today I pulled the heatsink to check out the PCB and was greeted with a surprise- this is in fact a slightly different PCB than the one used on the Sparkle Titan A750 that TPU reviewed.

IMG_20231118_184741346~2.jpg

This A770 has an 8-phase VRM with Monolithic Power Systems 70A Power Stages instead of the 6-phase with Intersil 90A Power Stages on the A750. Fortunately there is a complete public datasheet for the MP86956 components, which may come in handy. There are also shunt resistors present, something that is optional and unused on the A750 version of the board. Finally, the through-hole polymer "can" capacitors used on the A750 version for VCore output filtering are absent. In fact, there are no caps of that type at all on this board- everything is SMD polymer or ceramic. The memory VRM config appears roughly the same as on other ACM-G10 boards, nothing interesting there.

IMG_20231118_185404366~2.jpg

Moving to the back of the board, we can see a good number of SMD polymer capacitors for output filtering- this is similar to the reference config on Intel's "Limited" Edition cards and should perform better than the through-hole caps on Sparkle's A750 model. There isn't much input filtering, however... Maybe something to add, as there are spaces for a few SMD caps on the 12V side.

We can also get a look at the power management controller- in this case an MPS MP2975 instead of the Renesas RAA229001 on Sparkle's A750 version. There is no public datasheet for the MP2975, but there is one for the MP2965, which according to an MPS roadmap is the direct predecessor and should be near-identical except for supporting 8 phases on the '75 instead of 7 on the '65.

Now, my initial plan was to mod the output current sensing on the controller. For those not in the know, modern Smart Power Stages can measure their output current by essentially using their inductor as a shunt resistor. This measurement is sent out as a proportional analogue signal to the controller, which monitors the individual phases for load-balancing as well as summing the signals into a single pin for overall current sensing. This is how AMD GPUs monitor their power- and NVidia GPUs have the capability too, but for some reason NV continues to rely on shunts. From what I've gathered from SkatterBencher's A770 World Record article, the inductor current sensing method is used on Arc cards. The good news is, it turns out a hardmod may not be needed at all here- more on that later.
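For anyone who wants to picture what the controller is doing with those per-phase signals, here's a rough back-of-napkin sketch. The 5µA/A monitor gain and the per-phase currents are made-up illustration numbers, not anything out of an MPS datasheet- it's just the shape of the arithmetic.

```python
# Rough sketch of how summed per-phase current telemetry turns into a power number.
# The gain and the phase currents are illustrative placeholders, not datasheet values.
IMON_GAIN_UA_PER_A = 5.0                                # assumed monitor gain, uA per A

phase_currents_a = [28, 27, 29, 28, 26, 27, 28, 27]     # hypothetical per-phase load (A)

# Each power stage reports a small current proportional to its own output current...
imon_signals_ua = [i * IMON_GAIN_UA_PER_A for i in phase_currents_a]

# ...and the controller sums them on one pin to get total output current,
# which can then be multiplied by VCore to estimate power on the core rail.
total_current_a = sum(imon_signals_ua) / IMON_GAIN_UA_PER_A
vcore = 1.18                                            # volts
print(f"total ~{total_current_a:.0f} A -> ~{total_current_a * vcore:.0f} W at {vcore} V")
```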

Back to the shunt resistors. If the GPU is monitoring its power from the summed phase inductor current, why are there shunts as well? I don't have an answer for that but let's mod 'em anyways.
 
So funny story, I lost my box of 8mOhm shunts. No idea where I put it. But that's okay, because it seems the stock shunts on ACM-G10 boards are a minuscule 1mOhm, and stacking 8mOhm on top of 1 won't make a major change to the voltage drop. So I used my wire method instead. I don't remember if I posted about this here or just on OCN, but I successfully used this method on my 3070Ti for a while before switching to proper stacked SMD shunts.
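Quick sanity check on why stacking 8mOhm over 1mOhm is barely worth the solder- rough numbers, assuming a simple parallel combination and that whatever reads the shunt scales straight off the voltage drop:

```python
# Why stacking an 8 mOhm part on a 1 mOhm stock shunt barely moves the needle.
r_stock = 1e-3      # stock shunt, ohms
r_stack = 8e-3      # stacked shunt, ohms
r_parallel = (r_stock * r_stack) / (r_stock + r_stack)   # ~0.889 mOhm
reduction = 1 - r_parallel / r_stock                      # ~11% less voltage drop
print(f"{r_parallel * 1e3:.3f} mOhm, drop reduced by {reduction:.0%}")
# i.e. whatever reads this shunt would under-report current by only ~11% -- not much of a mod.
```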

IMG_20231118_202211664~2.jpg

Yeah it's ugly. Don't care.

The reason that "wiremod" shunt modding tends to not work so great is that usually when people try this, they use like 0.5cm of the thickest wire they can find, which results in a voltage drop across the shunt so low that the card can struggle to properly monitor power at all. Remember, the point of a shunt is that there's some measurable voltage drop present. So the key is to use thinner wire, and more of it. I use 22ga solid-core copper wire for this. In this case, I used 8cm lengths, which should result in around 4mOhm resistance (I didn't check this time, but for the 3070Ti mod I actually measured the voltage drop across the wire using a benchtop supply). And yes, 22ga wire is okay for this- per simulations and tests I did for the 3070Ti mod, the max current that will be flowing across the shunt wires is around 5A. A bit high for 22ga but not "melty wire" territory.
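If anyone wants to check the math on the wire, here's the napkin version- just the standard resistivity formula plus the ~5A worst case carried over from the 3070Ti testing. The 0.644mm diameter for 22ga and the copper resistivity are textbook values:

```python
# Sanity check on the "wiremod" numbers: 8 cm of 22 AWG solid copper across the shunt.
import math

rho_cu = 1.68e-8              # resistivity of copper, ohm*m (room temp)
d = 0.644e-3                  # 22 AWG diameter, ~0.644 mm
area = math.pi * (d / 2) ** 2 # cross-section, m^2
length = 0.08                 # 8 cm

r_wire = rho_cu * length / area
i_max = 5.0                   # worst-case current through the wire, per the 3070Ti testing

print(f"R ~ {r_wire * 1e3:.1f} mOhm")                        # ~4.1 mOhm
print(f"drop @ {i_max} A ~ {r_wire * i_max * 1e3:.0f} mV")   # ~21 mV
print(f"dissipation ~ {r_wire * i_max ** 2:.2f} W")          # ~0.10 W, nowhere near melty
```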

Okay! So I've shunt modded the card! What did that accomplish?

F*ck if I know. Absolutely nothing changed. I wasn't expecting this to affect the core current sensing that guides the PL1/PL2 limits since, as previously stated, that's done more directly. I think the shunts may have something to do with PL4? Although PL4 has been described along the lines of "an absolute max peak current limit", from my observations it seems more complex than that. It may be more like EDC/TDC on AMD- a made-up number that's calculated from a number of factors.
 
Yep, an Intel Titan! GG Sparkle on the dubious naming ;-)

View attachment 614300

I've been wanting an A770 since before it was released and with this model bringing the 16GB version below $300 it is time to give Arc a try. Why this potato when I already have a 3090Ti and a 7900XT in my main rig plus a 3070Ti sitting unused?

View attachment 614315

Seriously, I just like GPUs! Each different architecture is a new overclocking and optimization adventure. I have some ideas about power limit hardmodding I want to test, that would also be applicable to Radeon GPUs. And of course this thing is gonna get a DIY hybrid mod since I apparently can't get enough of those.

Here's to silly hardware purchases! I feel like one of those Car People who always has a new project despite owning a perfectly good daily driver :-P
It's a hobby and why you're here on the [H]. You shouldn't be so embarrassed of your socially shunned hobby as to post a Simpsons potato meme to self-justify. Just own it :woot:
 
Ok, so in a previous post I mentioned that hardmodding the Power Management Controller's current sensing (or using an EVC2 to accomplish the same thing with a firmware offset) may not really be needed. What's up with that?

Yeah so there's a hilarious software trick that can set Arc PL1/PL2 to completely arbitrary values.
Like recent Radeon GPUs, Arc cards get hardware settings through a driver-firmware interface rather than having them hardcoded into the VBIOS. Normally there are software limits on how far PL can be raised, but Acer left a fun backdoor in their tuning software for their BiFrost A750/A770.

PL420Lol.jpg


PL420.jpg


Lol. Lmao even. It's worth noting that the Frequency & Voltage settings in the PredatorBifrost config file don't seem to actually do anything, but the PL adjustment does work, as evidenced by an increased power draw being reported by the TG WireView I've connected- as high as ~360W under certain conditions, 20% more than stock. Hilarious, and useful.

Unfortunately, though, that's not the end of the story. Arc Control OC settings and PredatorBifrost OC settings are mutually exclusive, and Bifrost seems unable to tune Frequency & Voltage, which means that to keep the high PL active alongside an actual OC, another method is needed.

Enter Arc OC Tool from Shamino, via SkatterBencher.

A770-ArcTool.jpg


This tool allows Frequency & Voltage to be locked in, without limits on what can be set. Super useful, love it. Note that even though Frequency & Voltage are unlocked using this method, Power is not unlocked and will cap at whatever the Arc driver says. Hence the use of PredatorBifrost alongside it to raise PL1/PL2.

So all good, right?

Welllll, not quite. Arc GPUs are f*ckin weird and have many invisible limits going on in the background. The main limitation I'm running into currently is that if the requested voltage goes too high (around 1.2V seems to be the limit), the core clocks will take a big dump. Problem is, we really need higher than 1200mV to keep increasing core clock. The settings in the above screenshot are my current best. That voltage setting results in just barely under 1.2V, which is right on the edge of throttling the core frequency- the 2800MHz I'm requesting turns into around 2700-2750MHz under load. I dare not go any lower on voltage because that kills stability, so I'm at a bit of an impasse. I will do a little more tuning within these constraints as I'm tantalizingly close to cracking the top 50 in A770 TS GFX (currently #53!), but for a truly legendary OC I'll need some extra tricks.

So what's next? Likely voltmodding. The PredatorBifrost trick really does do the thing for raising PL1/2, and whatever the Shunt Resistors are doing is being "adjusted" by my wiremod, so as of now I'm no longer power limited anyways.

What I'll probably try is messing with the MP2975, specifically the voltage feedback pin. Forget I2C- we're doing this Buildzoid style.

My notion is that if I can give a bit of an offset on the actual Vcore output (perhaps 100-200mV), then I can set well under 1.2V in software to keep the driver happy and prevent clock throttling, while still having the actual voltage required to reach for 3GHz. This is basically what SkatterBencher did for his World Record LN2 OC, except he did everything through an EVC2.
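To give an idea of the numbers involved, here's the generic regulator-feedback arithmetic- nothing MP2975-specific, since the actual sense network still needs probing. If the feedback path only "sees" a fraction k of the real Vcore, the controller regulates the rail to Vset/k:

```python
# Generic feedback-skew arithmetic, not tied to the MP2975's actual pinout.
def boosted_vcore(v_set: float, k: float) -> float:
    """Real rail voltage when the feedback path is attenuated by a factor k (0 < k <= 1)."""
    return v_set / k

v_set = 1.15   # what the driver is asked for (safely under the ~1.2 V throttle point)
for k in (1.00, 0.92, 0.85):
    print(f"k = {k:.2f}: software sees {v_set:.2f} V, rail is ~{boosted_vcore(v_set, k):.2f} V")
# k = 0.92 -> ~1.25 V, k = 0.85 -> ~1.35 V: roughly the 100-200 mV offset I'm after.
```

So a fairly modest skew gets the offset I want without the driver ever seeing a request above ~1.15V.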

Stay tuned!
 
I love the blue pcb.... So sick of plain black ones! :D
Gigabyte and later Intel and ABit used blue PCBs back about 10-15 years ago. Those were the days, back when motherboards were priced much more sanely...

Glad you're having fun with your potato, OP. I planned on getting a 16GB A770 myself at some point, even though my 3080 will give it a hearty spank. Last time I played around with an Intel GPU was when they were still called 3D accelerators... Heh. Even though it was late to the party, I remember the i740 having really good image quality, near that of a reference rasterizer; easily on par with 3dfx. It was just a generation behind at that point in performance; roughly as fast as a Voodoo Graphics 4MB board in the days of the Voodoo 2 - needless to say it wasn't a hit.
 
Gigabyte and later Intel and ABit used blue PCBs back about 10-15 years ago. Those were the days, back when motherboards were priced much more sanely...
Yep, had an SLI pair of Gigabyte GTX 670s that were blue :D.
Ah, yeah I remember the blue Gigabyte PCBs. I had a couple of Giga 970 mobos for AM3... Kinda terrible boards tbh, but they were super affordable! I think I briefly had a Radeon HD7870XT (technically a HD7930) with a blue PCB... No recollection of the brand. May have been Gigabyte.

Glad you're having fun with your potato, OP. I planned on getting a 16GB A770 myself at some point, even though my 3080 will give it a hearty spank. Last time I played around with an Intel GPU was when they were still called 3D accelerators... Heh. Even though it was late to the party, I remember the i740 having really good image quality, near that of a reference rasterizer; easily on par with 3dfx. It was just a generation behind at that point in performance; roughly as fast as a Voodoo Graphics 4MB board in the days of the Voodoo 2 - needless to say it wasn't a hit.
I think I could definitely recommend the A750/A770 at this point! The driver situation seems much better according to everyone who has tested one, and I haven't experienced any weird fuckery from the Arc Control app like what was happening at launch. Interesting cards. Lacking in shader power, but in some ways they're much more high-end than their basic gaming performance lets on. Memory bandwidth is a strong point, and ACM-G10 is absolutely loaded with TMUs and ROPs. RT performance is quite good too for the price segment. I think the 16GB A770 will have some interesting edge-case uses, and the A750 being so cheap now actually makes the lower model a compelling budget option for gaming/general use.

I hope Intel sticks with it... A future with a real third GPU option in the midrange segment would be a bright one. The A770 was almost there. If it had hit a little earlier and been just a little faster at stock, it would've been an amazing alternative to the 3070 / 6700XT. As it is, only the A750 is really worth buying for normal people imo, but the A770 is pretty cool for weirdo hardware enjoyers like us.
 
More modding!

The coldplate was deeply grooved, so I hit it with some 800-grit taped to a flat block. Looks bad, but it's flatter and smoother now.
IMG_20231119_183601488_HDR~2.jpg

Sparkle has apparently joined the "we don't believe in X-Brackets" club and the PCB was bending which had me concerned about contact and pressure. I made an ABS shim so the heatsink can be bolted all the way thru the backplate. Did this on my 7900XT previously and it helped, but since N31 uses almost the exact same hole pattern as LGA1700 I gave the Radeon a random Intel X-bracket from an AiO kit when I hybrid modded it. That's not an option for Arc though since Intel used this weird w i d e hole pattern. It's almost AM4 dimensions o_O
Gave it some thermal pads on the rear to get heat into the backplate. They look used because I stole them from the 7900XT backplate that I 86'd for Reasons.

IMG_20231119_191631343~2.jpg

Liquid metal time! And U-6 Pro for the VRAM which is highly-regarded over on OCN for helping mounting pressure, got the same stuff on my 7900XT. On the right we can see I added an I2C breakout cable in case I want to softmod. I've thought more about voltage hardmod and I'm not sure I want to commit to that for a card that I'm actually gonna use.

IMG_20231119_195734413~2.jpg

Put all back together, we get a look at my horrible genius heatsink mounting solution. The PCB is not bent anymore!

IMG_20231119_202250493~2.jpg
 
She's runnin' good. All of today's mods resulted in nearly a 10C drop in core temps! Great news, because this thing really does not like going over 80C, and staying below 70C is better. Currently barely reaching 70C in Time Spy whereas it was hitting 80C before, nice.

Currently #8 for A770 TS GFX! Not bad for an aircooled board, that's faster than most 3070 TS scores and comfortably in 3070Ti / 6800 territory :sneaky:. I'm so close to getting #7 but have been hitting a wall.
NumberEight.jpg
 
The other day I yeeted the 3090Ti because it needs an overhaul (AiO is slowly dying) which let me get the A770 on gen4x8 CPU lanes. Everything so far was on gen4x4 chipset lanes which was probably holding it back.

IMG_20231123_064216643_HDR~2.jpg

I did try mounting it in the case without the riser and thermals were horrendous. With this cooler, the card would have to be the only expansion card and in the top slot to have good thermal performance on a major OC. Would probably be fine in the stock ~230W power range tho.
(also behold the sad, power-limited 7900XT in the background, envious of the A770's OC gainz)

With better PCIe connectivity and some CPU tuning, the "Titan"'s Time Spy score is through the roof.

Screenshot_20231123-065219~3.png

Fastest aircooled A770 on the Time Spy GFX chart currently! AFAIK all of the >16K results out there are on LN2.

Hybrid Mod soon maybe? Though I'd almost feel bad ditching the stock cooler, I think it's actually kind of nice and appropriate for this card going at more reasonable power consumption levels.
 
$269.99 right now at Newegg
A good buy at that price IMO as long as one is ready to be patient with potential issues given what it is.

Early gaming impressions:
- Quiet at stock settings or mild OC
- No obvious driver issues so far
- Raytraces very well for the price range
- Quality of RT denoising is excellent, better than AMD
- The Arc-exclusive version of XeSS 1.2 scaling looks phenomenal (at least in Cyberpunk 2077, IMHO, YMMV). It's better than the generic DP4a version, at lower resolutions especially it's a whole different class than FSR 2.x, and depending on quality level it can (IMO) match DLSS 3.x
- 16GB of VRAM enables use of high-rez texture packs/mods that 8GB cards struggle with, even in this performance segment.
 
I've been using the same card for maybe 2 months now. It's picking up speed in the driver area, with the 4972 driver helping fix the Starfield patch. The VRAM was the selling point for me at that price, with the hope of going where my RTX 3070 can't with its 8GB limit. Too bad I couldn't take those Samsung memory chips and hot glue them onto my RTX 3070, which is also Samsung.
 
I'm not sure if these Intel cards are good for gaming as of now, but at this price they seem good for Blender rendering. With Resizable BAR enabled it does speed up the rendering process, at least in Windows. I don't think Nvidia and AMD have this capability enabled in Blender yet.
 
I'm not sure if these Intel cards are good for gaming as of now, but at this price they seem good for Blender rendering. With Resizable BAR enabled it does speed up the rendering process, at least in Windows. I don't think Nvidia and AMD have this capability enabled in Blender yet.
They are significantly better at DX9 and DX11 than at launch, but really in games that utilize RT the A770 can provide fairly compelling performance for the price.
 
Some updates on this...

The stock cooler is really good, but of course I want MORE. Thusly,
IMG_20240315_194142864_HDR~2.jpg


IMG_20240316_061306686_HDR.jpg


IMG_20240317_011625081~2.jpg


With some 3D printed parts, it's now liquid-cooled :cool:

Temps are nice and cool now, not passing 50C on the core and low 70s on VRAM and VRM, even with the raised power limit.

I've got an EVC2 hooked up to it for voltmodding, but I need to sort out the software profile to get the EVC and PWM controller communicating properly.
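For the curious, what I'm ultimately after is just standard PMBus traffic to the controller. Purely as an illustration of what that conversation looks like, here's a sketch using Python's smbus2 on a generic Linux I2C bus- the bus number, the 0x20 address, and the assumption that the MP2975 answers the standard READ_VOUT/READ_IOUT commands are all placeholders, and the EVC2 itself talks through Elmor's own software rather than anything like this.

```python
# Hypothetical sketch of standard PMBus telemetry reads from a VR controller.
# Bus number, device address, and decoding are placeholders- check the actual
# controller's documentation before poking anything on a live card.
from smbus2 import SMBus

I2C_BUS = 1          # assumed Linux I2C bus number
VR_ADDR = 0x20       # assumed 7-bit PMBus address of the controller
READ_VOUT = 0x8B     # standard PMBus command codes
READ_IOUT = 0x8C

with SMBus(I2C_BUS) as bus:
    raw_vout = bus.read_word_data(VR_ADDR, READ_VOUT)
    raw_iout = bus.read_word_data(VR_ADDR, READ_IOUT)
    # Real decoding depends on the part's VOUT_MODE / LINEAR11 format,
    # which is exactly the documentation that's missing here.
    print(f"VOUT raw = {raw_vout:#06x}, IOUT raw = {raw_iout:#06x}")
```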

Still having a great time with Arc tuning!
 
Nice, I like the repurposed m2 cooler. :D

I may need a similar (but shorter) solution for my low profile A380. It's right up against the PSU, and the cooler has less than ~8mm on all sides, with the fan removed, and no vents above it. Maybe okay with light load, but not ideal.
 
Nice, I like the repurposed m2 cooler. :D
Thanks! And good eye ;-) I also have an M.2 water block on hand, but the heatsink + baby Noctua seems sufficient so far.

I may need a similar (but shorter) solution for my low profile A380. It's right up against the PSU, and the cooler has less than ~8mm on all sides, with the fan removed, and no vents above it. Maybe okay with light load, but not ideal.
Sounds like a tight fit! Funnily enough, there are actually full-cover waterblocks available for some A380 models. Not the cheapest option, but might be something to look into.
 
Not for the LP cards, ASRock or Sparkle. I have a Koolance GPU-210, but squeezing a pump and rad into the case I have would eliminate aircooling the CPU as an option, and being over the CPU would limit rad height to under 60 mm or so, and at just 120 mm square. It's really not a wc case. lol

It might be fine if I remove the 3mm brace from above the PCIe slot, though. I have two 80 mm case fans pulling air in, and a 120 pushing air out the top. I may have to flip those all around, but one way or another the heat's coming out. Just don't know if it'll pull it away fast enough.
 
Nice, I like the repurposed m2 cooler. :D

I may need a similar (but shorter) solution for my low profile A380. It's right up against the PSU, and the cooler has less than ~8mm on all sides, with the fan removed, and no vents above it. Maybe okay with light load, but not ideal.
Those are nothing new, they just used to call them motherboard VRM coolers 😂
 
So softmodding voltage with the EVC ended up being a dead end without a full list of MP2975 I2C registers, but I was successful with a hardware voltmod after some probing to find the right pins.

IMG_20240321_230312697_HDR~2.jpg


A 1.1x voltage offset can be applied with the push of a button.

IMG_20240322_022959187~2.jpg


Now VCore can be raised to around 1.3V before the driver clock-throttles, instead of the usual 1.18V.
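For the record, the arithmetic on why ~1.3V is the new ceiling- the driver still balks when the requested voltage passes ~1.18V, but the mod scales what the VRM actually puts out:

```python
# Quick check on the new effective VCore ceiling with the 1.1x offset engaged.
driver_ceiling_v = 1.18   # highest request before the Arc driver starts clock-dumping
offset = 1.10             # multiplier applied by the hardware voltmod
print(f"real VCore ceiling ~ {driver_ceiling_v * offset:.2f} V")   # ~1.30 V
```

Which also lines up with the ~1.29V actually measured below- RTSS just keeps reporting the pre-offset number.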

The result is lots of power consumption and a 2950MHz OC.

IMG_20240322_030751382~2.jpg


Even stable enough to (briefly) run Cyberpunk, where power consumption is less extreme than Time Spy

A770-2950Mhz-Cyb~2.jpg


The GPU voltage readout in RTSS won't take an offset for some reason; the actual value was ~1.29V.

3GHz feels within reach!
 
NattyKathy, you should change your username to Doctor Frankenstein. Nice to know it can handle 433 watts and not melt.
 
Those are nothing new, they just used to call them motherboard VRM coolers 😂
Maaaan, the good ole days when a Mobo heatsink had heatpipes and fins and wasn't just a slab of cast Al.
The "real" heatsinks on my X570S Aorus are a breath of fresh air in these undercooled times.
 
Very cool stuff here.. and love that Wir(e)View power adapter!
Thanks!
Yeah the WireView is a great Thermal Grizzly gadget, very handy to have around.

NattyKathy, you should change your username to Doctor Frankenstein. Nice to know it can handle 433 watts and not melt.
It's alive! It's aliiiivvveeee! Muahahahaha
The VRM on this board is pretty robust for what it is, tho the active VRM cooling is absolutely needed here.
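Since 433W came up, here's a quick back-of-the-envelope on the Vcore VRM headroom. It pessimistically pretends all board power goes through the core rail, which it doesn't- VRAM, fans, and conversion losses take a cut- and it leans on the 8x 70A stage count from the PCB teardown earlier in the thread:

```python
# Back-of-envelope on why 433 W didn't melt anything: 8x 70 A MPS stages on VCore.
stages = 8
stage_rating_a = 70        # per-stage continuous rating (70 A power stages, as identified earlier)
board_power_w = 433        # peak reported board power
vcore = 1.3                # roughly the modded rail voltage

vrm_capacity_a = stages * stage_rating_a        # 560 A combined
worst_case_core_a = board_power_w / vcore       # ~333 A if ALL of it were core current
print(f"{worst_case_core_a:.0f} A of {vrm_capacity_a} A -> "
      f"{worst_case_core_a / vrm_capacity_a:.0%} loaded")
```

Plenty of electrical margin on paper- the thermal side is why the active VRM cooling matters.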
 