
flicker @ 1680x1050 32b!

Aug 28, 2005 Invader J
Hi all,

Got a dual 2.7GHz G4 with an X850 XT card. Trying to run the game at 1680x1050 with 32-bit color results in constant flicker in-game and corrupts everything onscreen after I quit. I've tried running with all graphics options on, but no combination of on/off settings seems to do the trick. The game runs fine at 16-bit (and at lower resolutions), but of course things look washed out, especially dark colors.

Tried the new ATI 4.5.5 update, but that didn't help. Running the latest Tiger build.
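
For what it's worth, a quick back-of-the-envelope check suggests VRAM itself shouldn't be the problem (the X850 XT has 256 MB):

    1680 x 1050 pixels x 4 bytes/pixel ≈ 7 MB per color buffer
    double-buffered, plus a 32-bit depth buffer ≈ 21 MB total

So it feels more like a driver or display-mode issue than the card running out of memory.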

Any ideas? =(
Aug 28, 2005 greengeek
Strange, my PowerBook G4 can run at that resolution just fine (on my external 20" LCD). It has an NVIDIA FX5200 with 64 MB of VRAM, so I'd think your machine should be able to handle it. I'm not aware of any known issues with specific video cards.

Are you running more than one monitor? Are there other programs running in the background that might be trying to use the video card at the same time?
Sep 08, 2005 mgl_mouser
Er… that would be a dual G5, right?

I'm running a dual 2GHz G5 w/ a Radeon 9600 Pro (64 MB) and everything runs just dandy.

Go into options and start reducing some of the graphics detail. One big hog is the Full Screen Illumination effect. Start there and see how that goes.
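
If you'd rather flip it outside the game, the same toggle should live somewhere in config.ini. Something vaguely like this, though I'm going from memory and the real key name may well be different:

    ; illustrative only, the actual key name in config.ini may differ
    dofullscreenillumination=0

Check your own file for whatever the option is actually called.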
Sep 11, 2005 roguelazer
Is vsync on? Are texture depth and color depth the same (32-bit)? Why not just post your config.ini?
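
If you do post it, the lines worth eyeballing are probably something like this (key names from memory, so treat them as approximate and check against your actual file):

    ; illustrative sketch, real key names in config.ini may differ
    xres=1680
    yres=1050
    cdepth=32      ; color depth
    tdepth=32      ; texture depth, try matching it to cdepth
    dovsync=1      ; try toggling this both ways

The point is just to confirm texture depth matches the 32-bit color depth and to see whether vsync makes any difference.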