Post by Cobra King Mohamdu on May 15, 2020 12:31:28 GMT -5
In this thread we preemptively hype up the most powerful consumer-grade graphics card that will start the decade off right. First of all, I'd like to open this post by saying to all of you who bought a 2000 series GPU: I'm sorry.
No really, I'm sorry you fell for the 2000 series meme. By the end of the year your graphics cards will not only be significantly outdated, but their ray tracing capabilities will be functionally irrelevant. Even the lowest-end 3000 series GPUs will outpace them. These new GPUs are set to be a significant step up from their predecessors. 4K gaming at 120+fps with RTX enabled will become not only feasible, it will become normal on the high-end 3000 series cards. Ray tracing now apparently causes significantly fewer framerate drops. I will, however, thank you guys for beta testing DLSS and RTX for us 3000 chads. At this point, the RTX 3080 Ti is rumored to have 21 Tflops of power as well as negligible performance drops with RTX on.
All hail Jensen "Huge Wang" Huang, our Lord and Savior. 4K @ 120+fps RTX on is no longer a meme.
Post by Cobra King Mohamdu on May 24, 2020 5:04:49 GMT -5
Honestly I was shocked at how well the LG C9 ran Modern Warfare and Doom Eternal using only my old GTX 1070. That G-Sync really is something. I was running the render resolution at 4K and it was still hitting close to 120fps with most settings turned up, a few things turned off, and RTX off to keep the framerate at 120. The rumored RTX 3080 Ti was half the reason I was so obsessed with getting the LG C9, since you get HDMI 2.1 with G-Sync enabled on an OLED TV. All I need now is that 25ft HDMI 2.1 8K@60Hz/4K@120Hz cable, and I'll be fully prepped for Ampere.
I should really get a 144Hz 1440p monitor, but most of my time is spent on work rather than gaming, and I need the pixels.
I would never go back to 60Hz monitors. Got my 144Hz monitor in 2016 and it basically turned me into a hardcore framerate zealot. No one can ever convince me anything matters more than framerate. Kadeem couldn't understand why, right after spending thousands of dollars on Kinoberus, I had basically zero interest in putting the settings on max, even just for kicks. I did it for like 2 minutes to humor him, got bored of the dogshit framerate real fast, then immediately got pumped as fuck to turn all the settings down low because I could run Modern Warfare at a rock-solid 120+ fps. Seriously, I can't wait for these new GPUs with HDMI 2.1 to drop so I can run Modern Warfare on my TV at 4K@120fps with ray tracing enabled. And if for some reason even the top Ampere card can't do that, I'll fine-tune the settings so that it's always 120fps while looking decent. 4K graphics are entirely meaningless to me without a high framerate.
I would love to ditch my shitty HP Envy 27s for something higher framerate with HDR support but I can't justify it while I'm still running an RX 580.
just render at 1080p bro. and if your gpu is still too trash to push 120+fps, just lower it until it is. when you get fully framerate-pilled like me you will realize that buttery smooth 720p@144Hz is still better than 1440p@60Hz. That, and having a 144Hz monitor will have you fully prepped for when the Ampere GPUs drop, cause you will cream your pants the first time you get to play those games at a sexy high framerate and resolution.
Post by Cobra King Mohamdu on Jun 16, 2020 1:25:45 GMT -5
That's fair enough. I think in FPS games specifically, tracking speed is the most important factor for increasing your kills, so I'm fine with rendering games at 1080p, especially in a twitch-reaction game.