
What video card can I go up to without my FX-6300 bottlenecking it?

#21 Hi C | Posted 7/21/2014 5:37:37 AM
ThisGuy101 posted...
Is it a smart idea buying a used GPU from Amazon?


It's never a good idea to buy a used GPU, or used hardware in general. That goes triple for used AMD GPUs.
#22 Hi C | Posted 7/21/2014 5:44:02 AM
Freedan12 posted...
ThisGuy101 posted...

AMD FX-6300 | Gigabyte GA-990FXA-UD3 ATX AM3+ | Kingston HyperX 8GB DDR3-1600 | MSI Radeon R9 270X 2GB TWIN FROZR.

Don't fall into the pc money trap.

If the games you want to play will run at settings acceptable to you on your current rig, ask yourself why you need an upgrade.

Watch this youtube vid before making up your mind.
www.youtube.com/watch?v=8TiRg5X7OM0


Exactly, that's why either AMD or Intel is fine.

"Holy s*** my 6300 bottlenecked my 290x down to 100fps instead of 120 at 1080p guess I need to upgrade to a 4790!"

Most of the upper-mid to high-end hardware on the market now will die of natural causes before you're forced to upgrade.
#23 KURRUPTOR | Posted 7/21/2014 6:51:35 AM
DarkZV2Beta posted...
8350 isn't any better off, as multithreaded performance isn't the issue.


The 8350 and even the 6300 are going to be fine for the vast majority of games out there. Plus, more and more games are actually using more cores now, which is bumping AMD benchmarks up without AMD even doing anything. In reality AMD CPUs are the stronger processors, but many programs just haven't been coded to use them to their utmost. Like I said, though, more and more programs (including games) are finally being written to use as many cores as your CPU has to offer.

People look at stupid graphs where they play a game at 1024x768 with all the graphics options set to minimum, and Intel will get 200 fps while AMD gets 180 fps. Both are gross overkill, and the difference doesn't matter. People play games at 1080p or higher these days and generally turn some graphics options on; in real-world gaming your GPU is going to be your frame limiter in 95% of games.

TC: I would just stick with your CPU and get a better GPU; it will give you a huge performance increase in almost every single game you play. If you upgrade your CPU you won't get any sort of gaming increase in anything but strategy games. Then again, if you are a strategy-game junkie, then there is a legit reason to spend (a lot) more by going Intel. Normally, though, it's just a straight-up waste of your money.
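The low-res-benchmark point can be sketched with a toy frame-time model: per frame, the slower of the CPU and GPU stages sets the frame time, so minimum-settings benchmarks expose CPU differences that realistic settings hide. All timings below are invented for illustration.

```python
def fps(cpu_ms, gpu_ms):
    """Frames per second when the slower stage limits each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# 1024x768 on minimum settings: GPU work is trivial, so CPU speed shows through.
low_res = fps(cpu_ms=5.0, gpu_ms=2.0)     # CPU-bound: 200 fps

# 1080p on high settings: GPU work dominates and CPU differences vanish.
high_res = fps(cpu_ms=5.0, gpu_ms=14.0)   # GPU-bound: ~71 fps

print(low_res, high_res)
```

In the second case a faster CPU (say 4 ms instead of 5 ms per frame) changes nothing, since the GPU's 14 ms still sets the frame time.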
---
Drugs are never the answer, unless the question is what isn't the answer.
#24 tiger8191 | Posted 7/21/2014 7:56:15 AM
I always love topics where people claim that the FX-6300 can't play games. Most, if not all, of them have never used a PC with that CPU. I've had it for well over a year, and there isn't a game in my 170+ library that doesn't run on my FX-6300/7850. My issues with any game are due to my GPU, but I still run most games at high or high/ultra, and a few on ultra, at 60 fps. What games exactly can I not play due to my weak and useless CPU?

But buying a 6300/270X and expecting to play every game at 60 fps, fully maxed out at 1080p, is wishful thinking.
---
www.twitch.tv/fearsomebeaver
#25 ThisGuy101 (Topic Creator) | Posted 7/21/2014 9:14:24 AM
tiger8191 posted...
I always love topics where people claim that the FX-6300 can't play games. Most, if not all, of them have never used a PC with that CPU. I've had it for well over a year, and there isn't a game in my 170+ library that doesn't run on my FX-6300/7850. My issues with any game are due to my GPU, but I still run most games at high or high/ultra, and a few on ultra, at 60 fps. What games exactly can I not play due to my weak and useless CPU?

But buying a 6300/270X and expecting to play every game at 60 fps, fully maxed out at 1080p, is wishful thinking.


I didn't expect it to max out everything, but when it couldn't max out a two-year-old game (four years old if you count the original 360 release), I got to thinking about the games coming in a year or so.

I want to futureproof now.
---
AMD FX-6300 | Gigabyte GA-990FXA-UD3 ATX AM3+ | Kingston HyperX 8GB DDR3-1600 | MSI Radeon R9 270X 2GB TWIN FROZR
My first gaming rig
#26 Jinchuse | Posted 7/21/2014 9:27:07 AM
ThisGuy101 posted...
tiger8191 posted...
I always love topics where people claim that the FX-6300 can't play games. Most, if not all, of them have never used a PC with that CPU. I've had it for well over a year, and there isn't a game in my 170+ library that doesn't run on my FX-6300/7850. My issues with any game are due to my GPU, but I still run most games at high or high/ultra, and a few on ultra, at 60 fps. What games exactly can I not play due to my weak and useless CPU?

But buying a 6300/270X and expecting to play every game at 60 fps, fully maxed out at 1080p, is wishful thinking.


I didn't expect it to max out everything, but when it couldn't max out a two-year-old game (four years old if you count the original 360 release), I got to thinking about the games coming in a year or so.

I want to futureproof now.


"Futureproofing" is just marketing bull that manufacturers try to sell you. There's no such thing when it comes to PC hardware. You might as well buy the best GPU/CPU that makes sense for your budget and needs today.
#27 KURRUPTOR | Posted 7/21/2014 10:18:15 AM
ThisGuy101 posted...
tiger8191 posted...
I always love topics where people claim that the FX-6300 can't play games. Most, if not all, of them have never used a PC with that CPU. I've had it for well over a year, and there isn't a game in my 170+ library that doesn't run on my FX-6300/7850. My issues with any game are due to my GPU, but I still run most games at high or high/ultra, and a few on ultra, at 60 fps. What games exactly can I not play due to my weak and useless CPU?

But buying a 6300/270X and expecting to play every game at 60 fps, fully maxed out at 1080p, is wishful thinking.


I didn't expect it to max out everything, but when it couldn't max out a two-year-old game (four years old if you count the original 360 release), I got to thinking about the games coming in a year or so.

I want to futureproof now.


You should watch the video that was linked earlier; it addresses exactly the mindset you're in right now.

You can spend a grand or more every single year when a new CPU and GPU come out. You really have to ask yourself whether that brand-new shiny game on ultra is going to be worth that much, or if you can have just as much fun playing with the settings only on high.
---
Drugs are never the answer, unless the question is what isn't the answer.
#28 PraetorXyn | Posted 7/21/2014 10:43:11 AM
KURRUPTOR posted...
DarkZV2Beta posted...
8350 isn't any better off, as multithreaded performance isn't the issue.


The 8350 and even the 6300 are going to be fine for the vast majority of games out there. Plus, more and more games are actually using more cores now, which is bumping AMD benchmarks up without AMD even doing anything. In reality AMD CPUs are the stronger processors, but many programs just haven't been coded to use them to their utmost. Like I said, though, more and more programs (including games) are finally being written to use as many cores as your CPU has to offer.

People look at stupid graphs where they play a game at 1024x768 with all the graphics options set to minimum, and Intel will get 200 fps while AMD gets 180 fps. Both are gross overkill, and the difference doesn't matter. People play games at 1080p or higher these days and generally turn some graphics options on; in real-world gaming your GPU is going to be your frame limiter in 95% of games.

TC: I would just stick with your CPU and get a better GPU; it will give you a huge performance increase in almost every single game you play. If you upgrade your CPU you won't get any sort of gaming increase in anything but strategy games. Then again, if you are a strategy-game junkie, then there is a legit reason to spend (a lot) more by going Intel. Normally, though, it's just a straight-up waste of your money.


You really need to look into the Bulldozer/Piledriver architecture and learn the difference between a core and an integer processing unit.
---
Console war in a nutshell:
http://imgur.com/xA6GJZ9.png
#29 DarkZV2Beta | Posted 7/21/2014 10:54:39 AM
ThisGuy101 posted...
Freedan12 posted...
ThisGuy101 posted...

AMD FX-6300 | Gigabyte GA-990FXA-UD3 ATX AM3+ | Kingston HyperX 8GB DDR3-1600 | MSI Radeon R9 270X 2GB TWIN FROZR.

Don't fall into the pc money trap.

If the games you want to play will run at settings acceptable to you on your current rig, ask yourself why you need an upgrade.

Watch this youtube vid before making up your mind.
www.youtube.com/watch?v=8TiRg5X7OM0


Because I'd like to play games at high or ultra settings at 60 FPS. That barely works for a two-year-old game like Alan Wake, so how's it gonna work for a next-gen game? I don't have any games that came out this gen yet; my entire library is summer-sale games and indies like Amnesia and Euro Truck Simulator, which this rig also can't max out at 60.

That's why I want to improve.


Alan Wake is a pretty demanding game, actually. Lots of very nice lighting effects. They really did a lot for the PC version.
Actually, a lot of PC games are designed not to run at 1080p/60 fps on the best single-GPU hardware out there, just so they can continue to be used as benchmarks for years down the line. In particular, a bunch of AMD Gaming Evolved games use supersampling, the oldest, slowest, and nicest-looking form of AA, which you normally have to force via a driver hack because it's so demanding. It's basically like running at 2-4x your current resolution.

Basically, the difference between "high" and "ultra/max" is usually some extra demanding fluff that makes very little visual impact but is there for super-high-end hardware to flex its legs a bit. You shouldn't worry too much about "all games at ultra settings and 60 fps"; that's an eternal dream you'll only ever achieve for a moment at best.
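The supersampling cost is easy to quantify: shading work scales with the number of samples per output pixel. A quick sketch, where `factor` is samples per pixel (the function name and numbers are illustrative):

```python
def ssaa_samples(width, height, factor):
    """Samples shaded per frame with factor-x supersampling (1 = native)."""
    return width * height * factor

native = ssaa_samples(1920, 1080, 1)   # 2,073,600 samples at native 1080p
ssaa4x = ssaa_samples(1920, 1080, 4)   # 8,294,400 samples, the same pixel count as 4K
print(ssaa4x / native)                 # 4.0x the shading work
```

So forcing 4x supersampling at 1080p costs roughly what rendering at 3840x2160 would, which is why it buries even high-end single GPUs.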

KURRUPTOR posted...
DarkZV2Beta posted...
8350 isn't any better off, as multithreaded performance isn't the issue.


The 8350 and even the 6300 are going to be fine for the vast majority of games out there. Plus, more and more games are actually using more cores now, which is bumping AMD benchmarks up without AMD even doing anything. In reality AMD CPUs are the stronger processors, but many programs just haven't been coded to use them to their utmost. Like I said, though, more and more programs (including games) are finally being written to use as many cores as your CPU has to offer.

People look at stupid graphs where they play a game at 1024x768 with all the graphics options set to minimum, and Intel will get 200 fps while AMD gets 180 fps. Both are gross overkill, and the difference doesn't matter. People play games at 1080p or higher these days and generally turn some graphics options on; in real-world gaming your GPU is going to be your frame limiter in 95% of games.

TC: I would just stick with your CPU and get a better GPU; it will give you a huge performance increase in almost every single game you play. If you upgrade your CPU you won't get any sort of gaming increase in anything but strategy games. Then again, if you are a strategy-game junkie, then there is a legit reason to spend (a lot) more by going Intel. Normally, though, it's just a straight-up waste of your money.


When speaking of the ever-important lows that define your experience, on settings people actually use, Intel wins by a wide margin. Anyone considering any kind of competitive game or genre shouldn't be looking at AMD for even a moment.
---
god invented extension cords. -elchris79
Starcraft 2 has no depth or challenge -GoreGross
#30 KURRUPTOR | Posted 7/21/2014 11:13:06 AM
PraetorXyn posted...
KURRUPTOR posted...
DarkZV2Beta posted...
8350 isn't any better off, as multithreaded performance isn't the issue.


The 8350 and even the 6300 are going to be fine for the vast majority of games out there. Plus, more and more games are actually using more cores now, which is bumping AMD benchmarks up without AMD even doing anything. In reality AMD CPUs are the stronger processors, but many programs just haven't been coded to use them to their utmost. Like I said, though, more and more programs (including games) are finally being written to use as many cores as your CPU has to offer.

People look at stupid graphs where they play a game at 1024x768 with all the graphics options set to minimum, and Intel will get 200 fps while AMD gets 180 fps. Both are gross overkill, and the difference doesn't matter. People play games at 1080p or higher these days and generally turn some graphics options on; in real-world gaming your GPU is going to be your frame limiter in 95% of games.

TC: I would just stick with your CPU and get a better GPU; it will give you a huge performance increase in almost every single game you play. If you upgrade your CPU you won't get any sort of gaming increase in anything but strategy games. Then again, if you are a strategy-game junkie, then there is a legit reason to spend (a lot) more by going Intel. Normally, though, it's just a straight-up waste of your money.


You really need to look into the Bulldozer/Piledriver architecture and learn the difference between a core and an integer processing unit.


I know the difference, but they act as an extra core for tasks just the same as an i7's Hyper-Threading does. Also, I don't see how that's relevant to anything I was talking about.
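The core-vs-module argument here can be made concrete with a toy scaling model: a Piledriver module gives a second thread its own integer core (sharing only the front end and FPU), while a Hyper-Threading thread only gets the leftover slack of one core. The discount factors below are rough assumptions for illustration, not measured numbers.

```python
def relative_throughput(threads, extra_thread_scaling):
    """First thread counts as 1.0; each extra thread is discounted."""
    return 1.0 + (threads - 1) * extra_thread_scaling

# Assumption: a second integer-heavy thread on a Piledriver module keeps most
# of a full core's throughput, since it has its own integer pipelines.
amd_module = relative_throughput(2, 0.8)

# Assumption: a second Hyper-Threading thread mostly fills pipeline bubbles.
intel_ht = relative_throughput(2, 0.25)

print(amd_module, intel_ht)
```

Under these assumptions the module scales better with a second thread, which is the sense in which it "acts as an extra core"; whether that outweighs Intel's stronger single-thread performance depends on the workload.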
---
Drugs are never the answer, unless the question is what isn't the answer.