
i5 3570k vs 8350?

#21 GreenMage7 Posted 8/17/2013 8:17:36 PM
PhilOnDez posted...
Well, depending on how long you're planning on keeping it and how much you pay for power, the 8350 can actually end up more expensive than a 3570k (not sure about the 4670k; I haven't seen the comparison numbers on that one). At stock clocks the 3570k pays for itself in three years, faster if you overclock or just get a 3570. That doesn't sound like an unreasonable amount of time to keep a processor.


This is true if you only consider the price of the processor. However, Intel motherboards with equivalent features tend to be more expensive than their AMD counterparts and negate any energy savings. It depends on what you are looking for in your motherboard. If you are not overclocking and aren't looking for any specific motherboard features other than it being functional, then they tend to be close to the same price.
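As a sanity check on that payback claim, here is a rough back-of-the-envelope sketch in Python. Every number in it is an assumption chosen for illustration (the stock TDPs of 125 W for the FX-8350 and 77 W for the i5-3570K, a hypothetical $30 price gap, $0.12/kWh electricity, four hours a day at full load), not a figure taken from this thread.

    # Back-of-the-envelope payback estimate for the power-cost argument
    # above. Every number here is an illustrative assumption, not a
    # figure from this thread.
    TDP_8350_W = 125        # FX-8350 stock TDP (watts)
    TDP_3570K_W = 77        # i5-3570K stock TDP (watts)
    PRICE_GAP_USD = 30      # assumed price premium for the i5-3570K
    KWH_PRICE_USD = 0.12    # assumed electricity rate ($/kWh)
    HOURS_PER_DAY = 4       # assumed hours per day at full load

    # Extra energy the FX-8350 draws per year at full load, in kWh.
    extra_kwh_per_year = (TDP_8350_W - TDP_3570K_W) * HOURS_PER_DAY * 365 / 1000

    # Yearly cost of that extra draw, and years until it covers the price gap.
    extra_cost_per_year = extra_kwh_per_year * KWH_PRICE_USD
    payback_years = PRICE_GAP_USD / extra_cost_per_year

    print(f"Extra energy: {extra_kwh_per_year:.0f} kWh/year")
    print(f"Extra cost:   ${extra_cost_per_year:.2f}/year")
    print(f"Payback:      {payback_years:.1f} years")

With those assumptions the extra draw works out to about 70 kWh and $8.40 a year, so the gap closes in roughly three and a half years, which is in the same ballpark as the three-year figure quoted above.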
#22 DarkZV2Beta Posted 8/17/2013 8:28:22 PM
daemon_dan posted...
DarkZV2Beta posted...
GreenMage7 posted...
daemon_dan posted...
Boge posted...
Some argue the 8 cores from the 8350 will make it the choice CPU in the long run since the new consoles have 8 cores.

I feel that by the time core counts beyond 4 actually come into play, you'll be looking to buy a new CPU anyway.


Except four of those cores aren't even real cores.


Actually, they are real cores. After some research: Windows only registers it as a four-core processor because pairs of cores sit on the same module and share an L2 cache. It actually does have 8 physical cores, though.


Assuming you consider a "core" to be an integer cluster.


This. A real core HAS an L2 cache. It's an integer cluster.


And dedicated floating point hardware.
And dedicated instruction decode/execution hardware.
etc.
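
For anyone who wants to see how this shows up in software, here is a minimal sketch using the third-party psutil library, which asks the OS for logical versus physical core counts. What it actually reports for an FX-8350 depends on the OS and library version, so treat it as an illustration of the reporting question being argued here, not a verdict on the core debate.

    import psutil  # third-party: pip install psutil

    # Logical CPUs: hardware threads as the scheduler sees them.
    logical = psutil.cpu_count(logical=True)

    # "Physical" CPUs: cores as the OS chooses to count them. On a
    # Bulldozer chip this hinges on whether each module is treated as
    # one core with two threads or as two full cores.
    physical = psutil.cpu_count(logical=False)

    print("Logical CPUs: ", logical)
    print("Physical CPUs:", physical)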
---
Want that Shield!
Ball and Cup on ps mobile has framerate issues. -stargazer64
#23 Voxwik Posted 8/17/2013 8:46:14 PM (edited)
Arguing over whether to call the modules something different, or two cores that share resources, is beside the point. What matters is real-world performance, where AMD is unfortunately behind.

It's not like you can't game on AMD, as some make it sound, but sadly there's just no contest on the question the topic creator asked.

What I'm really interested in seeing on the AMD front is how advanced its integrated graphics will get over the next few years. I think it's an exciting prospect if entry-level gaming minimum requirements transition to APUs, which could lower the price of PC gaming even further than it has already fallen over the last few years. Those who want to spend more could of course pump out better graphics with discrete cards, but a bigger audience means better support for PC gamers, which is a good thing in my opinion. It helps that all three newer consoles use APU-like designs.

Because of its purchase of ATI way back when, AMD may have a distinct advantage on that front, even if its CPUs lag far behind, which could potentially make it more attractive for budget gamers once again.
#24 GreenMage7 Posted 8/17/2013 8:49:36 PM
DarkZV2Beta posted...
And dedicated floating point hardware.
And dedicated instruction decode/execution hardware.
etc.


Well, isn't this getting more and more off-topic? Cores sharing an L2 cache is how it works in the consoles too; same basic idea, different execution. Either way, it means console games will likely be programmed to take advantage of six threads, which would be a small boon but not a reason to buy one over the other, as was stated about 10 posts ago...
#25 GreenMage7 Posted 8/17/2013 8:53:14 PM
Voxwik posted...
Arguing over whether to call the modules something different, or two cores that share resources, is beside the point. What matters is real-world performance, where AMD is unfortunately behind.

It's not like you can't game on AMD, as some make it sound, but sadly there's just no contest on the question the topic creator asked.

What I'm really interested in seeing on the AMD front is how advanced its integrated graphics will get over the next few years. I think it's an exciting prospect if entry-level gaming minimum requirements transition to APUs, which could lower the price of PC gaming even further than it has already fallen over the last few years. Those who want to spend more could of course pump out better graphics with discrete cards, but a bigger audience means better support for PC gamers, which is a good thing in my opinion. It helps that all three newer consoles use APU-like designs.

Because of its purchase of ATI way back when, AMD may have a distinct advantage on that front, even if its CPUs lag far behind, which could potentially make it more attractive for budget gamers once again.


Well, we're pretty much at the point where minimum specs are at APU level. Tomb Raider and Crysis 3 on low both run fine on APUs; that's the level of optimization we've reached. It is super exciting, and AMD buying ATI and starting down this path pushed Intel to take integrated graphics seriously, which it had never done before. It was good for everyone.
#26 DarkZV2Beta Posted 8/17/2013 10:08:42 PM
GreenMage7 posted...
DarkZV2Beta posted...
And dedicated floating point hardware.
And dedicated instruction decode/execution hardware.
etc.


Well, isn't this getting more and more off-topic? Cores sharing an L2 cache is how it works in the consoles too; same basic idea, different execution. Either way, it means console games will likely be programmed to take advantage of six threads, which would be a small boon but not a reason to buy one over the other, as was stated about 10 posts ago...


"Cores" sharing L2 cache goes all the way back to Intel's first multicore CPUs. People complained about them not being real multicore CPUs because of that, actually.
BD shares a lot more than L2 between "cores".
https://upload.wikimedia.org/wikipedia/commons/e/e9/AMD_Bulldozer_block_diagram_%28CPU_core_bloack%29.PNG
---
Want that Shield!
Ball and Cup on ps mobile has framerate issues. -stargazer64
#27 GreenMage7 Posted 8/17/2013 10:23:38 PM
DarkZV2Beta posted...
"Cores" sharing L2 cache goes all the way back to Intel's first multicore CPUs. People complained about them not being real multicore CPUs because of that, actually.
BD shares a lot more than L2 between "cores".
https://upload.wikimedia.org/wikipedia/commons/e/e9/AMD_Bulldozer_block_diagram_%28CPU_core_bloack%29.PNG


But exactly what point are you trying to make as it relates to this thread?

This sidetrack started here:

daemon_dan posted...
Boge posted...
Some argue the 8 cores from the 8350 will make it the choice CPU in the long run since the new consoles have 8 cores.

I feel that by the time core counts beyond 4 actually come into play, you'll be looking to buy a new CPU anyway.


Except four of those cores aren't even real cores.


I'm seriously lost as to what you are trying to say. How does what you are saying relate to the discussion? Maybe then I'll understand.
#28 Voxwik Posted 8/17/2013 10:27:28 PM
GreenMage7 posted...
Voxwik posted...
Arguing over whether to call the modules something different, or two cores that share resources, is beside the point. What matters is real-world performance, where AMD is unfortunately behind.

It's not like you can't game on AMD, as some make it sound, but sadly there's just no contest on the question the topic creator asked.

What I'm really interested in seeing on the AMD front is how advanced its integrated graphics will get over the next few years. I think it's an exciting prospect if entry-level gaming minimum requirements transition to APUs, which could lower the price of PC gaming even further than it has already fallen over the last few years. Those who want to spend more could of course pump out better graphics with discrete cards, but a bigger audience means better support for PC gamers, which is a good thing in my opinion. It helps that all three newer consoles use APU-like designs.

Because of its purchase of ATI way back when, AMD may have a distinct advantage on that front, even if its CPUs lag far behind, which could potentially make it more attractive for budget gamers once again.


Well, we're pretty much at the point where minimum specs are at APU level. Tomb Raider and Crysis 3 on low both run fine on APUs; that's the level of optimization we've reached. It is super exciting, and AMD buying ATI and starting down this path pushed Intel to take integrated graphics seriously, which it had never done before. It was good for everyone.

Wow. I had no idea it was already at that point.
#29 Reaper_Minion Posted 8/18/2013 12:13:14 AM
Since you are clearly buying a new motherboard as well, get an i5-4670(k) if you decide on the i5. It currently costs just about the same.
---
/|\ http://i.imgur.com/hK9c1lo.jpg?1
\|/ http://www.youtube.com/watch?v=vE7x0WLz7SY
#30 DarkZV2Beta Posted 8/18/2013 1:21:22 AM
GreenMage7 posted...
DarkZV2Beta posted...
"Cores" sharing L2 cache goes all the way back to Intel's first multicore CPUs. People complained about them not being real multicore CPUs because of that, actually.
BD shares a lot more than L2 between "cores".
https://upload.wikimedia.org/wikipedia/commons/e/e9/AMD_Bulldozer_block_diagram_%28CPU_core_bloack%29.PNG


But exactly what point are you trying to make as it relates to this thread?

This sidetrack started here:

daemon_dan posted...
Boge posted...
Some argue the 8 cores from the 8350 will make it the choice CPU in the long run since the new consoles have 8 cores.

I feel that by the time core counts beyond 4 actually come into play, you'll be looking to buy a new CPU anyway.


Except four of those cores aren't even real cores.


I'm seriously lost as to what you are trying to say. How does what you are saying relate to the discussion? Maybe then I'll understand.


This topic is clearly for discussing the 3570k and the 8350: in particular, their relative performance and the rationale for choosing one over the other.
How is the core structure of the 8350, compared to the more traditional core structure of the 3570k, not relevant?
---
Want that Shield!
Ball and Cup on ps mobile has framerate issues. -stargazer64