
i5 3570K vs AMD 8350: Help Needed

31 March 2013 02:22:05

I don't know whether to get an i5 or an FX 8350. I know that the i5 is better for gaming, but by how much? Also, will the FX 8350 be better in the long run because games will perhaps use 6 cores?

31 March 2013 02:29:43

It'll be better by a fair amount for gaming.

Also, to answer your second question, NO. The AMD 8-cores aren't actually 8 fully independent cores; they use a design (kind of like hyperthreading on the i7s, if you don't get technical) that groups them into four "modules." Each module has two integer cores but only one shared FPU, and floating-point calculations are what games rely on, which means in practice it's just a slower i5.
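To picture that, here's a toy model, purely illustrative: "one usable FPU per two-core module" is a simplification of the real hardware (which shares a 256-bit FPU between the two cores of a module), and the counts are assumptions, not benchmarks.

```python
# Toy model of the module layout described above: counting how many
# floating-point units can work in parallel. Illustrative only, not a
# benchmark; assumes one usable FPU per two-core module.

def effective_fp_units(chip):
    """Number of FP units available for parallel floating-point work."""
    if chip["shared_fpu"]:
        # FX-style module: two integer cores share one FPU.
        return chip["cores"] // 2
    return chip["cores"]

fx_8350 = {"cores": 8, "shared_fpu": True}
i5_3570k = {"cores": 4, "shared_fpu": False}

print(effective_fp_units(fx_8350))   # 4 -- "8 cores" but only 4 FP units
print(effective_fp_units(i5_3570k))  # 4 -- one FPU per core
```

On integer-heavy workloads all 8 FX cores can contribute, which is why the gap shows up mainly in floating-point-heavy code like games.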
31 March 2013 02:32:58

DarkSable said:
It'll be better by a fair amount for gaming.

Also, to answer your second question, NO. The AMD 8-cores aren't actually 8 fully independent cores; they use a design (kind of like hyperthreading on the i7s, if you don't get technical) that groups them into four "modules." Each module has two integer cores but only one shared FPU, and floating-point calculations are what games rely on, which means in practice it's just a slower i5.


+1

AMD cores are more like Intel's hyperthreading, and AMD modules are more like Intel's cores.

I would go with the i5. It's the best for gaming in most (if not nearly all) scenarios.
31 March 2013 02:40:17

Crysis 3 does surprisingly well on the FX, due to its strong (and, among games, rare) multicore support.

It's also worth noting that future games ported from PS4 will have native 8-core support, as the PS4 will be using an eight core APU.

However, the i5 is sufficient for at least 2-3 years.

The question is, are you willing to pay a higher price for the i5? What GPU do you plan on using? An 8350 is sufficient for a GTX 660, Radeon 7870, or other mid-range GPUs. The i5 3570K is better paired with higher-end GPUs due to its higher compute performance.
31 March 2013 03:40:12

A Bad Day said:
It's also worth noting that future games ported from PS4 will have native 8-core support, as the PS4 will be using an eight core APU.

The question is, are you willing to pay a higher price for the i5?


1) There is absolutely no guarantee that the ports we get will have native 8-core support - most ports from the consoles NOW only support two cores at best.

2) The i5-3570k is a shocking whole TWENTY DOLLARS more expensive! Who can afford that much on top of their $200 processor!?

31 March 2013 04:40:42

In the long run it all comes down to what you want. I have three rigs, an FX-8120, an FX-8350, and an i5 3570K, and I can tell you that in every single game I have (over 200 in Steam and another 50+ in Origin) both of my FX rigs keep up in every way with my i5 3570K, all on max/high settings.

I built this i5 rig because of all the posts about how much better Intel CPUs are compared to AMD, and after all of the building and testing I find that is completely untrue. While my i5 rig is nice, it is in no way faster in any game than my FX rigs, period. I did, however, spend more money building the Intel rig than I did building either FX rig.
31 March 2013 04:46:44

bryonhowley said:
In the long run it all comes down to what you want. I have three rigs, an FX-8120, an FX-8350, and an i5 3570K, and I can tell you that in every single game I have (over 200 in Steam and another 50+ in Origin) both of my FX rigs keep up in every way with my i5 3570K, all on max/high settings.

I built this i5 rig because of all the posts about how much better Intel CPUs are compared to AMD, and after all of the building and testing I find that is completely untrue. While my i5 rig is nice, it is in no way faster in any game than my FX rigs, period. I did, however, spend more money building the Intel rig than I did building either FX rig.


The only time you're going to see a real difference is playing at low resolutions, where the GPU can outpace the CPU. So generally speaking this is correct: the difference at 1080p+ will be minor and likely not noticeable.
31 March 2013 05:01:54

bryonhowley said:
In the long run it all comes down to what you want. I have three rigs, an FX-8120, an FX-8350, and an i5 3570K, and I can tell you that in every single game I have (over 200 in Steam and another 50+ in Origin) both of my FX rigs keep up in every way with my i5 3570K, all on max/high settings.

I built this i5 rig because of all the posts about how much better Intel CPUs are compared to AMD, and after all of the building and testing I find that is completely untrue. While my i5 rig is nice, it is in no way faster in any game than my FX rigs, period. I did, however, spend more money building the Intel rig than I did building either FX rig.


I also have an i7 rig with a 3930K that I take back and forth from work every six months or so when I need to finish a project (both have the same GPU and RAM setups). My i7 benches much higher with a similar overclock (due to its stronger cores), but in games the FX performs nearly as well in just about every title.

To be positioned for future game ports (all new consoles will be AMD), I would pick up an 8350. It competes with the 3570K neck and neck, and it is going to do well thanks to its advantage in future games optimized for more than 4 cores (due to the multi-core APUs going into the new PS4 and XBOX 720).



31 March 2013 17:27:49

If the only use is gaming, then go for the i5! For multitasking the FX 8350 is probably better! In reality no game needs 6 cores; 4 cores are enough!
Also don't forget that on the 8350 each core is only half a core!
31 March 2013 17:42:40

Gennaios said:
If the only use is gaming, then go for the i5! For multitasking the FX 8350 is probably better! In reality no game needs 6 cores; 4 cores are enough!
Also don't forget that on the 8350 each core is only half a core!


Nope, each "module" is half: there are 8 cores and 8 threads, 4 physical and 4 virtual. Plenty of articles out there on CPU architecture; go read one to brush up your knowledge. Also, many games now use more than 4 cores and are optimized as such. Crysis 3 is a perfect example, as is the newer Tomb Raider; both of those games use all 8 of my cores when running. I've tested and re-tested myself.

31 March 2013 17:45:36

Games are going toward using more cores. Here are some reasons why:

(1) Current-gen consoles have at most 3 cores on the CPUs they use, so current console ports are not designed to run on as many cores as PC CPUs have. However, the PS4 and XBOX720 are both going to have 8-core CPUs from AMD.

(2) The last few games to come out, like Crysis 3, Far Cry 3, Metro 2033, and Battlefield 3, all use a lot of cores now, and we are seeing the playing field level out. Anyone on here advising you to buy Intel cannot dispute that the difference in frame rate between the two CPUs you selected is less than 5 FPS in any of the games mentioned above at max settings. More games are going to be this way. GTA5 is coming in September, and it's supposed to be 8-core aware when it hits the ground; do you want to play it on a 4-core machine?

(3) If your monitor is a 60 Hz monitor, then your maximum displayed frame rate is only 60 FPS anyway, so the difference between 110.4 FPS and 111.7 FPS is entirely irrelevant. You could spend the extra $90-100 to build an Intel machine and never notice the difference in FPS, because your monitor cannot display such frame rates.

(4) Even if your monitor could display frame rates up to 120 FPS (a 120 Hz monitor), the average human eye is only sensitive to about 30 FPS; beyond that you cannot tell. Some people are a little more sensitive, but in reality, for a lot of them it is all in their head. The PS3 and XBOX 360 run at ~30-35 FPS, to give you a valid real-world comparison; if those frame rates are good enough for you, then anything over 30-35 FPS is fine.

(5) Software development always runs about 18-24 months behind hardware, because it takes about two years to develop a game around current hardware. The Bulldozer/Piledriver module architecture has been out for about 18-24 months now, and look at what we have: a boatload of games coming that are designed around heavily threaded code and 4+ core CPUs. Coincidence? Couldn't possibly be!

(6) The only advantage the i5-3570K has over the FX-8350 is in SOME games, and frankly, if you do anything other than game, the FX-8350 is the better solution; there are benchmarks out there to prove it too. You should also take any benchmarks with a grain of salt: openbenchmarking.org actually checks the compilers used in its benchmarks to make sure they don't include Intel's "check for AMD" dispatch, which Intel used to slow down AMD CPUs by routing them through less optimized code. If you think I am lying to any of you, google it and see for yourself; AMD sued and Intel settled.
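The arithmetic behind point (3) is easy to check with a quick sketch; the only assumption here is that vsync caps the displayed rate at the monitor's refresh.

```python
# Frame-time math behind the refresh-rate argument: small FPS differences
# translate to fractions of a millisecond per frame, and a fixed-refresh
# monitor caps what is actually displayed anyway (assuming vsync).

def frame_time_ms(fps):
    """Time each frame is on screen, in milliseconds."""
    return 1000.0 / fps

def displayed_fps(rendered_fps, refresh_hz):
    """With vsync, the panel can't show more frames than its refresh rate."""
    return min(rendered_fps, refresh_hz)

# The 110.4 vs 111.7 FPS example from point (3):
print(round(frame_time_ms(110.4) - frame_time_ms(111.7), 3))  # 0.105 (ms per frame)
print(displayed_fps(110.4, 60))  # 60
print(displayed_fps(111.7, 60))  # 60 -- identical on a 60 Hz panel
```

So the per-frame gap between those two rates is about a tenth of a millisecond, and on a 60 Hz panel with vsync neither figure is displayed at all.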
31 March 2013 17:57:11

Rare are the cases where somebody buys a 3930K just for gaming! If a real octa-core cost $200, I would have seen that in the Guinness World Records!
31 March 2013 17:58:48

Gennaios said:
Rare are the cases where somebody buys a 3930K just for gaming! If a real octa-core cost $200, I would have seen that in the Guinness World Records!


I did just that and love my 3930K :)
31 March 2013 18:00:50

Hahaa, it's lovable, that's true!
31 March 2013 18:48:20

8350rocks said:
(4) Even if your monitor could display frame rates up to 120 FPS (a 120 Hz monitor), the average human eye is only sensitive to about 30 FPS; beyond that you cannot tell. Some people are a little more sensitive, but in reality, for a lot of them it is all in their head. The PS3 and XBOX 360 run at ~30-35 FPS, to give you a valid real-world comparison; if those frame rates are good enough for you, then anything over 30-35 FPS is fine.


Not going to get dragged into a flame war by responding to the rest of your post, but this right here is an absolute myth.
Almost everyone (and by that I'm simply excluding people with severe sight problems) can tell the difference between 120fps and 60fps. That BS myth about the human eye "only being able to see 30 frames a second" came about because of old 24fps movies. Yes, those look fluid to us, but that's because of motion blur, which games don't have or need. 24fps is just about the bare MINIMUM of what the human eye will see as fluid motion, as opposed to a series of frames; it can see and interpret much, much greater rates than that.

If you want a simple, visual proof that you should stop spreading this myth, take five minutes and check out this site: http://boallen.com/fps-compare.html

EDIT: Another, more in-depth article: http://amo.net/NT/02-21-01FPS.html
31 March 2013 19:08:50

DarkSable said:
8350rocks said:
(4) Even if your monitor could display frame rates up to 120 FPS (a 120 Hz monitor), the average human eye is only sensitive to about 30 FPS; beyond that you cannot tell. Some people are a little more sensitive, but in reality, for a lot of them it is all in their head. The PS3 and XBOX 360 run at ~30-35 FPS, to give you a valid real-world comparison; if those frame rates are good enough for you, then anything over 30-35 FPS is fine.


Not going to get dragged into a flame war by responding to the rest of your post, but this right here is an absolute myth.
Almost everyone (and by that I'm simply excluding people with severe sight problems) can tell the difference between 120fps and 60fps. That BS myth about the human eye "only being able to see 30 frames a second" came about because of old 24fps movies. Yes, those look fluid to us, but that's because of motion blur, which games don't have or need. 24fps is just about the bare MINIMUM of what the human eye will see as fluid motion, as opposed to a series of frames; it can see and interpret much, much greater rates than that.

If you want a simple, visual proof that you should stop spreading this myth, take five minutes and check out this site: http://boallen.com/fps-compare.html

EDIT: Another, more in-depth article: http://amo.net/NT/02-21-01FPS.html


Some games DO have motion blur, and it helps nothing. You can see a hell of a lot of frames, but 30 is fluid, 60 is great, 120 is overkill, and over 9000 is just too much for a very minor difference.
a b 4 Gaming
a b À AMD
a c 104 à CPUs
31 March 2013 19:18:28

tadej petric said:
Some games DO have motion blur, and it helps nothing. You can see a hell of a lot of frames, but 30 is fluid, 60 is great, 120 is overkill, and over 9000 is just too much for a very minor difference.

Sorry, might not have been clear - that's exactly what I meant.

Though I disagree with you that 120fps is overkill: I have a 120Hz monitor, and I have to tell you, 60fps looks jerky and laggy now; 120 is bloody amazing.

31 March 2013 19:24:56

DarkSable said:
tadej petric said:
Some games DO have motion blur, and it helps nothing. You can see a hell of a lot of frames, but 30 is fluid, 60 is great, 120 is overkill, and over 9000 is just too much for a very minor difference.

Sorry, might not have been clear - that's exactly what I meant.

Though I disagree with you that 120fps is overkill: I have a 120Hz monitor, and I have to tell you, 60fps looks jerky and laggy now; 120 is bloody amazing.



Depends on the person, really. I played 10 years on my old computer, so I'm used to 20 frames. I'd still like more, but it feels fluid enough. But 120 looks nice, I agree. It's like having a Ferrari and a Clio: both can drive (making a fluid picture), but the Ferrari (120Hz) is just better. You don't get much out of it, though.
1 April 2013 00:48:32

Just keep in mind, all console games are set to a 30fps cap.
1 April 2013 20:26:30

chrisafp07 said:
Just keep in mind, all console games are set to a 30fps cap.


+1