Is Haswell worth upgrading from Ivy Bridge?

  • Thread starter: ElliotSmith
  • Tags: Bridge

Discussion Overview

The discussion centers on the potential benefits of upgrading from an Ivy Bridge to a Haswell or Haswell-E processor, particularly in the context of gaming performance and overall value for money. Participants explore various aspects of the two microarchitectures, including speed differences, gaming performance, and future upgrade considerations.

Discussion Character

  • Debate/contested
  • Technical explanation
  • Exploratory

Main Points Raised

  • Some participants suggest that the performance increase from Ivy Bridge to Haswell is minimal, estimating around 5% clock-for-clock, and may not be noticeable outside of benchmarks.
  • Others argue that for gaming, the performance is largely GPU-bound, and an overclocked 3770K may perform similarly to a Haswell-E.
  • Concerns are raised about whether a 3770K would bottleneck high-end GPU setups, particularly in multi-GPU configurations.
  • Some participants mention that dual SLI GTX 980 setups may be overkill for 1080p gaming, while others believe that having more power is beneficial for maxing out settings.
  • There is a discussion about the value of upgrading to higher resolution monitors and the impact of pixel density on image quality.
  • Some participants note that while Haswell laptops show increased power savings, this may not directly relate to gaming performance comparisons.
  • Several participants highlight that certain games are CPU-bound, suggesting that the CPU upgrade may be relevant for specific titles.
  • There is mention of the evolution of Intel's microarchitecture, with some participants expressing a preference to wait for future releases like Skylake.

Areas of Agreement / Disagreement

Participants express a range of views on the upgrade's value, with no clear consensus on whether the upgrade is worthwhile. Disagreements exist regarding the impact of CPU performance on gaming and the necessity of high-end GPUs for current resolutions.

Contextual Notes

Participants reference various benchmarks and personal experiences, but there are unresolved assumptions regarding the specific performance metrics and conditions under which the CPUs are evaluated. The discussion also touches on the evolving nature of gaming demands and hardware capabilities.

ElliotSmith
Is it worth the money upgrading from an Ivy Bridge to a Haswell or Haswell-E?

Clock for clock, about how much faster is the new microarchitecture?
 
I found several reviews that compare the two architectures. Here's one: http://www.pcmag.com/article2/0,2817,2421019,00.asp

There's a summary on Wikipedia: http://en.wikipedia.org/wiki/Haswell_(microarchitecture)

You should get into the habit of doing some basic research first before asking questions like this one.
 
I've played with each generation of Intel's consumer line of processors since the P4 570J. IMO, it's not worth the upgrade yet. True, there's a slight increase in speed clock-for-clock, about 5% or so, but no difference you'll notice outside of benchmarks. The jump from an i7 920 to a Haswell-E isn't even spectacular. The only real noticeable difference would be from, say, a Q6600 to a Haswell-E.

Broadwell is just Intel's die-shrink tick in their tick-tock evolution. I'd wait until their tock with Skylake.
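As a rough sanity check on that figure, here's a back-of-the-envelope sketch (the 1.05 IPC factor comes from the ~5% estimate above; the clock speeds are illustrative assumptions, not benchmark results):

```python
# Rough effective-throughput model: performance ~ IPC x clock.
# The 1.05 factor reflects the ~5% clock-for-clock gain estimated above;
# the clock speeds are illustrative, not measured results.

def relative_perf(ipc_factor, clock_ghz):
    """Unitless performance score: IPC factor times clock speed."""
    return ipc_factor * clock_ghz

ivy_3770k_oc = relative_perf(1.00, 4.5)    # overclocked i7-3770K
haswell_stock = relative_perf(1.05, 4.0)   # Haswell (e.g. i7-4790K) near stock clocks

gain = (haswell_stock / ivy_3770k_oc - 1) * 100
print(f"Haswell vs. overclocked Ivy Bridge: {gain:+.1f}%")   # roughly -7%: the OC'd chip keeps up
```

Framed that way, an overclocked Ivy Bridge chip roughly keeps pace with a near-stock Haswell, which is why the difference rarely shows up outside benchmarks.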
 
How about for gaming? Would there be any noticeable improvement going from an IB to a Haswell-E?

Would a 3770K bottleneck enthusiast GPU setups, like, say, 2-way SLI GTX 980? Z77 motherboards only run each card at PCIe 3.0 x8 in SLI, and I was wondering if this would cause an issue with multiple flagship GPUs of the latest generation.
 
No real difference with gaming. The cache size is also the same, which doesn't really matter for games anyway. Gaming performance nowadays is entirely GPU-bound. A slightly overclocked 3770K will perform on par with a Haswell-E. The only case where you'd worry about a 3.0 x8 slot hindering a flagship card is if you were getting close to 4K resolutions; 3.0 x8 is good enough to handle current high-end cards.
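On the PCIe side of the question, the raw link arithmetic is straightforward (this is only theoretical bus bandwidth, not what any particular game actually pushes over it):

```python
# PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, so the usable
# rate is 8 * (128/130) Gbit/s per lane, or about 0.985 GB/s per lane.

def pcie3_bandwidth_gb_s(lanes):
    """Approximate usable PCIe 3.0 bandwidth in GB/s for a given lane count."""
    per_lane_gbit = 8 * (128 / 130)    # Gbit/s after encoding overhead
    return lanes * per_lane_gbit / 8   # convert Gbit/s to GB/s

print(f"x8 : {pcie3_bandwidth_gb_s(8):.2f} GB/s")    # ~7.88 GB/s per card
print(f"x16: {pcie3_bandwidth_gb_s(16):.2f} GB/s")   # ~15.75 GB/s
```

So each card in an x8/x8 setup still has nearly 8 GB/s to work with, which leaves plenty of headroom below 4K.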
 
I was told that dual SLI GTX 980 is overkill for 1080p gaming no matter what game you're playing. Would you agree with that?

But you can never have too much power when maxing out AA and AF, even at 1080p.

TXAA really pulls a lot of juice from your GPUs as well.
 
Definitely overkill at just 1080. If you want to max out the FSAA and AF, you'd save almost $400 by just going with two GTX 970s in SLI, which would do just as well at 1080. The bigger video cards don't really shine until you start cranking the resolution. For example, at really low resolutions like 1024x768, a mid-range card will perform practically the same as a flagship card.
 
I might eventually upgrade to a 1440p monitor, but not now.

4K monitors cost upwards of $1,500, and I don't have the money for that at the moment. I hope they become more affordable.
 
You can actually get a 28" 4K for under $600 now. The reason I'm still waiting is that the technology hasn't yet progressed to a 4K monitor at 120Hz. Right now, everything on the market is 60Hz. Quite a few companies are borderline false advertising by saying their monitors are 120Hz when in fact they are not; they're two 60Hz screens projected onto one, with the claim that "60Hz x 2 = 120Hz! Yes, 120Hz!" No, far from it.
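A quick uncompressed-bandwidth estimate shows why 4K at 120Hz wasn't on the market yet (this ignores blanking overhead, so treat the numbers as a ballpark):

```python
# Uncompressed video signal ~= width * height * refresh rate * bits per pixel.
# Blanking intervals are ignored, so real requirements are somewhat higher.

def video_gbit_s(width, height, refresh_hz, bpp=24):
    return width * height * refresh_hz * bpp / 1e9

print(f"4K @ 60Hz : {video_gbit_s(3840, 2160, 60):.1f} Gbit/s")    # ~11.9 Gbit/s
print(f"4K @ 120Hz: {video_gbit_s(3840, 2160, 120):.1f} Gbit/s")   # ~23.9 Gbit/s

# DisplayPort 1.2 tops out around 17.3 Gbit/s of effective data rate,
# so 60Hz fits comfortably while 120Hz does not.
```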
 
  • #10
$600 is still quite a bit of cash to spend on a monitor. That money would be much better used to upgrade your graphics card.

My 32" ViewSonic 1080p has several unsightly dead pixels, and next year I'll probably upgrade to a 32" 1440p if I can find one at a reasonable price.
 
  • #11
For something high-quality that you're going to be staring at the entire time you use your computer over the next five or six years, $600 is more than reasonable. I've spent over $1,000 on a monitor before because I needed the desktop size, and that was actually on the lower end for monitors in that class.

Sure, you can pump more money into a set of flagship cards, but you'll never tap into their potential just running at 1080. The larger you go in screen size at a given resolution, the lower the pixel density gets, with viewing distance being a prime factor.

Example: take a 1080 output, blow it up to 130", then stand four feet away from it. Horrible quality. It's not as noticeable with 27-32" monitors, but if you're buying expensive video cards to play games at high quality, it should obviously be a priority.

Take your savings from buying two 970s instead of two 980s, add on $200, and there's your 4K monitor.
 
  • #12
Another example would be comparing a 27" monitor @ 1080 to a 32" monitor @ 1080. The 27" monitor has a higher pixel density and therefore produces a sharper image than the 32".
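Pixel density is easy to put numbers on: PPI is the diagonal resolution in pixels divided by the diagonal size in inches. A quick sketch using the sizes mentioned above:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 1080p: {ppi(1920, 1080, 27):.0f} PPI')   # ~82 PPI
print(f'32" 1080p: {ppi(1920, 1080, 32):.0f} PPI')   # ~69 PPI
print(f'28" 4K   : {ppi(3840, 2160, 28):.0f} PPI')   # ~157 PPI
```

The same 1080p image spread over a 32" panel drops from about 82 to 69 PPI, while a 28" 4K panel roughly doubles the density of either.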
 
  • #13
B. Elliott said:
I'd wait until their tock with Skylake.
AFAIK, the tock is Broadwell...
 
  • #16
14nm is really stretching the limits of Moore's law.
 
  • #17
B. Elliott said:
No real difference with gaming ... Gaming performance nowadays is entirely GPU-bound.

There are a lot of massively popular games that are CPU-bound in the general case. I can think of Minecraft and Guild Wars 2 right off the top of my head. There are many others. I'd need to see benchmarks to comment on the efficacy of a CPU switch.

Personally, I've noticed increased power savings on Haswell laptops, but that's hardly relevant to the OP's question about gaming.
 
  • #18
SimCity 5 is a CPU-heavy game.

So are StarCraft 2, Diablo 3, etc.
 
  • #19
ElliotSmith said:
I was told that dual SLI GTX 980 is overkill for 1080p gaming no matter what game you're playing. Would you agree with that?

But you can never have too much power when maxing out AA and AF, even at 1080p.

TXAA really pulls a lot of juice from your GPUs as well.

Nothing is "overkill". The "overkill" hardware of today is a useless piece of junk tomorrow. And with Dynamic Super Resolution, even a GTX 980 can't do the job. Not by a mile and a half (unless it's a very old game, and for that it's perfect). Enable triple buffering and your PC commits harakiri. ALWAYS go for the best bang for the buck. That is, unless you have extra money you don't know what to do with...

I thought my rig was overkill with 2 x GTX 970, an i7-4790K @ 4.4GHz, and 16GB of RAM. I was very much wrong. There's no way I can max out every game out there. And enabling DSR? Give me a break...

Maxing out AF takes almost no juice, though.
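On the DSR point, the workload math is simple: the factor applies to the pixel count, so 4x DSR on a 1080p panel means shading a full 4K frame before downsampling (the 4x maximum and the 2.25x step are the usual NVIDIA DSR factors, stated here as an assumption):

```python
# DSR renders internally at a multiple of the native pixel count,
# then downsamples to the panel; the per-axis scale is sqrt(factor).

def dsr_render_res(native_w, native_h, factor):
    """Internal render resolution for a given DSR pixel-count factor."""
    scale = factor ** 0.5
    return round(native_w * scale), round(native_h * scale)

print(dsr_render_res(1920, 1080, 4.0))    # (3840, 2160): a full 4K shading load on a 1080p panel
print(dsr_render_res(1920, 1080, 2.25))   # (2880, 1620)
```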
 
