
Is Haswell worth upgrading from Ivy Bridge?

  1. Dec 4, 2014 #1
    Is it worth the money upgrading from an Ivy Bridge to a Haswell or Haswell-E?

    Clock for clock, about how much faster is the new microarchitecture?
     
  3. Dec 4, 2014 #2

    Mark44

    Staff: Mentor

    I found several reviews that compare the two architectures. Here's one: http://www.pcmag.com/article2/0,2817,2421019,00.asp

    There's a summary on Wikipedia: http://en.wikipedia.org/wiki/Haswell_(microarchitecture)

    You should get into the habit of doing some basic research before asking questions like this one.
     
  4. Dec 9, 2014 #3
    I've played with each generation of Intel's consumer line of processors since the P4 570J. IMO, it's not worth the upgrade yet. True, there's a slight clock-for-clock increase in speed, about 5% or so, but nothing you'll notice outside of benchmarks. Even the jump from an i7 920 to a Haswell-E isn't spectacular. The only really noticeable difference would be from, say, a Q6600 to a Haswell-E.

    Broadwell is just Intel's die-shrink tick in their tick-tock evolution. I'd wait for the tock with Skylake.
     
  5. Dec 10, 2014 #4
    How about for gaming? Would there be any noticeable improvement going from an IB to a Haswell-E?

    Would a 3770K bottleneck enthusiast GPU setups, say 2-way SLI GTX 980s? Z77 motherboards drop each card to PCIe 3.0 x8 when two cards are installed, and I was wondering if this would cause an issue with multiple flagship GPUs of the latest generation?
     
  6. Dec 11, 2014 #5
    No real difference in gaming. Cache size doesn't really matter for games anyway. Gaming performance nowadays is mostly GPU-bound, so a slightly overclocked 3770K will perform on par with a Haswell-E. The only scenario where a PCIe 3.0 x8 slot might start to hinder a flagship card is when you get close to 4K resolutions; 3.0 x8 has enough bandwidth for current high-end cards.
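    For a rough sense of the bandwidth involved, here's a back-of-the-envelope sketch (assuming PCIe 3.0's 8 GT/s per lane with 128b/130b encoding; the numbers are theoretical peak, per direction, ignoring protocol overhead):

        # Approximate one-direction PCIe 3.0 bandwidth (assumed: 8 GT/s per lane, 128b/130b encoding)
        TRANSFER_RATE = 8e9        # transfers per second per lane
        ENCODING = 128 / 130       # usable fraction after 128b/130b line coding

        def pcie3_bandwidth_gbs(lanes):
            """Rough PCIe 3.0 bandwidth in GB/s for a given lane count."""
            return lanes * TRANSFER_RATE * ENCODING / 8 / 1e9  # bits -> bytes -> GB/s

        print(pcie3_bandwidth_gbs(8))   # ~7.9 GB/s for an x8 slot
        print(pcie3_bandwidth_gbs(16))  # ~15.8 GB/s for an x16 slot

    A single flagship card rarely comes close to saturating that x8 figure except near 4K, which is why the x8/x8 split on Z77 isn't a practical problem at 1080p.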
     
  7. Dec 11, 2014 #6
    I was told that dual SLI GTX 980s are overkill for 1080p gaming no matter what game you're playing. Would you agree with that?

    But you can never have too much power when maxing out AA and AF, even at 1080p.

    TXAA really pulls a lot of juice from your GPUs as well.
     
  8. Dec 11, 2014 #7
    Definitely overkill at just 1080p. If you want to max out the FSAA and AF, you'd save almost $400 by going with two GTX 970s in SLI, which would do just as well at 1080p. The bigger video cards don't really shine until you start cranking the resolution; at very low resolutions like 1024x768, a mid-range card performs about the same as a flagship card because the CPU becomes the bottleneck.
     
  9. Dec 11, 2014 #8
    I might eventually upgrade to a 1440p monitor sometime in the future, but not now.

    The 4K monitors cost upwards of $1,500 and I don't have the money for that at the moment. I hope they become more affordable.
     
  10. Dec 12, 2014 #9
    You can actually get a 28" 4K for under $600 now. The reason I'm still waiting is that the technology hasn't progressed to the point of a 4K monitor at 120Hz; right now, everything on the market is 60Hz. Quite a few companies are borderline false advertising when they call their monitors 120Hz: they're really two 60Hz panels driven as one, and the claim goes "60Hz x 2 = 120Hz! Yes, 120Hz!" No, far from it.
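    Just for context, here's the frame-time difference a true 120Hz panel would give you over 60Hz (a trivial sketch):

        def frame_time_ms(refresh_hz):
            """Time between full-screen refreshes, in milliseconds."""
            return 1000.0 / refresh_hz

        print(round(frame_time_ms(60), 2))   # 16.67 ms between refreshes at 60Hz
        print(round(frame_time_ms(120), 2))  # 8.33 ms between refreshes at a true 120Hz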
     
  11. Dec 12, 2014 #10
    $600 is still quite a bit of cash to spend on a monitor. This money would be much better used to upgrade your graphics card.

    My 32" viewsonic 1080p has several unsightly dead pixels and next year I'll probably upgrade to a 1440p 32" if I can find one at a reasonable price.
     
  12. Dec 12, 2014 #11
    For something high quality that you're going to be staring at the entire time you're using your computer over the next five or six years, $600 is more than reasonable. I've spent over $1,000 on a monitor before because I needed the desktop size, and $1,000 was actually on the lower end for monitors in that class.

    Sure, you can pump more money into a set of flagship cards, but you'll never tap into their potential just running at 1080p. The larger the screen at a given resolution, the lower the pixel density, with viewing distance being a prime factor.

    Example: take a 1080p output, blow it up to 130", then stand four feet away from it. Horrible quality. It's not as noticeable with 27-32" monitors, but if you're buying expensive video cards to play games at high quality, the display should obviously be a priority too.

    Take your savings from buying two 970s instead of two 980s, add on $200, and there's your 4k monitor.
     
  13. Dec 12, 2014 #12
    Another example would be comparing a 27" monitor at 1080p to a 32" monitor at 1080p. The 27" monitor has a tighter pixel density and therefore produces a sharper image than the 32".
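    To put numbers on it: pixel density is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch (the 28" 4K figure mentioned earlier in the thread is included for comparison):

        import math

        def ppi(width_px, height_px, diagonal_in):
            """Pixels per inch: diagonal resolution in pixels divided by diagonal size in inches."""
            return math.hypot(width_px, height_px) / diagonal_in

        print(round(ppi(1920, 1080, 27), 1))  # ~81.6 PPI for a 27" 1080p panel
        print(round(ppi(1920, 1080, 32), 1))  # ~68.8 PPI for a 32" 1080p panel
        print(round(ppi(3840, 2160, 28), 1))  # ~157.4 PPI for a 28" 4K panel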
     
  14. Jan 18, 2015 #13

    Svein

    Science Advisor

    AFAIK, the tock is Broadwell...
     
  15. Jan 19, 2015 #14
  16. Jan 19, 2015 #15

    Svein

    Science Advisor

  17. Jan 19, 2015 #16
    14nm is really stretching the limits of Moore's law.
     
  18. Jan 25, 2015 #17
    There are a lot of massively popular games that are CPU-bound in the general case; Minecraft and Guild Wars 2 come to mind right off the top of my head, and there are many others. I'd need to see benchmarks to comment on the efficacy of a CPU switch.

    Personally, I've noticed improved power savings on Haswell laptops, but that's hardly relevant to the OP's question about gaming.
     
  19. Jan 26, 2015 #18
    SimCity 5 is a CPU-heavy game.

    So are StarCraft 2, Diablo 3, etc.
     
  20. Jan 29, 2015 #19
    Nothing is "overkill". The "overkill" hardware of today is a useless piece of junk tomorrow. And with Dynamic Super Resolution, even a GTX 980 can't do the job, not by a mile and a half (unless it's a very old game, in which case it's perfect). Enable triple buffering and your PC commits harakiri. ALWAYS go for the best bang for the buck, unless you have extra money you don't know what to do with...

    I thought my rig was overkill with 2 x GTX 970, an i7 4790K @ 4.4 GHz, and 16 GB of RAM. I was very much wrong. There's no way I can max out every game out there. And enabling DSR? Give me a break...

    Maxing out AF, on the other hand, takes hardly any juice.
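    As a rough illustration of why DSR hits so hard, compare the number of pixels the GPU has to shade per frame (assuming a 1080p native resolution and the common 2x/4x DSR factors for illustration):

        # Pixels rendered per frame at native 1080p vs. typical DSR factors (assumed for illustration)
        BASE_W, BASE_H = 1920, 1080

        def dsr_pixels(factor):
            """Total pixels rendered per frame; the DSR factor scales the pixel count."""
            return int(BASE_W * BASE_H * factor)

        for factor in (1, 2, 4):
            print(factor, dsr_pixels(factor))
        # 1 -> 2073600  (native 1080p)
        # 2 -> 4147200  (a bit more shading work than 1440p)
        # 4 -> 8294400  (renders 3840x2160 internally, then downsamples to 1080p)

    Four times the pixels per frame is why even a pair of 970s struggles once DSR is enabled.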
     