Gigabyte P34W v3 Gaming Laptop Review

Gigabyte's P34W v3 is one of only two 14-inch gaming notebooks with GeForce GTX 970M graphics. Can portability and performance really coexist?

Introduction

Gaming laptops are inherently exercises in compromise. You can emphasize the gaming aspect by cramming lots of fast hardware into a big, heavy chassis. Or you can focus on mobility, cutting power and cooling to make the machine more portable. Everything in between represents a rebalancing act to attract customers with different priorities.

When Gigabyte’s P34W v3 first landed in our lab, we thought there was no way it’d game competently. Measuring just under 13.5” wide, less than 9.5” deep and roughly three-quarters of an inch thick, it easily fits in a backpack. Without its bundled power adapter, the P34W weighs just four pounds. By all accounts, the platform is thin and light—two adjectives rarely associated with gaming.

And yet, our sample includes a Haswell-based Core i7-4720HQ host processor, 16GB of DDR3-1600 in a dual-channel configuration, Nvidia’s GeForce GTX 970M with 3GB of GDDR5, a 128GB SSD and a 1TB hard drive. Knowing what we know about quad-core i7s and the GM204 GPU, it’d be reasonable to expect playable frame rates in demanding AAA titles at the 14” panel’s native 1920x1080 resolution.

Surely, then, the P34W v3 must cost a mint, right? Not really, no. You can customize the notebook on xoticpc.com to look like ours for just under $1700. Or, Amazon has a similar model with 8GB of RAM for less than $1600. That's quite a bit less expensive than Razer's 14" Blade.

Gigabyte does lose points when it comes to what you hear from the P34W v3—both through the sound you get through its anemic stereo speakers and the fan noise that escapes as a byproduct of its powerful hardware. Even with no load applied, expect the cooling subsystem to spin up to noticeable volume sporadically. More on that later, though.

There are a few different part numbers listed for this notebook. One, the P34Wv3-CF2, includes 8GB of DDR3, a 128GB SSD, a 1TB hard drive and Windows 8.1. Another, the P34Wv3-CF3, boasts 16GB of DDR3, a 256GB SSD, the same terabyte-class disk and Windows 7 pre-loaded with a Windows 8.1 upgrade. Those are both on Newegg. Other vendors market this system as the P34W v3 and then allow you to customize it to suit. The box containing our sample didn’t list any specifications or model numbers, so you’ll want to pay particular attention when it comes to comparing parts and price tags.

Technical Specifications

MORE: All Notebook Content
MORE: Visit Our Sister Site LAPTOP

On The Outside

The P34W v3 exudes subtle elegance. Nothing about Gigabyte’s design screams “look at me; I’m packed with gaming hardware.” Rather, the top cover is smooth, black and matte, interrupted only by Gigabyte’s logo in glossy silver and a small strip of dissimilar material at the top. The worst thing we can say is that the surface material is almost impossible to keep clean. Fingerprints show up readily. Then again, if you purchase the P34W v3 through XoticPC, you can buy a textured wrap or custom paint job to rectify this.

Underneath, two grilles are cut into the front of the chassis for audio. Notice that they face down. Gigabyte apparently didn't have a lot of room on the chassis to optimize the P34W v3's audio, so you end up with muddled output that varies quite a bit depending on the surface underneath.

Everything else you see is for ventilation, so you won’t want to block all of those intakes with thick carpet or a bedspread. Five rubberized feet and four plastic tabs elevate the enclosure when it’s on a table. The bottom cover comes off after you remove 15 screws. Or, if you only need access to the platform’s two memory slots, a single screw releases an easy-access panel.

The right side plays host to a power input, two USB 3.0 ports, an HDMI output and an SD card reader.

The left includes a Kensington lock, wired Ethernet connectivity, VGA output, two more USB 3.0 ports and a single jack for audio I/O. Space on the two sides is obviously limited, but it would have been nice to see mini-DisplayPort instead of VGA. More gamers are going to want to attach modern secondary displays than old projectors, we imagine.

There is no connectivity up front. Rather, you’ll find five pin-hole LED indicators corresponding to Bluetooth, Wi-Fi, hard drive, battery and power status. Additionally, if you right-click on the touchpad when the P34W is off, those LEDs light up according to the battery’s remaining charge.

The back of Gigabyte’s design features two exhaust vents. A pair of fans just behind those vents blow across two heat pipes, which in turn cover the Core i7 processor and GM204 GPU. Understandably, the air leaving that area gets especially warm.

With its lid open, the P34W v3 retains a clean, simple aesthetic. A pair of hinges hold the top panel securely. Our sample did pop a little when the screen reached the limit of its travel. Otherwise, though, it held position well. Up above the display, you’ll find two microphones, a webcam status indicator, the webcam itself and an ambient light sensor for the panel and keyboard backlights.

You have to push hard on the palm rest to make it flex. Then again, you’ll probably want to use your own USB-attached keyboard and mouse for gaming.

When you don’t have external peripherals handy, you’ll find the backlit keyboard easy to type on. Its keys are spaced nicely, offer just the right amount of travel and click satisfyingly into place. The W, A, S and D keys are bolded for emphasis, a nod to this machine’s gaming purpose. However, the directional keys are asymmetrical—left and right are large, while up and down are tiny. It’s difficult to be precise with them in-game.

It’s hard to imagine using a touchpad for much beyond basic Windows navigation, and the P34W v3’s touchpad works as expected for this purpose. Use the far left and far right edges of the click keys to get the best response. Otherwise, you’ll find yourself mashing on them with little travel in return.

On The Inside

Naturally, cramming high-end hardware into a compact gaming notebook requires efficient use of space, and Gigabyte appears to have capitalized on every square inch. The Core i7-4720HQ and GM204 sit right next to each other, covered by heat spreaders and two flattened pipes. Those pipes run from one side of the P34W to the other, ending in arrays of cooling fins. Fans blow across them and out the back.

Intel’s -4720HQ is a quad-core, Hyper-Threaded processor with a 2.6GHz base clock and 3.6GHz peak Turbo Boost frequency. It sports 6MB of shared L3 cache, AES-NI support and a dual-channel memory controller capable of running at data rates as high as 1600 MT/s. While the company’s HD Graphics 4600 engine serves up unimpressive 3D performance, it adds QuickSync and Wireless Display support, both of which can come in useful on the road.

When it comes time to game, the GM204-based GeForce GTX 970M kicks into gear. The GPU has 1280 of its CUDA cores enabled, along with 80 texture units and 48 ROPs. An aggregate 192-bit memory bus attaches to 3GB of GDDR5 memory. There’s a 6GB version of the 970M as well; however, the 3GB implementation is ample for any combination of resolution and detail settings you’d ask of the cut-back GM204.
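That 192-bit bus matters more than the capacity figure. As a rough sketch of the math (the 5 GT/s effective GDDR5 data rate here is an assumption based on typical GTX 970M configurations, not a figure Gigabyte publishes for this machine):

```python
# Theoretical memory bandwidth = (bus width in bytes) x (effective transfer rate).
# The 5.0 GT/s GDDR5 rate below is assumed from typical GTX 970M specs.
def memory_bandwidth_gbs(bus_width_bits: int, effective_rate_gts: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and data rate in GT/s."""
    return (bus_width_bits / 8) * effective_rate_gts

print(memory_bandwidth_gbs(192, 5.0))  # GTX 970M: 24 bytes/transfer x 5 GT/s = 120.0 GB/s
print(memory_bandwidth_gbs(256, 7.0))  # desktop GTX 970, for comparison: 224.0 GB/s
```

Roughly half the desktop card’s bandwidth, in other words, which is in line with the cut-back shader and ROP counts.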

There are only two SO-DIMM slots on the P34W v3, so pick your modules carefully. Our sample came with two 8GB sticks of DDR3L-1600 from Transcend, totaling 16GB. Bear in mind that although Intel’s memory controller does support 32GB, Gigabyte’s platform tops out at 16.

The P34W v3 doesn’t give you tons of room for storage upgrades, though the one mSATA slot and a single 2.5” bay are more than enough for most gamers. A 128GB Lite-On SSD in our notebook is powered by a Marvell controller and rated for 512 MB/s sequential reads and 320 MB/s sequential writes. Gigabyte also supports 256 and 512GB mSATA drives. Given the relatively small SSD we’re testing, though, a 1TB HGST disk comes in particularly useful for game installations and user data. Really, the SSD/HDD combination is perfectly balanced for such a small high-performance gaming machine.

Gigabyte implements two networking interfaces: a PCIe-based GbE controller from Realtek and Intel’s dual-band Wireless-AC 7260 2x2 mini-PCIe card. In addition to supporting 802.11ac at up to 867 Mb/s on the 5GHz band (enabling Wireless Display in the process), the little module also includes Bluetooth 4.0.
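If the 867 Mb/s figure looks arbitrary, it isn’t: under 802.11ac, each spatial stream tops out around 433.3 Mb/s with an 80MHz channel, 256-QAM modulation and a short guard interval. A quick illustration:

```python
# 802.11ac PHY rate: each spatial stream contributes ~433.3 Mb/s at
# 80MHz / 256-QAM 5/6 / short guard interval; a 2x2 card doubles that.
def ac_phy_rate_mbps(spatial_streams: int, per_stream_mbps: float = 433.3) -> float:
    """Aggregate 802.11ac PHY rate for a given stream count."""
    return spatial_streams * per_stream_mbps

print(ac_phy_rate_mbps(2))  # Wireless-AC 7260 (2x2): ~867 Mb/s
```

Real-world TCP throughput lands well below that PHY rate, as our IxChariot numbers later show.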

Much of the P34W v3’s internal space is dedicated to a 61.25Wh Li-ion battery. Remarkably, even though this system is small and filled with fast components, it exceeds two hours of run time in two of our battery life tests and approaches three hours in a third. Don’t plan on getting much from it if you’re gaming on the road. But when you need to get some work done away from the wall, the P34W v3 won’t leave you stranded.
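Those runtimes imply a fairly steep average power draw for an unplugged workload, which you can sanity-check from the battery’s rated capacity:

```python
# Rough average system power draw implied by a battery life result:
# capacity (Wh) divided by runtime (hours). Ignores conversion losses.
def average_draw_watts(capacity_wh: float, runtime_hours: float) -> float:
    return capacity_wh / runtime_hours

print(round(average_draw_watts(61.25, 2.5), 1))  # ~2.5-hour Home/Creative runs: ~24.5 W
print(round(average_draw_watts(61.25, 3.0), 1))  # ~3-hour Work run: ~20.4 W
```

That 20-25W figure is typical of a quad-core platform under mixed productivity load; a gaming workload pushes the draw far higher, which is why unplugged game time is so short.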

Two oblong drivers sit in front of the battery, right up against the notebook’s edge. These are truly diminutive. Although Gigabyte advertises Dolby Digital Plus Home Theater audio technology, there is no getting around the limitations of 1.5W speakers facing downward. If you want to drown out the cooling fans, attach your own headset to the P34W v3’s 3.5mm jack.

Rounding out the otherwise exceptional hardware package is an AU Optronics B140HAN Advanced Hyper-Viewing Angle (AHVA) panel with a hard anti-glare surface. Its viewing angles are unbeatable, as we’d expect from IPS technology, and you’ll find the screen easy to use indoors or out.

How We Tested

A gaming-oriented laptop should be evaluated principally on its gaming performance. To that end, we pull out Battlefield 4, Middle-earth: Shadow of Mordor, Thief, Metro Last Light, Tomb Raider and Grid 2.

All benchmarks were run plugged into the wall using our custom Windows 8 power profile, with the screen calibrated to 200 cd/m², before the same tests were repeated on battery power.

Synthetics

3DMark
FireStrike: Graphics, Physics, Combined Modules
PCMark 8
Home, Creative, Work Modules (Accelerated), Storage
SiSoftware Sandra
Memory Bandwidth Module

Gaming

Battlefield 4
DirectX 11, 100-sec. Fraps "Tashgar", Ultra Quality Preset, 1920x1080
Grid 2
DirectX 11, 120-sec. Fraps Built-In Benchmark, Ultra Quality Preset, 1920x1080
Middle-earth: Shadow of Mordor
DirectX 11, 40-sec. Fraps Built-In Benchmark, Ultra Quality Preset, 1920x1080
Thief
DirectX 11, 70-sec. Fraps Built-In Benchmark, Very High Quality Preset, 1920x1080
Tomb Raider
DirectX 11, 45-sec Fraps Custom Tom's Hardware Benchmark, Ultra Quality Preset, 1920x1080
Metro Last Light
DirectX 11, 145-sec Fraps Built-In Benchmark, Very High Quality Preset, 1920x1080

Productivity

7-Zip
Version 9.30 alpha (64-bit): THG-Workload (1.3GB) to .7z, command line switches "a -t7z -r -m0=LZMA2 -mx=5"
Adobe Photoshop CC
Custom OpenCL-based Workload
Autodesk 3ds Max 2013
iray Workload
HandBrake CLI
Version 0.9.9: Video from Canon EOS 7D (1920x1080, 25 FPS), 1 minute 22 seconds; Audio: PCM-S16, 48,000Hz, 2-Channel; to Video: AVC1, Audio: AAC (High Profile)
LAME MP3
Version 3.98.3: Audio CD "Terminator II SE", 53 min, convert WAV to MP3 audio format, Command: -b 160 --nores (160Kb/s)
TotalCode Studio 2.5
Version 2.5.0.10677: MPEG-2 to H.264, MainConcept H.264/AVC Codec, 28 sec HDTV 1920x1080 (MPEG-2), Audio: MPEG-2 (44.1kHz, 2-Channel, 16-Bit, 224Kb/s), Codec: H.264 Pro, Mode: PAL 50i (25 FPS), Profile: H.264 BD HDMV

Battery Life

PCMark 8
Home, Creative, Work Modules

Wireless Networking

Ixia IxChariot
Version 7.2 Build Level 107, TCP Throughput
Passmark WirelessMon
Version 4.0 Build 1009, Signal Strength

Synthetic Benchmark Results

Plugged into the wall, Gigabyte’s P34W v3 generates a 3DMark FireStrike score of 6603, which falls to 4960 on battery power. For context, consider that the GeForce GTX 970M offers roughly 80% of a desktop GTX 970’s resources, and that when Thomas reviewed Gigabyte’s P35X v3 earlier this year, its GeForce GTX 980M achieved 8261 points. A Physics result of 8786 actually exceeds the P35X’s 8620 score, which we’d expect, since that system used a slightly slower Core i7-4710HQ.
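Expressed as a ratio, Battery Boost leaves roughly three-quarters of the machine's plugged-in graphics performance on the table for unplugged use:

```python
# How much of the plugged-in score survives on battery power.
def battery_retention_pct(on_battery: float, plugged_in: float) -> float:
    return 100.0 * on_battery / plugged_in

print(round(battery_retention_pct(4960, 6603), 1))  # FireStrike: ~75.1% retained
```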

Thomas ran his PCMark 8 numbers without acceleration, choosing to focus on raw CPU performance. I turned acceleration on, though. When you’re on the road and battery life is what limits your laptop’s utility, anything you can do to get back to idle quickly will stretch the system’s run time. This includes leveraging compute resources whenever possible.

To that end, Gigabyte’s P34W v3 scores 4645 points plugged into the wall in PCMark’s Work module. The P35X v3 did 4787 points. From there, the P34W v3 blows past its bigger brother without looking back, quadrupling its performance in the Home test and more than tripling it in the Creative metric.

Storage performance naturally varies depending on whether you’re using the SSD or hard drive. Fortunately, it doesn’t really matter if you’re plugged into the wall or on battery—the results are similar in both cases.

Gaming Benchmark Results

We run our game tests plugged in and away from the wall, yielding two very different looks at performance as Nvidia’s Battery Boost technology kicks in.

Ideally, we want to see the P34W v3 hold a steady 30 FPS on battery power (or whatever frame rate you specify through GeForce Experience). Then, when we attach the AC adapter, the notebook should generate playable performance at its panel’s native resolution and the highest-quality detail settings.

Gigabyte does not disappoint. Plugged in to the wall, we were able to crank each game up to 1920x1080 and its top graphics preset and enjoy smooth frame rates. Battlefield 4 averaged more than 53 FPS; Metro Last Light averaged 50; Tomb Raider nearly hit 80 FPS; Thief was up around 50 as well. The more platform-bound Grid 2 soared up to approach 90 FPS on average.

Even on battery power (and reduced clock rates) the P34W v3 is fast enough to maintain a smooth 30 FPS in these titles without giving up any graphics quality. For the system’s 14” FHD panel, a GeForce GTX 970M is just about right. Faster graphics modules (like the 980M) would really only be useful for attaching external displays. Then again, without a DisplayPort output, the P34W v3 wouldn’t be our top choice for docking to a more stationary desktop.

Productivity Benchmark Results

Intel’s Core i7-4720HQ is a popular processor choice amongst gaming notebook vendors. It’s a quad-core model with Hyper-Threading support, so parallelized workloads run well. And of course, the Haswell architecture makes quick work of single-threaded tasks.

These results illustrate how performance differs when you use the P34W v3 plugged in compared to its behavior on the road. Light use isn’t a problem—our LAME benchmark doesn’t slow down at all. But the threaded TotalCode Studio, HandBrake, and 3ds Max iray tests are far more taxing; they scale back quite a bit on battery power.

Our Photoshop benchmark, which is accelerated by OpenCL, also reflects a performance hit away from the wall. Nvidia’s GeForce GTX 970M throttles back, so we’d expect this.

Display Measurements

Before we get into testing battery life, we need to calibrate the P34W v3’s display. In the process of doing this, we also run its panel through a barrage of benchmarks.

Minimum Brightness
9.03 cd/m²
Maximum Brightness
289.23 cd/m²
Brightness Calibration
201.13 cd/m² (69 on brightness scale)
Black Level
0.34 cd/m²
Gamma
2.18
Contrast Ratio
582:1
Color Temperature
6460K
sRGB Gamut
101.7%
Adobe RGB Gamut
70.1%

A minimum brightness measurement of 9.03 cd/m² is well below our practical floor of 50. AU Optronics rates its B140HAN for a maximum brightness of 300 cd/m², and the panel on Gigabyte’s P34W v3 comes awfully close to that figure at 289.23.

By default, an ambient light sensor adjusts the P34W’s panel and keyboard backlights. This needs to be disabled in order to calibrate the display for our testing. Different publications favor different output levels, but all of our screens are dialed in to 200 cd/m².

At that setting, we measure a contrast ratio of 582:1, which is lower than the display’s typical 700:1 rating. A gamma response of 2.18 comes close to the 2.2 we want to see. We also record a cool color temperature of 6460K and an sRGB gamut volume of 101.7%.
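For reference, static contrast is simply white luminance divided by black luminance. Plugging in our calibrated white point and measured black level lands close to the reported figure; the small gap reflects rounding in the luminance measurements:

```python
# Static contrast ratio: white luminance over black luminance, both in cd/m².
def contrast_ratio(white_cdm2: float, black_cdm2: float) -> float:
    return white_cdm2 / black_cdm2

print(round(contrast_ratio(201.13, 0.34)))  # ~592:1 from our measured values
```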

Subjectively, the P34W v3’s display is excellent, though not particularly vibrant owing to its anti-glare layer. Uniformity isn’t an issue; there is no perceptible light bleed to the naked eye. And the panel’s viewing angles are superb. You don’t experience any color shift. Even if Gigabyte wasn’t using an IPS-equivalent panel, the relatively small 14” screen keeps you head-on most of the time. But the fact that you get AUO’s AHVA technology means all practical viewing angles are usable.

Battery Life Results

We run three different workloads to test battery life, all of them from PCMark 8.

The Home suite tests Web browsing, writing, casual gaming, photo editing and video chat. Futuremark considers the Home test to be computationally light. PCMark 8’s Creative benchmark, on the other hand, is more demanding. It includes Web browsing, photo editing, batch editing, video editing, creating media to go, mainstream gaming and video chatting. Meanwhile, the Work suite includes office-oriented tasks that don’t hammer the platform particularly hard.

Despite the variety in what each metric includes, Gigabyte’s P34W v3 posts fairly even battery life results. The Home and Creative runs are each good for roughly 2.5 hours, while the lighter-duty Work suite comes closer to three hours.

Of course, you could stretch battery life by dialing down screen brightness below 200 cd/m² or specifying a more conservative power profile. We simply use these settings for consistency between comparison machines.

Wireless Networking Performance

Our wireless networking benchmarking consists of IxChariot's TCP throughput test, with an ASRock Vision X 471D acting as a hardwired server and the P34W v3 connecting as the client through its on-board adapter. The reference router is Asus' RT-AC66U.

As you can see, you're better off sticking with the 5GHz band for performance, so long as you can stay within its more limited range (the 2.4GHz signal stays stronger over greater distances).

Thermal Performance

It’s a good thing that Intel’s Core i7-4720HQ and Nvidia’s GeForce GTX 970M have performance to spare, because Gigabyte’s cooling solution can’t quite keep up with the P34W v3’s hardware under load.

We logged the graphics processor’s clock rate as Unigine’s Valley benchmark looped in the background, and watched it drop from a GPU Boost frequency of 1038MHz to 848MHz over the course of a few minutes. That’s lower than the 970M’s specified 924MHz base clock rate, which shouldn't happen. It’s the same sort of issue that got AMD in trouble when the Radeon R9 290X first launched. In short, Nvidia’s GM204 quickly hits its thermal ceiling and then pulls clock rates back to avoid exceeding it.
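The check itself is simple to reproduce: sample the GPU core clock periodically (nvidia-smi or GPU-Z logging both work) and flag any sustained run below the advertised base clock. A minimal sketch of that logic, using a hypothetical clock trace rather than our actual log:

```python
# Flag sustained thermal throttling: the clock sitting below the base
# frequency for several consecutive samples, not just a momentary dip.
def sustained_throttle(samples_mhz, base_clock_mhz, min_run=3):
    """Return True if the clock stays below base for min_run consecutive samples."""
    run = 0
    for clk in samples_mhz:
        run = run + 1 if clk < base_clock_mhz else 0
        if run >= min_run:
            return True
    return False

log = [1038, 1038, 993, 924, 885, 848, 848, 848]   # MHz, hypothetical trace
print(sustained_throttle(log, base_clock_mhz=924))  # True: settles below 924MHz
```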

Meanwhile, Gigabyte’s fans try their hardest to exhaust the graphics subsystem’s heat. In the process, they get obnoxiously loud. This is where high-end components and a compact chassis come to a head. It’s just fortunate that, when the P34W v3 does succumb to physics, your experience isn’t ruined—there’s enough performance in reserve at those lower clock rates to continue gaming smoothly.

Conclusion

Gigabyte’s P34W v3 subjected us to a whirlwind of emotions. When it arrived, we didn’t believe that such a small machine could also deliver competent gaming performance. Then we ran the P34W v3 through our benchmark suite and saw how well it cut through the titles we threw at it. But there was no ignoring the noise its cooling fans made as they tried to keep up with the high-end host processor and graphics module. Nor could we excuse that Nvidia’s GeForce GTX 970M is forced down to clock rates lower than the company’s advertised specification due to a cooling solution incapable of handling its thermal output.

After all of that, though, the P34W v3 is fast enough to maintain 30 FPS on battery power in demanding games at their highest detail settings. It stretches well above that level plugged in to the wall. And a quad-core CPU ensures snappy performance even in demanding applications like Photoshop and 3ds Max. What more could you want from a four-pound laptop less than an inch thick?

The example we were sent to review sells for just under $1700. You can push that price upward by adding more solid-state storage. There aren’t a ton of options available for customization in the 14” form factor though, and we’re alright with that. As far as balance goes, this is the right combination of parts for playing the latest games at the IPS panel’s native 1920x1080.

How about the competition? There’s Razer’s new non-touch Blade that sells for $2000, comes with a larger 256GB SSD but less DDR3L memory and weighs slightly more. You can still find the Alienware 14 for sale; it doesn’t even come close, though. Asus once had its own 14” gaming notebook as well. However, that one’s ancient history. Really, Gigabyte’s P34W v3 is in a very exclusive group. Aside from the pricier Razer, matching its graphics performance means stepping up to the 15.6” form factor or larger. And while 13.3” machines with GeForce GTX 960M graphics exist, you can’t hope for the same high-detail experience at 1920x1080 with half as many CUDA cores.

Gigabyte’s P34W v3 doesn’t force you to choose between mobility and performance—it delivers both with aplomb. Still, there’s no escaping physics. So, the sacrifice you make is a loud cooling solution that, even at full speed, cannot keep up as you game. Keep your headphones handy, and be thankful for a two-year warranty. While we’d certainly prefer the P34W v3 to at least enable Nvidia’s rated specifications, Gigabyte itself doesn’t make any clock rate claims. More aggressive fans would only worsen the acoustic situation, too. At least the P34W v3’s gaming performance remains wholly acceptable, even under dialed-back conditions.

For its ability to achieve playable frame rates at its native 1920x1080 resolution (using the most taxing detail settings), the 14” Gigabyte P34W v3 earns our Tom’s Hardware Recommended award.


Chris Angelini is a Technical Editor at Tom's Hardware. Follow him on Twitter and Google+.

Follow Tom's Hardware on Twitter, Facebook and Google+.

Comments
  • royalcrown
UGH, Stupid, Crappy VGA port...Disappear already !
  • Solandri
    I have this laptop. I'm not sure what brightness setting is 200 nits, but I find 50% good for daytime indoors, and 30% comfortable at night. I only feel the need to crank it above 50% if I'm by a sunny window. At 30% I get over 4 hours of battery life running office tasks and web browsing. Turning off hyperthreading in the BIOS extends that to almost 5 hours.

    The thermal throttling is a problem. But other owners have reported eliminating it by repasting and undervolting the CPU. I'm planning to do that, but haven't yet had time.

    The VGA port is there because Gigabyte knows their market. Real gamers don't mind buying a big and heavy gaming laptop. These thin and light gaming laptops are mostly being bought by business people, who use it as their work laptop when they travel, then relax with some gaming in their hotel room. The lid is very nondescript - completely black with only the Gigabyte logo. Anyone looking at it would never guess it's a gaming laptop. Anyway, the VGA port is there so these business people can plug it into older projectors that are ubiquitous in meeting rooms. The laptop also has a HDMI port (and can output to both + screen simultaneously), though I would've preferred Displayport.

    The fan noise can be obnoxious, but Gigabyte has included an app to let you quickly select fan and performance profiles. At the lowest setting ("stealth") the fan noise is completely acceptable in an office environment. Probably too much for a library when gaming. Performance takes a big hit, but it's more than adequate for most of the games I play. If you plan to use the other settings (low, high, max), break out the headphones.

    Others have complained of problems with fit and finish. Some pieces of plastic aren't completely straight, or have gaps. Backlight bleed seems to be a common problem too. I'm fortunate in that mine doesn't have any backlight bleed or problems with fit and finish. I would buy it again in a heartbeat.
  • mapesdhs
    Just wondering, when the GPU has pulled itself back to 848, is it still quicker than a 960M? If not, maybe Gigabyte would have been better off making this 960M-based, though if the mobile versions differ in the way the desktop cards do, perhaps even a reduced 970M is still way quicker than a 960M.

Don't laptops at least have the option of running on full power from the battery, instead of always reducing the clocks, etc.? One should at least have the option of staying at max speed even on battery power, kinda handy if one knows the game time is only going to be 30 mins anyway, short train journey or something.

    Ian.
  • soldier44
    I stopped reading at 1080p.
  • rohitbaran
    @soldier44
    Well, as said in the review, there are not any laptops with that form factor and capability of 1440p gaming since it is really tough to get suitable performance from a card that can suitably fit in that form factor without burning itself out. So what do you expect? Unless you prefer jet engines for cooling fans...
  • 10tacle
rohitbaran said:
    @soldier44 Well, as said in the review, there are not any laptops with that form factor and capability of 1440p gaming since it is really tough to get suitable performance from a card that can suitably fit in that form factor without burning itself out. So what do you expect? Unless you prefer jet engines for cooling fans...


    Thank you. Not to mention that generally mobile GPUs have to run at lower clock settings, have cut down shader and texture units, have less memory, and have a cut memory bus all to help keep not only the temps down but the power use down as well. A quick comparison link to a 970M vs. 970:

    http://www.tomshardware.com/news/nvidia-geforce-maxwell-mobile-gtx-970m-gtx-980m,27833.html

    But with that said, these days I find it hard to justify spending big bucks on a mobile PC gaming solution. I first and last spent nearly $2k on a high end Dell Alienware gaming laptop about 7 years ago (with a 1920x1200 17" display) and will never do it again. No way to upgrade and every new game out continued to need to be detuned in quality. Due to that, portable gaming for me eventually got transferred to the PS3 and more recently the PS4.
  • TallestJon96
It would be nice to get the 970m as a desktop card (960 ti). It seems nearly perfect for 1080p
  • Traciatim
    Quote:
    It would be nice to get the 970m as a desktop card (960 ti). It's seems nearly perfect for 1080p


    The 970m already has desktop equivalents in the 660ti and 760. The desktop 960 already outmatches the 970m by a good amount, the other way you can get a 750ti for cheap 1080p gaming with a good amount of settings turned up.

    There are no desktop 900 series cards yet as slow as the 970m.
  • fimbulvinter
    Quote:
    The VGA port is there because Gigabyte knows their market. Real gamers don't mind buying a big and heavy gaming laptop. These thin and light gaming laptops are mostly being bought by business people, who use it as their work laptop when they travel, then relax with some gaming in their hotel room. The lid is very nondescript - completely black with only the Gigabyte logo. Anyone looking at it would never guess it's a gaming laptop.


    I have an older version of this model. Exact same chassis but previous model CPU and GPU.

    But what you said is why I got it. This is a 70% work 30% gaming system for me and I honestly don't think there is too many other gaming laptops out there that don't scream GAMING LAPTOP with garish case lights and logos.

    Although the thing really does scream GAMING LAPTOP when those fans spin up lol
  • Manticorp
    I have this laptop - I chose it over the Razer Blade because it was ~£600 cheaper AND I wanted the 1080p display over the 1800p display, and over the MSI ghost pro because I wanted something 14 inch and a bit more inconspicuous - and I have to say, it suits me down to the ground.

    Regarding gaming, it's been able to handle everything I throw at it enormously well. I play Far Cry 4 on a 3K Dell U2515H at max settings and it handles it fine, not an ounce of stutter.

    I also use it at work every day, the battery lasts a good 4-5 hours for normal chores, and it's so unassuming and professional looking that I don't feel conspicuous using it. I have to plug it in if I'm out and about and want to watch a movie, as it'll only go about 1.5-2 hours then.

    But then I go home, plug it into a nice big screen, and that's when the real playtime starts.

    I found the Gigabyte P34W to be the only laptop to offer sleek, inconspicuous looks with extremely good gaming power underneath in a relatively affordable package.

    I have a colleague who bought a Acer V Nitro - I'm so glad the Gigabyte has white backlit keys instead of RED (urgh).

    I absolutely haven't looked back since buying this laptop - it's simply the best on the market if you want to use it for gaming and for work.
  • SirTrollsALot
    Looks solid! Go here for more info http://forum.notebookreview.com/threads/gigabyte-p34w-v3-phantasus-owners-lounge.772517/

    Also maybe check Prema mods once in a while for bios. I got it for my Gigabyte P27g v 2 (Cleo chassis) works like a charm!
    https://biosmods.wordpress.com/
  • Frozen Fractal
    Why doesn't Tom's include more games?
  • jkteddy77
    Quote:
    The 970m already has desktop equivalents in the 660ti and 760. The desktop 960 already outmatches the 970m by a good amount, the other way you can get a 750ti for cheap 1080p gaming with a good amount of settings turned up. There are no desktop 900 series cards yet as slow as the 970m.


Such a lie... the 970m lies between the 960 and 970, and could be compared to a stock clock 770.
  • jkteddy77
    My P34Wv3 3k doesn't do the battery light deal when the laptop is off... really wish it did, as that's a cool feature.
Anyone know how to increase the performance of this laptop when it's on battery? The GPU utilization doesn't drop, nor does the CPU clock rate, so it makes no sense to me why its performance in gaming is cut in half for me on battery. I went from 50-75 in BF4 on high in 3k, to 22-35 as soon as I unplug it...
  • Traciatim
jkteddy77 said:
    Quote:
    The 970m already has desktop equivalents in the 660ti and 760. The desktop 960 already outmatches the 970m by a good amount, the other way you can get a 750ti for cheap 1080p gaming with a good amount of settings turned up. There are no desktop 900 series cards yet as slow as the 970m.
    Such a lie... the 970m lies beetween the 960 and 970, and could be compared to a stock clock 770.


Not according to Passmark or Futuremark. Futuremark puts it at just a smidgen less than the desktop 960. Passmark puts it at about 250% slower than the 960. So it really depends on what it's doing, but it is in no way faster than a desktop 960. The 980m is much more in line with what the desktop 770 would be... only you have to pay an arm and a leg for a laptop with a 980m in it, where a desktop 770 at this point is pretty cheap.

    The main point is that the laptop only has portability going for it. In every single other metric the desktop parts win.
  • Solandri
Traciatim said:
jkteddy77 said:
    Such a lie... the 970m lies beetween the 960 and 970, and could be compared to a stock clock 770.
    Not according to Passmark or Futuremark. Futuremark puts it at just a smidgen less than the desktop 960. Passmark puts it at about 250% slower than the 960. So it really depends on what it's doing, but it is in no way faster than a desktop 960.

    The 965m has 1024 shaders, 64 TMUs, 32 ROPs, and a 128-bit memory bus.
    The 960 GTX has 1024 shaders, 64 TMUs, 32 ROPs, and a 128-bit memory bus.
    The 970m has 1280 shaders, 80 TMUs, 48 ROPs, and a 192-bit memory bus.
    The 970 GTX has 1664 shaders, 104 TMUs, 64 ROPs, and a 256-bit memory bus.

    If the 960 GTX does in fact run faster than the 970m, then nvidia must've done a helluva incremental architecture revision. The 965m, 970m, and 970 GTX use a GM204 core. The 960 GTX uses a GM206 core.

    Remember, when nvidia or Intel manufactures processors, they don't know ahead of time whether a given chip is going into a laptop or a desktop. They're tested, and the ones that can run at full performance at lower voltage (hence lower power) are binned as laptop parts. In fact, all the GM204 cores are identical; the lesser models simply have cores, units, and/or lanes disabled, either due to manufacturing flaws or intentionally, which lowers performance.

    They do clock the laptop parts a bit slower, but ever since they introduced Boost I can't find figures for max boost. And I don't think it's a sufficient underclock to offset the 25%-50% extra hardware the 970m has over the 960 GTX.
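    To make the 25%-50% figure concrete, here is a quick sketch of the arithmetic, using nothing but the unit counts listed above (just the ratios of the quoted specs):

```python
# Ratio of GTX 970M unit counts to desktop GTX 960 unit counts,
# taken from the spec list quoted in this thread.
specs = {               # (970M, desktop GTX 960)
    "shaders":  (1280, 1024),
    "TMUs":     (80, 64),
    "ROPs":     (48, 32),
    "bus bits": (192, 128),
}

# Every ratio lands between 1.25x and 1.50x, i.e. 25%-50% more hardware.
ratios = {name: m970 / d960 for name, (m970, d960) in specs.items()}
for name, r in ratios.items():
    print(f"{name}: {r:.2f}x")
```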

    Edit: I would imagine the benchmark discrepancy has more to do with the CPU. Laptop CPUs are still typically clocked around 2.x GHz, while desktop CPUs are clocked at 3.x GHz. So if the benchmark relies on the CPU at all (I believe the 3D physics portion of the Futuremark scores is CPU-dependent), then it will favor the desktop.
  • jkteddy77
    Quote:
    Not according to Passmark or Futuremark. Futuremark puts it at just a smidgen less than the desktop 960. Passmark puts it at about 250% slower than the 960. So it really depends on what it's doing, but it is in no way faster than a desktop 960. The 980m is much more in line with what the desktop 770 would be... only you have to pay an arm and a leg for a laptop with a 980m in it, whereas a desktop 770 at this point is pretty cheap. The main point is that the laptop only has portability going for it. In every single other metric the desktop parts win.

    Quote:
    111219 said:
    1339761 said:
    Such a lie... the 970m lies between the 960 and 970, and could be compared to a stock-clocked 770.
    Not according to Passmark or Futuremark. Futuremark puts it at just a smidgen less than the desktop 960. Passmark puts it at about 250% slower than the 960. So it really depends on what it's doing, but it is in no way faster than a desktop 960.
    The 965m has 1024 shaders, 64 TMUs, 32 ROPs, and a 128-bit memory bus. The 960 GTX has 1024 shaders, 64 TMUs, 32 ROPs, and a 128-bit memory bus. The 970m has 1280 shaders, 80 TMUs, 48 ROPs, and a 192-bit memory bus. The 970 GTX has 1664 shaders, 104 TMUs, 64 ROPs, and a 256-bit memory bus. If the 960 GTX does in fact run faster than the 970m, then nvidia must've done a helluva incremental architecture revision. The 965m, 970m, and 970 GTX use a GM204 core. The 960 GTX uses a GM206 core.


    As these specifications show, it's easy to see the 970m is superior to all but the desktop 970.
    Benchmarks mean nothing against desktops simply because the laptop's CPU hurts the score. If you JUST looked at the graphics score, you'd see the 970m scoring quite a bit higher.

    The 970m almost equals the 770. I'd say it's more in line with the R9 280x in terms of performance. It really is only 10-15 fps away from matching my desktop's R9 290 in most titles.
    Trust me, no 750ti-level GPU could run Battlefield 4 on High-Ultra settings at 2880x1620 at 60 fps like my 970m is right now.

    The 980m is actually credited as being on par with the desktop 970. The 980m is probably dead-on with my R9 290 in most titles. Is it worth the money? It wasn't to me, since the 980m version of this laptop would be $500 more, so I plan on putting that saved money toward a 980ti or Fury X, or waiting for Pascal. Much better investment.
  • Traciatim
    Quote:
    The 970m almost equals the 770. I'd say it's more in line with the R9 280x in terms of performance.


    So instead of just guessing, why not look it up? Here is the 3DMark Extreme result for the fastest mobile CPU/970m combo I could find, compared against a normal 4690k and a 280x.

    If you notice, the physics scores are actually closer to each other than most of the graphics scores. Mobile CPUs are actually pretty solid.

    Look at the separate graphics tests... the disparity ranges from 37.9% up to 69.5% in favor of the 280x. You vastly overestimate the 970m.

    http://www.3dmark.com/compare/3dm11/9128109/3dm11/9581064
  • jkteddy77
    111219 said:
    Quote:
    The 970m almost equals the 770. I'd say it's more in line with the R9 280x in terms of performance.
    So instead of just guessing, why not look it up? Here is the 3DMark Extreme result for the fastest mobile CPU/970m combo I could find, compared against a normal 4690k and a 280x. If you notice, the physics scores are actually closer to each other than most of the graphics scores. Mobile CPUs are actually pretty solid. Look at the separate graphics tests... the disparity ranges from 37.9% up to 69.5% in favor of the 280x. You vastly overestimate the 970m. http://www.3dmark.com/compare/3dm11/9128109/3dm11/9581064


    http://gpuboss.com/gpus/Radeon-R9-280X-vs-GeForce-GTX-970M
    Comes pretty close, to me.
    NotebookCheck themselves show that the 970m is just a step below the R9 280x, and they post benchmark results to back it up. This is a very trusted source.

    http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html
    Guess I can guess its performance pretty well

    What matters is actual gaming performance, unless you're using it for some other specific purpose (in which case you'd get a workstation laptop). Most of these benchmarks test areas that notebook GPUs aren't designed to handle.
    Notice that their numbers show the 280x and 960 scoring higher (in fact the 960 performs better in some 3DMark tests than the 280x, but they still ranked it lower), yet they rank the GPUs by their gaming performance.
    Wish I could find a controlled gaming test of the 970m vs desktop GPUs, but I can't find any.

    Don't know if you've ever used a 970m, but I'm playing BF4's first mission right now at 70+ fps at 2880x1620 (3K) with a mix of high and ultra textures (no MSAA). Very impressive, if you ask me.
    It is only 9-10 fps behind my R9 290 in the BF4 test range, both at the same spawn spot with all-ultra settings at 1920x1080.
  • Traciatim
    1339761 said:
    http://gpuboss.com/gpus/Radeon-R9-280X-vs-GeForce-GTX-970M Comes pretty close, to me. NotebookCheck themselves show that the 970m is just a step below the R9 280x, and they post benchmark results to back it up. This is a very trusted source. http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html Guess I can guess its performance pretty well. What matters is actual gaming performance, unless you're using it for some other specific purpose (in which case you'd get a workstation laptop). Most of these benchmarks test areas that notebook GPUs aren't designed to handle. Notice that their numbers show the 280x and 960 scoring higher (in fact the 960 performs better in some 3DMark tests than the 280x, but they still ranked it lower), yet they rank the GPUs by their gaming performance. Wish I could find a controlled gaming test of the 970m vs desktop GPUs, but I can't find any. Don't know if you've ever used a 970m, but I'm playing BF4's first mission right now at 70+ fps at 2880x1620 (3K) with a mix of high and ultra textures (no MSAA). Very impressive, if you ask me. It is only 9-10 fps behind my R9 290 in the BF4 test range, both at the same spawn spot with all-ultra settings at 1920x1080.


    GPU Boss is a horrible, ad-clickbait farce of a site. I don't care what the benchmarks from your machine say. NotebookCheck is much more reliable, but even they don't include reference desktop cards, so you can't see how poorly the notebook cards do in comparison, whether on cost/performance or straight-up performance across naming conventions.
  • jkteddy77
    111219 said:
    1339761 said:
    http://gpuboss.com/gpus/Radeon-R9-280X-vs-GeForce-GTX-970M Comes pretty close, to me. NotebookCheck themselves show that the 970m is just a step below the R9 280x, and they post benchmark results to back it up. This is a very trusted source. http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html Guess I can guess its performance pretty well. What matters is actual gaming performance, unless you're using it for some other specific purpose (in which case you'd get a workstation laptop). Most of these benchmarks test areas that notebook GPUs aren't designed to handle. Notice that their numbers show the 280x and 960 scoring higher (in fact the 960 performs better in some 3DMark tests than the 280x, but they still ranked it lower), yet they rank the GPUs by their gaming performance. Wish I could find a controlled gaming test of the 970m vs desktop GPUs, but I can't find any. Don't know if you've ever used a 970m, but I'm playing BF4's first mission right now at 70+ fps at 2880x1620 (3K) with a mix of high and ultra textures (no MSAA). Very impressive, if you ask me. It is only 9-10 fps behind my R9 290 in the BF4 test range, both at the same spawn spot with all-ultra settings at 1920x1080.
    GPU Boss is a horrible, ad-clickbait farce of a site. I don't care what the benchmarks from your machine say. NotebookCheck is much more reliable, but even they don't include reference desktop cards, so you can't see how poorly the notebook cards do in comparison, whether on cost/performance or straight-up performance across naming conventions.


    I agree about GPUboss; they never document their results right either.
    As for NotebookCheck, if they indeed use aftermarket or overclocked desktop cards, that makes my 970m look even faster xD