LG G6 initial review: A masterful marriage of hardware and software

The G6 comes at a crucial time for LG. The company made a loss last year following disappointing sales of the G5 and V10, and its new handset is a bold step in the right direction, though that alone doesn't always save a company's fortunes. To cut to the chase, the LG G6 is an astonishing smartphone that easily holds its own against the best smartphones ever made.

Its taller 18:9 screen is easy to get used to, and while many operations require two hands given the 5.7in screen, it is comfortable to hold, scroll and use with one hand – just as the marketing would have you believe. LG has wisely ditched the gimmicky leather of the G4 and the cool but ultimately unsuccessful modularity of the G5 to craft the best LG phone ever. And there have been a lot of them.

The design has been overhauled again following the leather-clad G4 and the modular G5, moving to an arguably more uniform metal and glass affair. LG's Friends didn't last long, did they?

We've tested the G6 rigorously since we got our hands on the unit before its announcement at MWC 2017, and it performs just as well as the best smartphones on the market. Here's our full review of the LG G6.

Note: The version we have tested is a pre-production unit. Once we have received and tested a UK G6 retail unit we will amend any necessary sections with any differences found, as well as a final verdict.

See also: LG announces the G6 at MWC

LG G6 hands-on review: UK price and availability

MobileFun has revealed the UK price of the LG G6: £699. You can pre-order the LG G6 from its site now. The release date is unconfirmed, though we expect it to be in April or May, and we expect it to be available on all four major UK networks.

Should the official price be £699 in the UK, we are a touch disappointed. We thought LG would be wise to undercut Samsung to boost its chances of strong sales, but it looks like it's gone all-out premium on pricing as well as build.

LG G6 hands-on review: Design and build

So LG has gone big, but it’s the screen, not the handset itself, that’s grown. The G6 boasts an 18:9 screen, expanding the display from the traditional confines of 16:9. This leaves it with a 5.7in Quad HD display. It looks seriously good. 

Alongside that wonderful display is a design that conforms, unlike the modular G5 and the leather-clad G4. The G6 takes a leaf out of the iPhone 4’s book with a solid aluminium frame and Gorilla Glass on the front and back. It comes in platinum, white and black, with only the latter being a true fingerprint magnet.

The refined design is simpler and more elegant, with the dual rear cameras and fingerprint sensor that acts as the power/lock button sitting flush with the body. The bottom edge houses the USB-C port (fully waterproof), single speaker and mic. The right edge is smooth and clear save for the SIM tray, while the left edge has the two volume keys. The top edge has that very welcome 3.5mm headphone jack.

Even though the metal and glass frame isn’t entirely original, the rounded design is made all the more striking thanks to the rounded corners of the actual display as well. It’s a clever detail that doesn’t negatively affect use while accentuating the G6’s thin bezels and unusually tall screen. It works really well.

The black model sports this look slightly better than the white or platinum models, though. There is a tiny black gap between the rounded screen and the coloured bezels, and on the white and platinum models it's just wide enough to be constantly visible. On the black model it's effectively invisible, which makes for an even better visual impression.

So, while we prefer the platinum model for looks and how it hides fingerprints, the black one wins because the rounded screen simply looks better on it.

LG said that its goal with the G6, after extensive customer research, was to make a phone with a huge screen that you could still comfortably use with one hand. The problem is that this is basically impossible, even for those with large hands. Where the company has succeeded, though, is in making the G6 perfectly pocket friendly while packing in a screen that's easy to scroll through and hold with a single paw.

This might sound easy to achieve, but it can be rare to find on phablets like the G6. The iPhone 7 Plus, for example, is a through and through two-handed device, and the G6 succeeds in fitting a larger screen than that phone into a smaller overall body.

From the precision cut metal rim to the flat back that still packs in dual cameras and a fingerprint sensor and, of course, the screen, LG has hit a home run with this design. If at first it looks ordinary, in use it really is far from that. No gimmicks, no leather, no risks – just incredible build quality that positively affects daily use.

LG G6 hands-on review: Features and specifications

In the tech press, a new high-end smartphone usually takes a fair (and unfair) battering simply because of its specs. To us, the G6 feels like a marriage of hardware and software that transcends this sort of nitpicking because it works so well as a cohesive whole. The flak the G6 has taken for using the Snapdragon 821 is a little unfair given how well it performs. Here we'll break down the features and specifications so you can decide for yourself what you make of LG's decisions.

Processor

One point of contention among the tech community is LG’s decision to go with Qualcomm’s Snapdragon 821 processor rather than its latest 835 that we expect to see in the Galaxy S8. 

The 821 is in its third generation, and LG told us in an interview that it therefore has more expertise in how to optimise the user experience (UX) and implied the 835 wouldn’t have brought any more noticeable advantages.

It's true that the 835 might bring noticeable battery life gains when we see it in the Galaxy S8, but without knowing exactly why LG chose to forgo it, it's hard to fully criticise the decision. The 821 is, after all, doing just fine powering the Google Pixel.

The G6 can handle some pretty heavy multitasking. We swiped between games, video streams, Spotify, document editing and more, and the phone barely broke a sweat. Very occasionally in apps (Spotify, for example) we noticed a tiny lag on album art when switching songs, but live streaming services often do this even on high-end phones.

We can’t imagine anyone having complaints about the G6’s performance, and the benchmarks below reflect how it holds its own against the best of the best. In fact, it is one of the best.

You'll notice that some of the frame rate scores are lower than those of the G6's market rivals; the OnePlus 3T and Google Pixel use the same 821 processor but post better scores.

We put this down to the G6's higher Quad HD resolution compared with the Full HD displays of those rivals, and the processor needing to push that bit harder to keep up. At no point during gaming, for example, did the frame rate lag, but if top specs that give maximum possible performance are your thing, you may want to take this into consideration.
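For a rough sense of the extra work involved, here's a quick back-of-the-envelope comparison of the pixel counts (our own sketch, assuming the rivals' standard 1920 x 1080 panels):

# Rough pixel-count comparison: why the Quad HD G6 posts lower frame
# rates than Full HD phones using the same Snapdragon 821.
g6_pixels = 2880 * 1440          # LG G6, 18:9 Quad HD
full_hd_pixels = 1920 * 1080     # OnePlus 3T / Google Pixel panels

print(f"G6 pixels per frame:      {g6_pixels:,}")
print(f"Full HD pixels per frame: {full_hd_pixels:,}")
print(f"Ratio: {g6_pixels / full_hd_pixels:.1f}x")  # ~2.0x the fill work per frame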

Display

The display is a 5.7in Quad HD panel with a resolution of 2880 x 1440 – it's stunning. The extra pixels in that first figure account for the 18:9 aspect ratio, which you will get used to much quicker than you might think.

Touch latency is very good, with a very fast response; it's still a touch (a tiny touch) behind the iPhone 7, but very comparable to any other Android phone we have used. It never affected our use of the device.

Aside from the 564ppi, the extra height of the 18:9 aspect means the whole experience of using the G6 is improved from the G5. If that sounds a bit too vague, it’s because you really need to get your hands on it to see what we mean. The extra height just makes sense in the slim form factor, and you really will use it with one hand. This impression is also intrinsically linked with the changes to the software, which we’ll come onto.
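If you want to sanity-check that pixel density yourself, the maths is straightforward (a quick sketch, assuming the pixels are spread evenly across the full 5.7in diagonal):

# Pixel density from resolution and diagonal: sqrt(w^2 + h^2) / inches
import math

width, height, diagonal_in = 2880, 1440, 5.7
ppi = math.hypot(width, height) / diagonal_in
print(f"{ppi:.0f} ppi")  # ~565 ppi, in line with the quoted 564ppi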

The screen also retains the always-on functionality from the G5, with a slightly altered setup lower down on the screen with a new default font. It still displays the time, date and apps that you have notifications for.

The rounded corners really help the display; they make it feel more contained, almost as if the display has been penned in for fear of it becoming too large. This is to positive effect, and we found that everything from homescreen swipes to typing long messages was a joy on the larger display. There was a lot of room for error here, but in terms of pure presentation, LG has absolutely nailed it.

We delve more into how the aspect ratio affects the software in the software section of this review, below.

Cameras

The LG G5 impressed us with its dual camera setup that enabled wide-angle shots. The G6 retains this, with two 13Mp rear facing cameras. The wide-angle lens offers a 125-degree angle and the standard has optical image stabilisation. LG claims it has found an algorithm that lets you zoom between the two cameras smoothly without a software jerk. We found, unfortunately, that this isn’t the case. There’s still a tiny flicker as the lenses switch over.

These cameras can record up to 60fps at full HD quality, and in ultra HD at 30fps. HDR support is only for still images, not video, but this is quite usual for smartphones – even the high-end ones.

We found general image quality to be excellent. The display is a joy to use as a viewfinder given its size, and the resulting images themselves show a superior handling of composition.

The G6's wide-angle lens option remains the best of its kind compared with rivals, and the user-friendly presentation in Auto mode means you can easily and quickly switch between the two cameras. Check out the comparison shot below for an idea of the perspective changes you get, though be aware that the full wide angle does create a slight fishbowl effect at the edges of the image.

 

The camera is also good at handling macro-style shots, and most casual users won’t need to stray into the manual mode, though if you do, it’s well set up.

Something that's pushed more in the marketing is the camera's Square mode, which caters to Instagram-friendly shots. It also fits in nicely with the G6's square-themed GUI. There are four shooting options in Square mode: Snap, Grid, Guide and Match. Here's a quick rundown of what they do, with examples below the explanations.

Snap splits the screen in half and means once you’ve taken a picture you can preview it straight away whilst the second half of the screen remains a viewfinder to take another shot in. Handy if you’re trying to get a perfect picture of an important subject (potentially your own face). 

Grid is a quick way to create a four-image grid of pictures that is itself a square. It's the simplest and most effective mode.

Guide is where it gets slightly too clever for its own good, with the option to pick an image from your gallery to act as a ghosted guide overlaid in the viewfinder, helping you compose another picture to match. It ends up overcrowding the screen and is confusing to use.

Match captures two images as in Grid, but here the idea is to get slightly kooky and combine, as LG suggests, candyfloss with a vapour trail to create a trick image. It's very hard to use and even harder to get a decent shot.

They are fun modes to play around with, but they're a distraction from the very good sensor that takes normal photos very well. Still, LG is trying to please the Instagram generation, and it has most likely succeeded there.

Storage and RAM

All variants of the LG G6 have 4GB of RAM as standard, but some features differ by region. The European version of the LG G6 has 32GB of storage plus a microSD slot for expansion of up to 2TB. The same applies to the US version.

The Korean variant will have 64GB of storage, along with the same microSD support. LG said these differences were down to regional marketing decisions. Hopefully it won't make too much difference given the storage is expandable.

Connectivity and extras

Where regional decisions become a bit more frustrating is in the extras. The US G6 has wireless charging, which adds extra convenience, minimal extra weight and no design changes. However, the Korean and European versions miss out on this handy addition. 

The Korean G6 has Hi-Fi Quad DAC, a component that allows for high quality audio playback. LG told us that it doesn’t cost much more to add this feature, but the US and Europe miss out on it. It referred back to regional decisions on included components, but for us it’s frustrating that the European version misses out on two desirable features.

There will also be a dual SIM version, but don’t expect this to come to the UK or Europe. These three missing features aren’t vital to the G6’s success in the UK, but we’d certainly welcome them and it’s frustrating to see a major phone split its features like this dependent on market. Extra features are universally appreciated.

The G6 does have one trick up its sleeve for all regions though. LG claims it’s the first smartphone to support both Dolby Vision and HDR 10. In basic terms, it’s the first smartphone to theoretically support superior audio-visual standards normally associated with high-end televisions.

We say theoretically because while it supports both, streaming services such as Netflix don’t actually yet offer playback of this combined quality on mobile devices. Remember when everything was ‘HD ready’, before HD actually existed? It’s like that. Watch this space.

An iPhone 7 compared to the LG G6

Where it falls down slightly – but thankfully not too much – is in how it adjusts to display content that is 16:9 or similar by default. For example, Netflix will display video in 16.7:9 on the G6. Swiping down from the top of the screen gives you a green icon; tap that and you have the option to view in 16:9 or expand to the full 18:9. If you opt for the latter, it warns you 'The app's content may not be fully displayed'.

It’s a bit fiddly, and we found it meant having to return to the Netflix homescreen. And, in every option, some form of black bar remained on at least one edge to make sure all the content was still visible. It’s far from ideal if you want to view apps using the full display.
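To put numbers on how much of the panel 16:9 content actually uses, here's a small worked example based purely on the aspect ratios (our own illustration, not LG's figures):

# How 16:9 video fits a 2880x1440 (18:9) panel, held landscape.
screen_w, screen_h = 2880, 1440

# Option 1: keep 16:9 - fit to height, pillarbox the sides
video_w = screen_h * 16 // 9            # 2560px of the width is used
bars = screen_w - video_w               # 320px left over as black bars
print(f"16:9 mode: {video_w}px wide, {bars}px of pillarboxing")

# Option 2: expand to 18:9 - fills the panel, but the app warns that
# content 'may not be fully displayed'
print("18:9 mode: full 2880px width, at the cost of cropping or stretching")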

LG told us that it was working directly with Netflix to sort this out and bring a seamless 18:9 video experience to the G6, but we remain worried that with the plethora of services and games out there, the G6 might be doomed to a life of black bar playback. Hopefully not.

Battery life

The G6 has a 3,300mAh non-removable battery. This might bug fans of the G4 and G5, whose batteries you could remove, but in reality it's the correct decision. The battery is big enough to easily last a full day, and the bundled fast charger continues the pleasing Android trend of above-average battery life and very fast top-up times.

Our review unit of the G6 was a pre-production model, so perhaps the slightly erratic battery life can be put down to that. It was the only area of use we suspected might be improved in the final retail version. We were never left without juice, but some days the G6 would be on 75% by bedtime with reasonably heavy use (which is outstanding), while other days it would reach that by mid-morning with light use. We'll update this review in due course, after an even longer test period.

Our pre-production model was also a US version, and we can confirm the wireless charging works excellently with a number of third party charging pads and through various cases. It is, though, slower by a long way compared to fast charging via cable.

LG G6 hands-on review: Software

The G6 pleasingly ships with Android Nougat 7.0, but then again it would be a crime if it didn't. LG's overlay has a certain playfulness in its pastel colours, square design focus and rounded edges influenced by the screen. However, it is well refined, with everything from app animations to menus flowing smoothly and without pause.

It takes a bit of getting used to if you’re coming from Samsung’s TouchWiz or pure stock Android, but after a time it’s just as fun and practical to use as them.

The G6’s software has been quite substantially overhauled from the G5’s in order to play nice with the taller 18:9 screen. LG’s own apps such as messaging, weather and calendar have been redesigned to better manage white space and information displayed since there’s more room to play with.

When presented side by side with the G5’s screens, the difference is noticeably positive:

As you can see, apps have more space to work with, and LG has clearly put a lot of effort into a more aesthetically pleasing experience, with attractive, modernised graphics in its main apps.

The camera software too has been redone, with some excellent use of the extra screen space – we love that when taking photos landscape, you get a camera roll of the last few photos taken, rather than the smartphone norm of one tiny thumbnail of the one most recent photo. 

We also welcome LG's decision to let you choose whether to display apps iOS-style on the home screen or store them in an app tray. We don't mind it on iOS, but given the choice on Android, we'll pick the app tray every time.

Multitasking is also good on the G6. As with all Android phones that allow it, you can’t use it with every app, but it’s handy if you want to run two apps simultaneously. It works best though without a keyboard onscreen. As soon as you need it, even the 18:9 aspect can’t cope with the room needed, and multi-window becomes useless. It’s still a feature that we don’t really use, even though some continue to push it.

Nvidia GeForce GTX 1080 Ti

The Nvidia GeForce GTX 1080 Ti has been arguably the most anticipated graphics card of the new year. The 1080 series is Nvidia’s top-end GPU for gamers, and this year’s iteration even outpaces the Titan X in several regards.

Priced at $699 or £699 (about AU$930) – the same as the pre-discounted Nvidia GeForce GTX 1080 – this card offers stunning performance that’s often equal to the Titan X. Beyond being Nvidia’s most impressive GPU to date, it’s a showcase of how far the company’s Pascal architecture has come in less than a year.

Specifications

The Nvidia GeForce GTX 1080 Ti is packing 3584 CUDA cores, 224 texture units and 88 ROPs. It comes with just a notch less video RAM than the god-like Titan X, but the 1080 Ti’s 11GB complement of GDDR5X VRAM is tuned to a faster 11Gbps – clearly Nvidia is a fan of Spinal Tap – making this Nvidia’s quickest Pascal card.

There’s no question the Nvidia GTX 1080 Ti is a performance beast, running at a base 1480MHz frequency and 1582MHz when boosted.

True, the GTX 1080 boosts to a higher 1,733MHz; however, the Ti model is running with more cores and VRAM, boosting performance in both benchmarks and real-world gaming.

Design and cooling

If you’ve seen one of Nvidia’s self-produced Pascal graphics cards, you’ve seen them all. Externally, the original GeForce GTX 1080 and Nvidia’s latest Ti (or tai as the company pronounces it) card are virtually indistinguishable.

Not that we’re complaining. Nvidia’s design for its Founders Edition cards was a hit when it first debuted and the modern, angular look still appeals. One little change users might notice is the lack of a DVI port; don’t fret though, as the GTX 1080 Ti comes with an adapter you can plug into a DisplayPort.

Getting rid of the DVI port has made more room for a better cooling solution. In fact, Nvidia says its new high airflow thermal solution provides twice the airflow area of the GeForce GTX 1080’s cooling system.

Our testing corroborates those claims. Even at a full load, the GTX 1080 Ti stayed at a cooler 75 degrees Celsius while the GTX 1080 peaked at 82 degrees Celsius. Of course, you can push both cards to the edge of 91 degrees Celsius by adjusting the power limiter and overclocking the GPUs.

Performance

4K gaming at 60fps with a single graphics card has long been the promised land for gamers, and the Nvidia GTX 1080 Ti is the closest we’ve come to it. Getting a silky-smooth gaming experience in Ultra HD is highly dependent on which games you play, however.

We were able to achieve frame rates in the 50 to 60 range with games like Battlefield 1 and Doom. That’s not an easy feat – but these are also two of the most optimized games in existence right now. 

Likewise, Nvidia released a DirectX 12-optimized Game Ready Driver that helped us run Rise of the Tomb Raider at a solid 40fps – not quite matching the Titan X's 57fps, but double the 20fps previously seen on the GTX 1080.

Those are best-case scenarios, and you shouldn't think the Nvidia GTX 1080 Ti is a bulletproof solution for 4K gaming. Total War: Attila, for one, brought the Ti-powered GPU to its knees as it struggled to render the game at a just-playable 26fps.

Moving on to the pure-performance 3DMark: Fire Strike Ultra benchmark, the GTX 1080 Ti completely demolishes its forebears by 2,000 to 4,000 points across the board. More impressively, this ultimate GeForce skips ahead of the Titan X too.

The most mind-blowing bit is that the Nvidia GTX 1080 Ti is doing all of this without any overclocking. Nvidia claims the card can be pushed as far as 2GHz; we haven't pushed it quite that far yet, but we have been able to achieve a stable system in the 1.7-1.8GHz range.

Final verdict

From top to bottom, this is Nvidia's most impressive graphics card to date. It's remarkably more powerful than the original GTX 1080 while matching the Titan X's gaming performance. You're also looking at one of Nvidia's coolest-running cards, with overclocking potential for days.

If you've been pining for Nvidia's top GPU but couldn't stomach the Titan X's $1,200 (£1,099, AU$1,599) price, the Nvidia GTX 1080 Ti looks much more appetizing at $699 or £699 (about AU$930).

ADATA XPG SX8000 M.2 NVMe SSD review: A more affordable NVMe option

M.2 NVMe SSDs such as ADATA’s XPG SX8000 are a game-changer for PCs. There is simply no other upgrade that offers as dramatic an improvement to the feel and response of your system. If you’re moving from a hard drive, you’ll be astounded. If you’re moving from a SATA SSD, you’ll still be extremely pleased.

Specs and pricing

The SX8000 is a four-lane, PCIe 3.0 (roughly 1GBps per lane) M.2 2280 (22mm wide, 80mm long) SSD using 3D (layered) TLC NAND technology. To mitigate the relatively slower writes of TLC (triple-level cell/three-bit) NAND, a DRAM cache is employed, as well as some of the TLC treated as SLC for a secondary cache.
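For a rough idea of what that interface offers before protocol overhead, here's the lane maths compared with SATA III (a back-of-the-envelope sketch; real-world throughput lands well below these ceilings):

# Theoretical ceiling of a PCIe 3.0 x4 link vs SATA III
pcie3_per_lane_gts = 8.0                                      # 8 GT/s per lane
lanes = 4
pcie_ceiling_gbs = pcie3_per_lane_gts * lanes * (128 / 130) / 8   # 128b/130b encoding, GB/s
sata3_ceiling_gbs = 6.0 * (8 / 10) / 8                            # 8b/10b encoding, GB/s

print(f"PCIe 3.0 x4 ceiling: ~{pcie_ceiling_gbs:.2f} GB/s")   # ~3.94 GB/s
print(f"SATA III ceiling:    ~{sata3_ceiling_gbs:.2f} GB/s")  # ~0.60 GB/s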

Stacked/layered/3D NAND is used for storage in the SX8000, but there’s cache to keep performance brisk. Note: The digital removal of chip markings was done by ADATA, not PCWorld.

ADATA quoted us prices of $76 for the 128GB version, $110 for the 256GB version, and $208 for the 512GB version. The prices we saw on Amazon were a bit higher (with the 512GB listed at $250, for example), but still significantly lower than anything other than the Samsung 960 EVO, a drive we haven’t tested yet. A 1TB version of the SX8000 was mentioned in the PR materials, but hasn’t materialized yet.
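Working those quoted prices out per gigabyte shows the usual sweet spot at the higher capacities (a quick sketch using ADATA's quotes, not street pricing):

# Cost per gigabyte at ADATA's quoted prices
quoted = {128: 76, 256: 110, 512: 208}   # capacity (GB): price (USD)
for gb, usd in quoted.items():
    print(f"{gb}GB: ${usd}  ->  ${usd / gb:.2f}/GB")
# 128GB: $0.59/GB, 256GB: $0.43/GB, 512GB: $0.41/GB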

Performance

The XPG SX8000 performs more on par with OCZ’s RD400 and Plextor’s M8Pe than with Samsung’s killer 960 Pro. But even the slowest NVMe SSD is roughly two to three times the speed of a SATA SSD, so that’s no big knock. Installed in a system, it’s difficult to tell NVMe SSDs apart—they’re so fast, only the benchmarks reveal the difference.

While it can’t match the Samsung 960 Pro in sustained throughput, the SX8000 turned in very competitive numbers with sets of small files and folders. Shorter bars are better.

Though not as fast as the OCZ RD400, the ADATA SX8000 is still very fast, and significantly less expensive. Larger bars are better.

The charts above reiterate the fact that the SX8000 isn’t one of the faster NVMe SSDs we’ve tested. But the fact is, it’s still very speedy and provides the same startling subjective increase in performance the others do. With an NVMe SSD on board, disk I/O basically ceases to be an issue.

Note that we also ran AS SSD, which showed the SX8000 performing about the same as in CrystalDiskMark. This generally means the drive is ignoring the FUA (Forced Unit Access) command that AS SSD issues, which disables all caching. As it's a matter of a few tens of milliseconds before caches are emptied, there's little chance of data loss. The FUA issue is the reason we don't quote AS SSD scores for NVMe drives, which, unlike SATA SSDs, often obey the command.

Conclusion

Not everyone can use an NVMe SSD. First off, you must have an M.2 slot with four PCIe lanes. If you have PCIe slots, you can use a PCIe M.2 adapter card. Secondly, to get the most out of the drive, you want to run your operating system on it, so you must have a system that recognizes the drive and can boot from it.

That said, while the XPG SX8000 isn't the fastest NVMe drive we've tested, it is affordable and absolutely fast enough to give you that NVMe thrill. It's also warrantied for five years and/or 80TBW (terabytes written) per 128GB of capacity, which is comforting. Most users never come close to writing that much data.
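To put that endurance rating in perspective, here's a rough calculation of how long it would take to exhaust it on the 512GB model (our own sketch, assuming a fairly heavy 40GB of writes per day):

# How long 80TBW-per-128GB lasts at an assumed 40GB written per day
capacity_gb = 512
tbw_rating = 80 * (capacity_gb / 128)    # 320 TBW for the 512GB model
daily_writes_gb = 40                     # assumed heavy daily workload

years = (tbw_rating * 1000) / daily_writes_gb / 365
print(f"{tbw_rating:.0f} TBW -> roughly {years:.0f} years at {daily_writes_gb}GB/day")
# ~22 years - far beyond the five-year warranty window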

Nvidia’s new GeForce driver delivers a huge boost to DX12 games

Alongside the release of its new GeForce GTX 1080 Ti, Nvidia is unleashing a fresh graphics card driver which promises a performance boost in DirectX 12 games.

Those running DX12 games (under Windows 10) will benefit from driver optimizations which according to Nvidia will deliver an average performance boost of 16% across these various titles.

The biggest gains are to be seen in Rise of the Tomb Raider, with a rather incredible 33% boost to the frame rate, and Nvidia also boasts that Hitman will get a similarly chunky 23% improvement.

Gears of War 4 will be boosted to the tune of 10%, Ashes of the Singularity by 9%, and Tom Clancy’s The Division will get a more modest increase of 4%. Still, every extra bit of smoothness is welcome, as ever.
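Those per-game figures line up with Nvidia's quoted 16% average, give or take which titles you count (a quick sketch averaging just the five games named above):

# Sanity-checking Nvidia's claimed ~16% average DX12 uplift
gains = {
    "Rise of the Tomb Raider": 33,
    "Hitman": 23,
    "Gears of War 4": 10,
    "Ashes of the Singularity": 9,
    "The Division": 4,
}
average = sum(gains.values()) / len(gains)
print(f"Average uplift across these titles: {average:.1f}%")  # 15.8%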

Ansel antics

Better performance is the key point with this new driver, but those who like to take screenshots of their games will also be interested to learn that Nvidia Ansel support has arrived for another Tom Clancy game, namely Ghost Recon Wildlands.

Ansel (pictured above) is essentially a screen-grabber on steroids, pausing the game at the moment you wish to capture, and then allowing you to enter the game-world and move the camera around in 3D, zoom or reposition it, and get rid of the HUD or any other interface distractions to hopefully procure yourself a cracking image.

The system also makes it possible to save out super-high-resolution screenshots, and to polish them up with post-processing effects – plus you can capture 360-degree panoramic shots to gawk at using a VR headset, should you own one.

Ansel is currently supported in the likes of Dishonored 2, Mirror's Edge Catalyst, Watch Dogs 2, The Witcher 3: Wild Hunt and The Witness, and it'll be coming to other big titles, most notably the upcoming Mass Effect: Andromeda (which is out in a couple of weeks).

Speaking of the latter, yesterday saw Nvidia show off some rather spectacular-looking 4K HDR screenshots taken with Ansel, which are well worth a gander.

Nvidia GeForce GTX 1080 Ti 11GB Review

Nvidia's GeForce GTX 1080 Ti is now the fastest graphics card available, and it's $500 cheaper than the previous champ. Should you buy now, or wait for AMD's Vega?

Nobody was surprised when Nvidia introduced its GeForce GTX 1080 Ti at this year’s Game Developer Conference. What really got gamers buzzing was the card’s $700 price tag.

Based on its specifications, GeForce GTX 1080 Ti should be every bit as fast as Titan X (Pascal), or even a bit quicker. So why shave off so much of the flagship’s premium? We don’t really have a great answer, except that Nvidia must be anticipating AMD’s Radeon RX Vega and laying the groundwork for a battle at the high-end.

Why now? Because GeForce GTX 1080 Ti is ready today, Nvidia tells us. And because Vega is not, we’d snarkily add.

Turning A Zero Into A Hero

There are currently two graphics cards based on Nvidia’s GP102 processor: Titan X and Quadro P6000. The former uses a version of the GPU with two of its Streaming Multiprocessors disabled, while the latter employs a pristine GP102, without any defects at all.

We’re talking about a 12 billion transistor chip, though. Surely yields aren’t so good that they all bin into one of those two categories, right? Enter GeForce GTX 1080 Ti.

The 1080 Ti employs a similar Streaming Multiprocessor configuration as Titan X—28 of its 30 SMs are enabled, yielding 3584 CUDA cores and 224 texture units. Nvidia pushes the processor’s base clock rate up to 1480 MHz and cites a typical GPU Boost frequency of 1582 MHz. In comparison, Titan X runs at 1417 MHz and has a Boost spec of 1531 MHz. 
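Those core counts fall straight out of the SM arithmetic for GP102, where each Pascal Streaming Multiprocessor carries 128 CUDA cores and 8 texture units (a quick sketch of that maths):

# GP102 with 28 of its 30 SMs enabled (GTX 1080 Ti / Titan X configuration)
enabled_sms = 28
cuda_cores = enabled_sms * 128      # 3584 CUDA cores
texture_units = enabled_sms * 8     # 224 texture units
print(cuda_cores, texture_units)    # 3584 224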

Where the new GeForce differs is its back-end. Both Titan X and Quadro P6000 utilize all 12 of GP102’s 32-bit memory controllers, ROP clusters, and slices of L2 cache. This leaves no room for the foundry to make a mistake. Rather than tossing the imperfect GPUs, then, Nvidia turns them into 1080 Tis by disabling one memory controller, one ROP partition, and 256KB of L2. The result looks a little wonky on a spec sheet, but it’s perfectly viable nonetheless. As such, we get a card with an aggregate 352-bit memory interface, 88 ROPs, and 2816KB of L2 cache, down from Titan X’s 384-bit path, 96 ROPs, and 3MB L2.

Left alone, that’d put GeForce GTX 1080 Ti at a slight disadvantage. But in the months between 1080’s launch and now, Micron introduced 11 Gb/s (and 12 Gb/s, according to its datasheet) GDDR5X memories. The higher data rate more than compensates for the narrower memory bus: on paper, GeForce GTX 1080 Ti offers a theoretical 484 GB/s to Titan X’s 480 GB/s.
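That bandwidth comparison works out as bus width times per-pin data rate, divided by eight to get bytes (a quick sketch of the maths):

# Peak memory bandwidth: bus width (bits) * data rate (Gb/s per pin) / 8
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

print(f"GTX 1080 Ti: {bandwidth_gb_s(352, 11):.0f} GB/s")  # 484 GB/s
print(f"Titan X (P): {bandwidth_gb_s(384, 10):.0f} GB/s")  # 480 GB/s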

Of course, eliminating one memory channel affects the card’s capacity. Stepping down from 12GB to 11GB isn’t particularly alarming when we’re testing against a 4GB Radeon R9 Fury X that works just fine at 4K, though. Losing capacity is also preferable to repeating the problem Nvidia had with GeForce GTX 970, where it removed an ROP/L2 partition, but kept the memory, causing slower access to the orphaned 512MB segment. In this case, all 11GB of GDDR5X communicates at full speed.


Meet The GeForce GTX 1080 Ti Founders Edition

During its presentation, Nvidia announced that its Founders Edition cooler was improved compared to Titan X’s. Looking at this card head-on though, you wouldn’t know it. It looks identical except for the model name. The combination of materials (namely cast aluminum and acrylic) is also the same, as is the commanding presence of its 62mm radial cooler.

Nvidia’s reference GeForce GTX 1080 Ti is the same size as its Titan X (Pascal). The distance from the slot cover to end of the cooler spans 26.9cm. From the top of the motherboard slot to the top of the cooler, the card stands 10.5cm tall. And with a depth of 3.5cm, it fits nicely in a dual-slot form factor. We weighed the 1080 Ti and found that it’s a little heavier than the Titan X at 1039g.

The top of the card looks just as familiar as its front, sporting a green back-lit logo, along with one eight- and one six-pin power connector. The bottom is even less interesting; there’s really nothing to say about its plain cover.

Some hot air may escape the back of the card through its open vent. But the way Nvidia designed its thermal solution ensures most of the waste heat exhausts out the back.

Nvidia improves airflow through the cooler by removing its DVI output. You get three DisplayPort connectors and one HDMI 2.0 port, while a bundled DP-to-DVI dongle covers anyone still using that interface.

Cooler Design

We had to dig deep in our tool box because Nvidia primarily uses thin 0.5mm screws, which fit into the mating threads of special piggyback screws that sit below the backplate. These uncommon M2.5 hex bolts also attach the card’s cover to its circuit board.

One improvement to the GeForce GTX 1080 Ti became apparent as we started taking the card apart: Nvidia mated the PWM controller on the back of its PCA with part of the backplate using thick thermal fleece. This is a material we don’t see used very often, and it’s meant to augment heat dissipation. The move would have been even more effective if Nvidia cut a hole into the plastic sheet covering the backplate in this area.

The exposed board reveals two areas on the left labeled THERMAL PAD 1 and THERMAL PAD 2. However, these do not actually host any thermal pads. We don’t know if Nvidia’s engineers deemed them unnecessary or if its accountants decided they were too expensive. Our measurements will tell.

The cooler’s massive bottom plate sports thermal pads for the voltage converters and memory modules, as well as several of the thermal fleece strips we mentioned previously. Those strips connect other on-board components to the cooler’s bottom plate.

Similar to its other high-end Founders Edition cards, Nvidia uses a vapor chamber for cooling the GPU. It’s attached to the board with four spring bolts.

Board Design & Components

Physically, the first thing you might notice about the PCA is its full complement of voltage regulators. Nvidia’s Titan X (Pascal) had the same layout, but not all of its emplacements were populated. The Quadro P6000, on the other hand, uses this board design. That card’s eight-pin power connector points toward the back, and you can see the holes for it on the 1080 Ti’s PCB.

The opposite holds true for memory: compared to Titan X (Pascal), one of GeForce GTX 1080 Ti’s modules is missing.

A total of 11 Micron MT58K256M321JA-110 GDDR5X are organized around the GP102 processor. They operate at 11 Gb/s data rates, which helps compensate for the missing 32-bit memory controller compared to Titan X. We asked Micron to speculate why Nvidia didn’t use the 12 Gb/s MT58K256M321JA-120 modules advertised in its datasheet, and the company mentioned they aren’t widely available yet, despite appearing in its catalog.

Nvidia sticks with the uP9511 we’ve seen several times now, which makes sense because this PWM controller allows for the concurrent operation of seven phases, as opposed to just 6(+2). The same hardware is used for all seven of the GPU’s power phases, and they’re found on the back of the board.

The voltage converters’ design is interesting in that it’s quite simple: one buck converter, the LM53603, is responsible for the high side, and two (instead of one) Fairchild D424 N-Channel MOSFETs operate on the low side. This setup spreads waste heat over twice the surface area, minimizing hot-spots.

For coils, Nvidia went with encapsulated ferrite chokes (roughly the same quality as Foxconn’s Magic coils). They can be installed by machines and aren’t push-through. Thermally, the back of the board is a good place for them, though we find it interesting that Nvidia doesn’t do more to help cool these components. Stranger still, the capacitors right next to them receive cooling consideration.

The memory gets two power phases run in parallel by a single uP1685. The high side uses the FD424 mentioned above, whereas the low side sports two dual N-Channel Logic Level PowerTrench E6930 MOSFETs in a parallel configuration. Because the two phases are simpler, their coils are correspondingly smaller.

So, what’s the verdict on Nvidia’s improved thermal solution? Based on what we found under the hood, it’d be safer to call this a cooling reconceptualization. Switching out active components and using additional thermal pads to more efficiently move waste heat are the most readily apparent updates. The cooler itself should perform identically to cards we’ve seen in the past.

How We Tested Nvidia’s GeForce GTX 1080 Ti

Nvidia’s latest and greatest will no doubt be found in high-end platforms. Some of these may include Broadwell-E-based systems. However, we’re sticking with our MSI Z170 Gaming M7 motherboard, which was recently upgraded to host a Core i7-7700K CPU. The new processor is complemented by G.Skill’s F4-3000C15Q-16GRR memory kit. Intel’s Skylake architecture remains the company’s most effective per clock cycle, and a stock 4.2 GHz frequency is higher than the models with more cores. Crucial’s MX200 SSD remains, as does the Noctua NH-12S cooler and be quiet! Dark Power Pro 10 850W power supply.

As far as competition goes, the GeForce GTX 1080 Ti is rivaled only by the $1,200 Titan X (Pascal). The only other comparisons that make sense are Nvidia's GeForce GTX 1080, the lower-end 1070, and AMD's flagship Radeon R9 Fury X. We also add a GeForce GTX 980 Ti to the mix to show what the 1080 Ti can do versus its predecessor.

Our benchmark selection now includes Ashes of the Singularity, Battlefield 1, Civilization VI, Doom, Grand Theft Auto V, Hitman, Metro: Last Light, Rise of the Tomb Raider, Tom Clancy’s The Division, Tom Clancy’s Ghost Recon Wildlands, and The Witcher 3. That substantial list drops Battlefield 4 and Project CARS, but adds several others.

The testing methodology we’re using comes from PresentMon: Performance In DirectX, OpenGL, And Vulkan. In short, all of these games are evaluated using a combination of OCAT and our own in-house GUI for PresentMon, with logging via AIDA64. If you want to know more about our charts (particularly the unevenness index), we recommend reading that story.
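For readers who want to reproduce that sort of analysis, here's a minimal sketch of how a PresentMon-style frame-time log can be reduced to headline numbers. It assumes a CSV with a MsBetweenPresents column (which PresentMon writes by default); the file name and the percentile choice here are our own, not part of the methodology described above.

# Minimal reduction of a PresentMon-style frame-time log (assumed CSV layout)
import csv
import statistics

def summarise(path):
    with open(path, newline="") as f:
        frame_times_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    avg_fps = 1000 / statistics.mean(frame_times_ms)
    p99_ms = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99) - 1]
    return avg_fps, p99_ms

fps, p99 = summarise("gtx1080ti_bf1_4k.csv")   # hypothetical log file name
print(f"Average: {fps:.1f} fps, 99th percentile frame time: {p99:.1f} ms")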

All of the numbers you see in today’s piece are fresh, using updated drivers. For Nvidia, we’re using build 378.78. AMD’s card utilizes Crimson ReLive Edition 17.2.1, which was the latest at test time.

Stranger Things 2 latest rumours – release date and trailer

Netflix exclusive Stranger Things was a big hit in 2016, and is set to make a comeback in 2017. Read the latest rumours on the Stranger Things 2 trailer and UK launch date.

Love Stranger Things? Then you’ll be mega-excited about Stranger Things 2 – coming in 2017


Stranger Things, a Netflix exclusive, was one of the hit shows of 2016. So when is Stranger Things 2 coming out? And how can you watch Stranger Things today? We reveal all, including the new Stranger Things Season 2 release date, trailers and episode list. Also see: The 82 best films to watch on Netflix

Stranger Things stars Winona Ryder, David Harbour and Matthew Modine. Netflix describes it thus: “When a young boy vanishes, a small town uncovers a mystery involving secret experiments, terrifying supernatural forces and one strange little girl.”

Read our list of the 10 best sci-fi films

When is the Stranger Things 2 release date? 

Stranger Things 2 release date: Halloween 2017 

Thanks to the above Stranger Things 2 trailer shown as an ad during Super Bowl 2017, we now know that the Stranger Things 2 release date is Halloween 2017. Whether this means season 2 will arrive on 31 October or just near the date is unclear at the moment.

We also know that there will be nine episodes in Stranger Things 2 and you can see the full episode list below.

  • Episode 1 – “Madmax”
  • Episode 2 – “The Boy Who Came Back to Life”
  • Episode 3 – “The Pumpkin Patch”
  • Episode 4 – “The Palace”
  • Episode 5 – “The Storm”
  • Episode 6 – “The Pollywog”
  • Episode 7 – “The Secret Cabin”
  • Episode 8 – “The Brain”
  • Episode 9 – “The Lost Brother”

How to watch Stranger Things 

Stranger Things is a Netflix exclusive, which means you'll either need to subscribe to Netflix or find a friend who already has. Alternatively, if you're prepared to watch the first series quickly enough, you can simply sign up for a month's free trial at Netflix.com.

(Do bear in mind, of course, that if you like Stranger Things you won’t be able to get a second free trial when it returns to Netflix next year.) 

If you do decide to subscribe, one of the great things about Netflix is you can cancel at any time. Netflix charges a monthly subscription, the cheapest of which is £5.99. You can pay extra to enable Netflix streaming on more than one device at a time (you can watch on your laptop, PC, TV, tablet or phone) and to unlock HD and Ultra-HD content. Also see: How to watch US Netflix and How to download Netflix video

The entire first season of Stranger Things (eight episodes) is available to view on Netflix, so simply log in and use the Search function or look under Netflix Originals for Stranger Things, then tap to play. 

Read next: The wonderful Stranger Things poster – and the 80s cult films that inspired it 

Stranger Things 2 trailers

Prior to the Super Bowl 2017 ad (top of page), a teaser for the second season of Stranger Things was released. It mostly featured letters making up the title, but also offered some hints at what will happen.

Also check out this video of the kids from the cast reacting to the Super Bowl advert for Stranger Things 2.

Follow Marie Brewis on Twitter.

Modular Epic Gear Morpha X Mouse Gets Last-Minute Tweaks, On Sale Mid-March

The modular Morpha X gaming mouse from Epic Gear was supposed to hit stores in February. The company bumped availability back to March 14, but in between when we saw the mouse at CES 2017 and now, Epic Gear added a couple of final details.

Primary among those concerns the RGB lighting. As of January, it was unclear how the lighting would be implemented and whether or not it would have a software component to give you customization options. Now we know that Epic Gear will indeed include configuration software for the lighting, as well as for features such as “angle-snapping, lift-off distance, button assignment, DPI, profiles, and USB report rate, just to name a few,” according to the company.

One unique lighting feature is Away-From-Mouse (AFM) ambient lighting. Epic Gear said that when the mouse is motionless for 20 seconds, the lights under the scroll wheel will start glowing; they’ll shut off (or at least return to whatever state you program it to) when you grab it.
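The behaviour Epic Gear describes is essentially a simple idle timer. Here's an illustrative sketch of that logic (our own mock-up of how such firmware might behave, not Epic Gear's actual implementation):

# Illustrative idle-timer logic for the AFM ambient lighting behaviour
# (our own sketch, not Epic Gear firmware).
import time

IDLE_THRESHOLD_S = 20          # mouse motionless this long -> ambient glow
last_motion = time.monotonic()
ambient_on = False

def on_motion_event():
    """Called whenever the sensor reports movement."""
    global last_motion, ambient_on
    last_motion = time.monotonic()
    if ambient_on:
        ambient_on = False     # user grabbed the mouse: restore normal lighting

def tick():
    """Polled periodically by the main loop."""
    global ambient_on
    if not ambient_on and time.monotonic() - last_motion >= IDLE_THRESHOLD_S:
        ambient_on = True      # start the scroll-wheel glow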

Otherwise, the specifications and feature set appear to be unchanged from what we’ve previously seen.

The Morpha X is unique among mice in that you can swap out the sensor for a different one. Epic Gear ships the mouse with an optical sensor cartridge and a laser sensor cartridge, and you can pop in one or the other depending on your preference. It also has modular left and right switches – the EG Orange (medium weight) or EG Purple (heavier) – and, again, you can choose which you prefer and insert the module of your choice. There's also an adjustable 20g weight system (4 x 5g weights).

The package will contain the Morpha X mouse in gray, with a white replacement shell; the two sensor cartridges; the Orange and Purple switch cartridges; a switch puller; the four weights; and documentation. And it comes in the “Iron Box,” a metal box designed to help you keep track of all those parts.

You can pick up a Morpha X from Amazon starting March 14 for $130.

Epic Gear Morpha X Gaming Mouse
Sensor/DPI: 12,000 DPI IR LED (up to 250ips tracking speed, 50G acceleration); 8,200 DPI laser (up to 150ips tracking speed, 30G acceleration)
Ambidextrous: Yes
Switches: Omron; EG Orange (medium) or EG Purple (pro, heavier)
Onboard Storage: Unknown, but likely
Polling Rate: 125-1,000Hz
Lighting: RGB, configurable via software; AFM ambient lighting mode
Buttons: 7 total, 6 programmable (L/R click, 2x DPI buttons, 2x left-side nav buttons, click wheel)
Software: Yes
Cable: 1.8m x-braided with gold connector
Dimensions: 126.5 x 66.5 x 40mm
Weight: 110g without cable or weights; four removable 5g weights included (130g total)
Misc.: "Ultra swift" large PTFE feet; 5 gaming profiles with dedicated LED color assignments; 15 macro sets (configurable in software); lock-down function; AFM ambient lighting mode; adjustable lift-off distance (with auto calibration) and angle snapping
System Requirements: USB port, 50MB free storage space, Windows 7, 8, 10
Warranty: 2 years
Price: $130, March 14, 2017
