Nvidia GeForce GTX 1080 Ti

The Nvidia GeForce GTX 1080 Ti has been arguably the most anticipated graphics card of the new year. The 1080 series is Nvidia’s top-end GPU for gamers, and this year’s iteration even outpaces the Titan X in several regards.

Priced at $699 or £699 (about AU$930) – the same as the pre-discounted Nvidia GeForce GTX 1080 – this card offers stunning performance that’s often equal to the Titan X. Beyond being Nvidia’s most impressive GPU to date, it’s a showcase of how far the company’s Pascal architecture has come in less than a year.

Specifications

The Nvidia GeForce GTX 1080 Ti is packing 3584 CUDA cores, 224 texture units and 88 ROPs. It comes with just a notch less video RAM than the god-like Titan X, but the 1080 Ti’s 11GB complement of GDDR5X VRAM is tuned to a faster 11Gbps – clearly Nvidia is a fan of Spinal Tap – making this Nvidia’s quickest Pascal card.

There’s no question the Nvidia GTX 1080 Ti is a performance beast, running at a base 1480MHz frequency and 1582MHz when boosted.

True, the GTX 1080 boosts to a higher 1733MHz; however, the Ti model runs with more cores and VRAM, boosting performance in both benchmarks and real-world gaming.

Design and cooling

If you’ve seen one of Nvidia’s self-produced Pascal graphics cards, you’ve seen them all. Externally, the original GeForce GTX 1080 and Nvidia’s latest Ti (or ‘tie’, as the company pronounces it) card are virtually indistinguishable.

Not that we’re complaining. Nvidia’s design for its Founders Edition cards was a hit when it first debuted and the modern, angular look still appeals. One little change users might notice is the lack of a DVI port; don’t fret though, as the GTX 1080 Ti comes with an adapter you can plug into a DisplayPort.

Getting rid of the DVI port has made more room for a better cooling solution. In fact, Nvidia says its new high airflow thermal solution provides twice the airflow area of the GeForce GTX 1080’s cooling system.

Our testing corroborates those claims. Even at full load, the GTX 1080 Ti stayed at a cooler 75 degrees Celsius, while the GTX 1080 peaked at 82 degrees Celsius. Of course, you can push both cards to the edge of 91 degrees Celsius by adjusting the power limiter and overclocking the GPUs.
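If you want to run a similar thermal check on your own card, Nvidia's nvidia-smi command-line tool reports GPU temperature directly. Here's a minimal sketch that polls it from Python; it assumes nvidia-smi is installed and on your PATH, and the one-minute sampling window is arbitrary – run your game or stress test alongside it.

```python
# Poll GPU temperature once a second via nvidia-smi and track the peak.
# Assumes nvidia-smi is installed and on the PATH.
import subprocess
import time

def gpu_temperature_c() -> int:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip().splitlines()[0])  # first GPU only

peak = 0
for _ in range(60):                     # sample for one minute
    temp = gpu_temperature_c()
    peak = max(peak, temp)
    print(f"current: {temp}C  peak: {peak}C")
    time.sleep(1)
```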

Performance

4K gaming at 60fps with a single graphics card has long been the promised land for gamers, and the Nvidia GTX 1080 Ti is the closest we’ve come to it. Getting a silky-smooth gaming experience in Ultra HD is highly dependent on which games you play, however.

We were able to achieve frame rates in the 50 to 60fps range with games like Battlefield 1 and Doom. That’s no easy feat – but these are also two of the most optimized games in existence right now.

Likewise, Nvidia released a DirectX 12-optimized Game Ready Driver that helped us run Rise of the Tomb Raider at a solid 40fps – double the 20fps we previously saw on the GTX 1080, though not quite matching the Titan X’s 57fps.

Those are best-case scenarios, though, and you shouldn’t think the Nvidia GTX 1080 Ti is a bulletproof solution for 4K gaming. Total War: Attila, for one, brought the Titanium-powered GPU to its knees as it struggled to render the game at a barely playable 26fps.

Moving on to pure performance testing, the GTX 1080 Ti completely demolishes its forebears in the 3DMark Fire Strike Ultra benchmark, beating them by 2,000 to 4,000 points across the board. More impressively, this ultimate GeForce skips ahead of the Titan X too.

The most mind-blowing bit is that the Nvidia GTX 1080 Ti does all of this without any overclocking. Nvidia claims the card can be pushed as far as 2GHz; we haven’t quite pushed it that far yet, but we have achieved a stably running system in the 1.7-1.8GHz range.

Final verdict

From top to bottom, this is Nvidia’s most impressive graphics card to date. It’s remarkably more powerful than the original GTX 1080 while matching the Titan X’s gaming performance. You’re also looking at one of Nvidia’s coolest-running cards, with overclocking potential for days.

If you’ve been pining for Nvidia’s top GPU, but couldn’t stomach the $1,200 (£1,099, AU$1,599) price tag, the Nvidia GTX 1080 Ti looks much more appetizing at $699 or £699 (about AU$930).


ADATA XPG SX8000 M.2 NVMe SSD review: A more affordable NVMe option

M.2 NVMe SSDs such as ADATA’s XPG SX8000 are a game-changer for PCs. There is simply no other upgrade that offers as dramatic an improvement to the feel and response of your system. If you’re moving from a hard drive, you’ll be astounded. If you’re moving from a SATA SSD, you’ll still be extremely pleased.

Specs and pricing

The SX8000 is a four-lane PCIe 3.0 (roughly 1GBps per lane) M.2 2280 (22mm wide, 80mm long) SSD using 3D (layered) TLC NAND technology. To mitigate the relatively slow writes of TLC (triple-level cell/three-bit) NAND, a DRAM cache is employed, as well as some of the TLC treated as SLC for a secondary cache.
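To see why that secondary cache matters, here's a toy model of a TLC drive with an SLC-mode write cache. The cache size and throughput figures are made-up round numbers for illustration, not measured SX8000 specs: small bursts land entirely in the fast cache, while long sustained writes spill over to native TLC speed.

```python
# Toy model of how an SLC write cache masks slower TLC NAND.
# All figures are illustrative assumptions, not SX8000 measurements.
SLC_CACHE_GB = 4        # hypothetical fast-cache capacity
SLC_SPEED_MBS = 1000    # MB/s while writes land in the cache
TLC_SPEED_MBS = 450     # MB/s once the cache is exhausted

def write_time_seconds(write_gb: float) -> float:
    cached = min(write_gb, SLC_CACHE_GB)          # portion absorbed by cache
    direct = max(0.0, write_gb - SLC_CACHE_GB)    # portion hitting raw TLC
    return (cached * 1024) / SLC_SPEED_MBS + (direct * 1024) / TLC_SPEED_MBS

for size_gb in (1, 4, 20):
    print(f"{size_gb:>3} GB write: {write_time_seconds(size_gb):6.1f} s")
```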


Stacked/layered/3D NAND is used for storage in the SX8000, but there’s cache to keep performance brisk. Note: The digital removal of chip markings was done by ADATA, not PCWorld.

ADATA quoted us prices of $76 for the 128GB version, $110 for the 256GB version, and $208 for the 512GB version. The prices we saw on Amazon were a bit higher (with the 512GB listed at $250, for example), but still significantly lower than anything other than the Samsung 960 EVO, a drive we haven’t tested yet. A 1TB version of the SX8000 was mentioned in the PR materials, but hasn’t materialized yet.

Performance

The XPG SX8000 performs more on par with OCZ’s RD400 and Plextor’s M8Pe than with Samsung’s killer 960 Pro. But even the slowest NVMe SSD is roughly two to three times the speed of a SATA SSD, so that’s no big knock. Installed in a system, it’s difficult to tell NVMe SSDs apart—they’re so fast, only the benchmarks reveal the difference.


While it can’t match the Samsung 960 Pro in sustained throughput, the SX8000 turned in very competitive numbers with sets of small files and folders. Shorter bars are better.


Though not as fast as the OCZ RD400, the ADATA SX8000 is still very fast, and significantly less expensive. Larger bars are better.

The charts above reiterate the fact that the SX8000 isn’t one of the faster NVMe SSDs we’ve tested. But the fact is, it’s still very speedy and provides the same startling subjective increase in performance the others do. With an NVMe SSD on board, disk I/O basically ceases to be an issue.

Note that we also ran AS SSD, which showed the SX8000 performing about the same as in CrystalDiskMark. This generally means that the drive is ignoring the FUA (Forced Unit Access) command that AS SSD issues, which disables all caching. As it’s a matter of a few tens of milliseconds before caches are emptied, there’s little chance of data loss. The FUA issue is the reason we don’t quote AS SSD scores for NVMe drives, which, unlike SATA SSDs, often ignore the command.
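For readers curious what "forcing" a write actually looks like from software, the closest portable analog to FUA that an ordinary application can issue is a flush followed by fsync. This is a generic sketch of the concept, not anything AS SSD does internally:

```python
# Conceptual analog of FUA: rather than letting a write sit in a
# volatile cache, ask that it be committed to stable storage before
# continuing. A drive that ignores the request (as many NVMe drives
# ignore FUA) acknowledges quickly and empties its cache on its own
# a few tens of milliseconds later.
import os

with open("journal.log", "wb") as f:
    f.write(b"critical record\n")
    f.flush()              # move Python's buffer into the OS page cache
    os.fsync(f.fileno())   # ask the OS (and drive) to commit to media
```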

Conclusion

Not everyone can use an NVMe SSD. First off, you need an M.2 slot with four PCIe lanes, or a spare PCIe slot and an M.2 adapter card. Secondly, to get the most out of the drive you’ll want to run your operating system on it, so you must have a system that recognizes the drive and can boot from it.

That said, while the XPG SX8000 isn’t the fastest NVMe drive we’ve tested, it is affordable and absolutely fast enough to give you that NVMe thrill. It’s also warrantied for five years or 80TBW (terabytes written) per 128GB of capacity, whichever comes first, which is comforting. Most users never come close to writing that much data.

Nvidia’s new GeForce driver delivers a huge boost to DX12 games

Alongside the release of its new GeForce GTX 1080 Ti, Nvidia is unleashing a fresh graphics card driver which promises a performance boost in DirectX 12 games.

Those running DX12 games (under Windows 10) will benefit from driver optimizations which, according to Nvidia, will deliver an average performance boost of 16% across various titles.

The biggest gains are to be seen in Rise of the Tomb Raider, with a rather incredible 33% boost to the frame rate, and Nvidia also boasts that Hitman will get a similarly chunky 23% improvement.

Gears of War 4 will be boosted to the tune of 10%, Ashes of the Singularity by 9%, and Tom Clancy’s The Division will get a more modest increase of 4%. Still, every extra bit of smoothness is welcome, as ever.
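To put those percentages in frame-rate terms, here's a quick sketch. Only the uplift figures come from Nvidia; the 60fps baseline is purely hypothetical, since actual frame rates vary by game and settings.

```python
# Nvidia's quoted DX12 driver uplifts applied to a hypothetical
# 60fps baseline (illustrative only).
gains = {
    "Rise of the Tomb Raider": 0.33,
    "Hitman": 0.23,
    "Gears of War 4": 0.10,
    "Ashes of the Singularity": 0.09,
    "Tom Clancy's The Division": 0.04,
}
baseline_fps = 60  # assumed, not measured
for game, gain in gains.items():
    print(f"{game}: {baseline_fps} -> {baseline_fps * (1 + gain):.0f} fps")
```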

Ansel antics

Better performance is the key point with this new driver, but those who like to take screenshots of their games will also be interested to learn that Nvidia Ansel support has arrived for another Tom Clancy game, namely Ghost Recon Wildlands.

Ansel is essentially a screen-grabber on steroids, pausing the game at the moment you wish to capture, and then allowing you to enter the game-world and move the camera around in 3D, zoom or reposition it, and get rid of the HUD or any other interface distractions to hopefully procure yourself a cracking image.

The system also makes it possible to save out super-high-resolution screenshots, and to polish them up with post-processing effects – plus you can capture 360-degree panoramic shots to gawk at using a VR headset, should you own one.

Ansel is currently supported in the likes of Dishonored 2, Mirror’s Edge Catalyst, Watch Dogs 2, The Witcher 3: Wild Hunt and The Witness, and it’ll be coming to other big titles, most notably Mass Effect: Andromeda (which is out in a couple of weeks).

Speaking of the latter, yesterday saw Nvidia show off some rather spectacular-looking 4K HDR screenshots taken with Ansel, which you might want to have a gander at here.


Nvidia GeForce GTX 1080 Ti 11GB Review

Nvidia’s GeForce GTX 1080 Ti is now the fastest graphics card available, and it’s $500 cheaper than the previous champ! Should you buy now, or wait for AMD’s Vega?

Nobody was surprised when Nvidia introduced its GeForce GTX 1080 Ti at this year’s Game Developers Conference. What really got gamers buzzing was the card’s $700 price tag.

Based on its specifications, GeForce GTX 1080 Ti should be every bit as fast as Titan X (Pascal), or even a bit quicker. So why shave off so much of the flagship’s premium? We don’t really have a great answer, except that Nvidia must be anticipating AMD’s Radeon RX Vega and laying the groundwork for a battle at the high-end.

Why now? Because GeForce GTX 1080 Ti is ready today, Nvidia tells us. And because Vega is not, we’d snarkily add.

Turning A Zero Into A Hero

There are currently two graphics cards based on Nvidia’s GP102 processor: Titan X and Quadro P6000. The former uses a version of the GPU with two of its Streaming Multiprocessors disabled, while the latter employs a pristine GP102, without any defects at all.

We’re talking about a 12 billion transistor chip, though. Surely yields aren’t so good that they all bin into one of those two categories, right? Enter GeForce GTX 1080 Ti.

The 1080 Ti employs a similar Streaming Multiprocessor configuration as Titan X—28 of its 30 SMs are enabled, yielding 3584 CUDA cores and 224 texture units. Nvidia pushes the processor’s base clock rate up to 1480 MHz and cites a typical GPU Boost frequency of 1582 MHz. In comparison, Titan X runs at 1417 MHz and has a Boost spec of 1531 MHz. 

Where the new GeForce differs is its back-end. Both Titan X and Quadro P6000 utilize all 12 of GP102’s 32-bit memory controllers, ROP clusters, and slices of L2 cache. This leaves no room for the foundry to make a mistake. Rather than tossing the imperfect GPUs, then, Nvidia turns them into 1080 Tis by disabling one memory controller, one ROP partition, and 256KB of L2. The result looks a little wonky on a spec sheet, but it’s perfectly viable nonetheless. As such, we get a card with an aggregate 352-bit memory interface, 88 ROPs, and 2816KB of L2 cache, down from Titan X’s 384-bit path, 96 ROPs, and 3MB L2.

Left alone, that’d put GeForce GTX 1080 Ti at a slight disadvantage. But in the months between 1080’s launch and now, Micron introduced 11 Gb/s (and 12 Gb/s, according to its datasheet) GDDR5X memories. The higher data rate more than compensates for the narrower memory bus: on paper, GeForce GTX 1080 Ti offers a theoretical 484 GB/s to Titan X’s 480 GB/s.
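The back-of-the-envelope math behind those figures is straightforward: 11 of GP102's 12 memory controllers, ROP clusters, and L2 slices remain active, and bandwidth is bus width in bytes multiplied by the per-pin data rate. A quick sketch:

```python
# Paper math behind GP102's cut-down back-end and the bandwidth claim.
controllers = 11                        # 12 on Titan X; one disabled here
bus_bits = controllers * 32             # -> 352-bit aggregate interface
rops = controllers * 8                  # -> 88 ROPs
l2_kb = controllers * 256               # -> 2816KB of L2

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin  # bytes per transfer x data rate

print(bus_bits, rops, l2_kb)            # 352 88 2816
print(bandwidth_gbs(352, 11.0))         # GTX 1080 Ti: 484.0 GB/s
print(bandwidth_gbs(384, 10.0))         # Titan X:     480.0 GB/s
```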

Of course, eliminating one memory channel affects the card’s capacity. Stepping down from 12GB to 11GB isn’t particularly alarming when we’re testing against a 4GB Radeon R9 Fury X that works just fine at 4K, though. Losing capacity is also preferable to repeating the problem Nvidia had with GeForce GTX 970, where it removed an ROP/L2 partition, but kept the memory, causing slower access to the orphaned 512MB segment. In this case, all 11GB of GDDR5X communicates at full speed.


Meet The GeForce GTX 1080 Ti Founders Edition

During its presentation, Nvidia announced that its Founders Edition cooler was improved compared to Titan X’s. Looking at this card head-on though, you wouldn’t know it. It looks identical except for the model name. The combination of materials (namely cast aluminum and acrylic) is also the same, as is the commanding presence of its 62mm radial cooler.

Nvidia’s reference GeForce GTX 1080 Ti is the same size as its Titan X (Pascal). The distance from the slot cover to the end of the cooler spans 26.9cm. From the top of the motherboard slot to the top of the cooler, the card stands 10.5cm tall. And with a depth of 3.5cm, it fits nicely in a dual-slot form factor. We weighed the 1080 Ti and found that, at 1039g, it’s a little heavier than the Titan X.

The top of the card looks just as familiar as its front, sporting a green back-lit logo, along with one eight- and one six-pin power connector. The bottom is even less interesting; there’s really nothing to say about its plain cover.

Some hot air may escape into the case through the open vent at the end of the card, but the way Nvidia designed its thermal solution ensures most of the waste heat exhausts out through the slot bracket.

Nvidia improves airflow through the cooler by removing its DVI output. You get three DisplayPort connectors and one HDMI 2.0 port, while a bundled DP-to-DVI dongle covers anyone still using that interface.

Cooler Design

We had to dig deep in our tool box because Nvidia primarily uses thin 0.5mm screws, which fit into the mating threads of special piggyback screws that sit below the backplate. These uncommon M2.5 hex bolts also attach the card’s cover to its circuit board.

One improvement to the GeForce GTX 1080 Ti became apparent as we started taking the card apart: Nvidia mated the PWM controller on the back of its PCA with part of the backplate using thick thermal fleece. This is a material we don’t see used very often, and it’s meant to augment heat dissipation. The move would have been even more effective if Nvidia cut a hole into the plastic sheet covering the backplate in this area.

The exposed board reveals two areas on the left labeled THERMAL PAD 1 and THERMAL PAD 2. However, these do not actually host any thermal pads. We don’t know if Nvidia’s engineers deemed them unnecessary or if its accountants decided they were too expensive. Our measurements will tell.

The cooler’s massive bottom plate sports thermal pads for the voltage converters and memory modules, as well as several of the thermal fleece strips we mentioned previously. Those strips connect other on-board components to the cooler’s bottom plate.

Similar to its other high-end Founders Edition cards, Nvidia uses a vapor chamber for cooling the GPU. It’s attached to the board with four spring bolts.

Board Design & Components

Physically, the first thing you might notice about the PCA is its full complement of voltage regulators. Nvidia’s Titan X (Pascal) had the same layout, but not all of its emplacements were populated. The Quadro P6000, on the other hand, uses this board design. That card’s eight-pin power connector points toward the back, and you can see the holes for it on the 1080 Ti’s PCB.

The opposite holds true for memory: compared to Titan X (Pascal), one of GeForce GTX 1080 Ti’s modules is missing.

A total of 11 Micron MT58K256M321JA-110 GDDR5X modules are arranged around the GP102 processor. They operate at 11 Gb/s data rates, which helps compensate for the missing 32-bit memory controller compared to Titan X. We asked Micron why Nvidia didn’t use the 12 Gb/s MT58K256M321JA-120 modules advertised in its datasheet, and the company said they aren’t widely available yet, despite appearing in its catalog.

Nvidia sticks with the uP9511 we’ve seen several times now, which makes sense because this PWM controller allows for the concurrent operation of seven phases, as opposed to just 6(+2). The same hardware is used for all seven of the GPU’s power phases, and they’re found on the back of the board.

The voltage converters’ design is interesting in that it’s quite simple: one buck converter, the LM53603, is responsible for the high side, and two (instead of one) Fairchild FD424 N-Channel MOSFETs operate on the low side. This setup spreads waste heat over twice the surface area, minimizing hot-spots.

For coils, Nvidia went with encapsulated ferrite chokes (roughly the same quality as Foxconn’s Magic coils). They can be installed by machines and aren’t push-through. Thermally, the back of the board is a good place for them, though we find it interesting that Nvidia doesn’t do more to help cool these components. Stranger still, the capacitors right next to them receive cooling consideration.

The memory gets two power phases run in parallel by a single uP1685. The high side uses the FD424 mentioned above, whereas the low side sports two dual N-Channel Logic Level PowerTrench E6930 MOSFETs in a parallel configuration. Because the two phases are simpler, their coils are correspondingly smaller.

So, what’s the verdict on Nvidia’s improved thermal solution? Based on what we found under the hood, it’d be safer to call this a cooling reconceptualization. Switching out active components and using additional thermal pads to more efficiently move waste heat are the most readily apparent updates. The cooler itself should perform identically to cards we’ve seen in the past.


How We Tested Nvidia’s GeForce GTX 1080 Ti

Nvidia’s latest and greatest will no doubt be found in high-end platforms, some of which may be Broadwell-E-based. However, we’re sticking with our MSI Z170 Gaming M7 motherboard, recently upgraded to host a Core i7-7700K CPU. The new processor is complemented by G.Skill’s F4-3000C15Q-16GRR memory kit. Intel’s Skylake architecture remains the company’s most effective per clock cycle, and the stock 4.2 GHz frequency is higher than that of the models with more cores. Crucial’s MX200 SSD remains, as does the Noctua NH-U12S cooler and be quiet! Dark Power Pro 10 850W power supply.

As far as competition goes, the GeForce GTX 1080 Ti is rivaled only by the $1,200 Titan X (Pascal). The only other comparisons that make sense are Nvidia’s GeForce GTX 1080, the lower-end 1070, and AMD’s flagship Radeon R9 Fury X. We also add a GeForce GTX 980 Ti to the mix to show what the 1080 Ti can do versus its predecessor.

Our benchmark selection now includes Ashes of the Singularity, Battlefield 1, Civilization VI, Doom, Grand Theft Auto V, Hitman, Metro: Last Light, Rise of the Tomb Raider, Tom Clancy’s The Division, Tom Clancy’s Ghost Recon Wildlands, and The Witcher 3. That substantial list drops Battlefield 4 and Project CARS, but adds several others.

The testing methodology we’re using comes from PresentMon: Performance In DirectX, OpenGL, And Vulkan. In short, all of these games are evaluated using a combination of OCAT and our own in-house GUI for PresentMon, with logging via AIDA64. If you want to know more about our charts (particularly the unevenness index), we recommend reading that story.
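For readers who want to crunch similar logs themselves, PresentMon's CSV output includes a per-frame MsBetweenPresents column. Here's a minimal sketch of the kind of post-processing involved; the file name is hypothetical, and this is not our in-house tool's actual code.

```python
# Compute average fps and a 99th-percentile frame time from a
# PresentMon/OCAT CSV log (standard MsBetweenPresents column).
import csv
import statistics

with open("presentmon_log.csv", newline="") as f:
    frame_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

avg_fps = 1000.0 / statistics.mean(frame_ms)
p99_ms = sorted(frame_ms)[min(len(frame_ms) - 1, int(len(frame_ms) * 0.99))]
print(f"average fps: {avg_fps:.1f}")
print(f"99th-percentile frame time: {p99_ms:.2f} ms")
```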

All of the numbers you see in today’s piece are fresh, using updated drivers. For Nvidia, we’re using build 378.78. AMD’s card utilizes Crimson ReLive Edition 17.2.1, which was the latest at test time.


Stranger Things 2 latest rumours – release date and trailer

Netflix exclusive Stranger Things was a big hit in 2016, and is set to make a comeback in 2017. Read the latest rumours on the Stranger Things 2 trailer and UK launch date.



By Marie Brewis

Stranger Things, a Netflix exclusive, was one of the hit shows of 2016. So when is Stranger Things 2 coming out? And how can you watch Stranger Things today? We reveal all, including the new Stranger Things Season 2 release date, trailers and episode list. Also see: The 82 best films to watch on Netflix

Stranger Things stars Winona Ryder, David Harbour and Matthew Modine. Netflix describes it thus: “When a young boy vanishes, a small town uncovers a mystery involving secret experiments, terrifying supernatural forces and one strange little girl.”

Read our list of the 10 best sci-fi films

When is the Stranger Things 2 release date? 

Stranger Things 2 release date: Halloween 2017 

Thanks to the Stranger Things 2 trailer shown as an ad during Super Bowl 2017, we now know that the Stranger Things 2 release date is Halloween 2017. Whether this means season 2 will arrive on 31 October or just around that date is unclear at the moment.

We also know that there will be nine episodes in Stranger Things 2 and you can see the full episode list below.

  • Episode 1 – “Madmax”
  • Episode 2 – “The Boy Who Came Back to Life”
  • Episode 3 – “The Pumpkin Patch”
  • Episode 4 – “The Palace”
  • Episode 5 – “The Storm”
  • Episode 6 – “The Pollywog”
  • Episode 7 – “The Secret Cabin”
  • Episode 8 – “The Brain”
  • Episode 9 – “The Lost Brother”

How to watch Stranger Things 

Stranger Things is a Netflix exclusive, which means you’ll either need to subscribe to Netflix or find a friend who already has. Alternatively, if you’re prepared to watch the first series quickly enough, you can simply sign up for a month’s free trial at Netflix.com

(Do bear in mind, of course, that if you like Stranger Things you won’t be able to get a second free trial when it returns to Netflix later this year.)

If you do decide to subscribe, one of the great things about Netflix is you can cancel at any time. Netflix charges a monthly subscription, the cheapest of which is £5.99. You can pay extra to enable Netflix streaming on more than one device at a time (you can watch on your laptop, PC, TV, tablet or phone) and to unlock HD and Ultra-HD content. Also see: How to watch US Netflix and How to download Netflix video

The entire first season of Stranger Things (eight episodes) is available to view on Netflix, so simply log in and use the Search function or look under Netflix Originals for Stranger Things, then tap to play. 

Read next: The wonderful Stranger Things poster – and the 80s cult films that inspired it 

Stranger Things 2 trailers

Prior to the Super Bowl 2017 ad, a teaser for the second season of Stranger Things was released. It mostly featured letters making up the title, but also offered some hints at what will happen.

Also check out this video of the kids from the cast reacting to the Super Bowl advert for Stranger Things 2.

Follow Marie Brewis on Twitter.


Modular Epic Gear Morpha X Mouse Gets Last-Minute Tweaks, On Sale Mid-March

The modular Morpha X gaming mouse from Epic Gear was supposed to hit stores in February. The company bumped availability back to March 14, but in between when we saw the mouse at CES 2017 and now, Epic Gear added a couple of final details.

Chief among those details is the RGB lighting. As of January, it was unclear how the lighting would be implemented and whether or not it would have a software component to give you customization options. Now we know that Epic Gear will indeed include configuration software for the lighting, as well as for features such as “angle-snapping, lift-off distance, button assignment, DPI, profiles, and USB report rate, just to name a few,” according to the company.

One unique lighting feature is Away-From-Mouse (AFM) ambient lighting. Epic Gear said that when the mouse is motionless for 20 seconds, the lights under the scroll wheel will start glowing; they’ll shut off (or at least return to whatever state you’ve programmed) when you grab the mouse.

Otherwise, the specifications and feature set appear to be unchanged from what we’ve previously seen.

The Morpha X is unique among mice in that you can swap out the sensor for a different one. Epic Gear ships the mouse with an optical sensor cartridge and a laser sensor cartridge, and you can pop in one or the other depending on your preference. It also has modular left and right switches – the EG Orange (medium weight) or EG Purple (heavier) – and, again, you can choose which you prefer and insert the module of your choice. It also has an adjustable 20g weight system (4 x 5g weights).

The package will contain the Morpha X mouse in gray, with a white replacement shell; the two sensor cartridges; the Orange and Purple switch cartridges; a switch puller; the four weights; and documentation. And it comes in the “Iron Box,” a metal box designed to help you keep track of all those parts.

You can pick up a Morpha X from Amazon starting March 14 for $130.

Epic Gear Morpha X Gaming Mouse – specifications

  • Sensor/DPI: 12,000 DPI IR LED (up to 250ips tracking speed, 50G acceleration); 8,200 DPI laser (up to 150ips tracking speed, 30G acceleration)
  • Ambidextrous: Yes
  • Switches: Omron – EG Orange (medium), EG Purple (pro, heavier)
  • Onboard storage: Unknown, but likely
  • Polling rate: 125-1,000Hz
  • Lighting: RGB, configurable via software; AFM ambient lighting mode
  • Buttons: 7 total, 6 programmable (left/right click, 2x DPI buttons, 2x left-side nav buttons, click wheel)
  • Software: Yes
  • Cable: 1.8m X-braided with gold connector
  • Dimensions: 126.5 x 66.5 x 40mm
  • Weight: 110g without cable or weights; includes four removable 5g weights (130g total)
  • Misc: “Ultra swift” large PTFE feet; 5 gaming profiles with dedicated LED color assignments; 15 macro sets (configurable in software); lock-down function; AFM ambient lighting mode; adjustable lift-off distance (with auto calibration) and angle snapping
  • System requirements: USB port; 50MB free storage space; Windows 7, 8, or 10
  • Warranty: 2 years
  • Price: $130, on sale March 14, 2017


IBM stores data on a single atom

What good is a single atom these days?

Well, aside from being essential for, I dunno, most everything, you can now store data on one. That’s right: store data on a single atom. But how did researchers achieve that?

In IT Blogwatch, we jump on the miniaturization bandwagon.

What is going on? Mike Wehner has some background:

IBM…announced…that it…successfully managed to store data on a single atom,…an achievement that could potentially change the way storage devices are developed in the future…modern hard drives utilize roughly 100,000 atoms to store a single bit, so shrinking things down to the size of just one atom is obviously a massive achievement.

Remind us, what exactly is a bit of data again? Michael Irving has that info:

For those who don’t pay…attention to the wizardry going on inside their computer, hard disk drives store data magnetically, as a series of tiny magnetic dots on a sheet of metal. Each dot represents one bit of data: a demagnetized dot represents a zero…if it’s magnetized, it’s a one.

And they managed to get that on a single atom? How did they even do that? Mike Murphy is in the know:

IBM’s researchers found a way to magnetize individual atoms of the rare earth element holmium and use the two poles of magnetism…as stand-ins for the 1s and 0s. The holmium atoms are attached to a surface of…magnesium oxide, which holds them in place, at a chilly 5 kelvin (-450°F). Using essentially what is a very…small needle, the researchers can pass an electrical current through the holmium atoms, which causes their north and south poles to flip, replicating the process of writing information to a traditional magnetic hard drive. The atoms stay in whatever state they’ve been flipped into, and by measuring the magnetism of the atoms at a later point, the scientists can see what state the atom is, mirroring the way a computer reads information it’s stored on a hard drive…IBM says the researchers used a single iron atom to measure the magnetic field of the holmium atoms.
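To make that write/read cycle concrete, here's a purely illustrative toy model of the process described above: one object stands in for a holmium atom whose magnetic orientation encodes a bit. None of this reflects IBM's actual instrumentation, of course.

```python
# Toy model: a holmium atom's two magnetic pole orientations stand in
# for 1 and 0. "Writing" models the current pulse that flips the poles;
# "reading" models sensing the magnetic state (IBM used an iron atom).
class AtomBit:
    def __init__(self) -> None:
        self.north_up = False          # arbitrary starting orientation = 0

    def write(self, bit: int) -> None:
        self.north_up = bool(bit)      # pulse flips the atom into this state

    def read(self) -> int:
        return int(self.north_up)      # atom holds its state until flipped

atoms = [AtomBit() for _ in range(8)]  # one atom per bit of a byte
for atom, bit in zip(atoms, [0, 1, 0, 0, 0, 0, 0, 1]):
    atom.write(bit)
print([a.read() for a in atoms])       # -> [0, 1, 0, 0, 0, 0, 0, 1]
```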

What does this mean for the future? Tas Bindi tells us:

IBM…demonstrated that two magnetic atoms could be written and read independently even when they were separated by just 1 nanometre, which could culminate in a magnetic storage system…1,000 times denser than today’s hard disk drives and solid state memory chips. Additionally…such a system could…store significantly more data which could pave the way for smaller datacentres, computers, and mobile devices.

So is this something we are going to start seeing around? Stephen Lawson can answer that:

Don’t expect to see a phone the size of your little finger anytime soon. This project is pure research…For one thing, their experiment required conditions that aren’t practical for most devices. It needed an ultra-high vacuum, low vibration, and liquid helium for a super-low temperature.

The team just wanted to achieve the maximum possible density…Now researchers can use what IBM learned to develop new high-density storage that works outside a lab, probably using a small number of atoms that can help each other remain stable at room temperature.

So what are people saying about all this? Darryn Ten sums it up nicely:

Oh my that’s impressive.

