Nvidia’s new GeForce driver delivers a huge boost to DX12 games

Alongside the release of its new GeForce GTX 1080 Ti, Nvidia is unleashing a fresh graphics card driver which promises a performance boost in DirectX 12 games.

Those running DX12 games under Windows 10 will benefit from driver optimizations which, according to Nvidia, deliver an average performance boost of 16% across these titles.

The biggest gains are to be seen in Rise of the Tomb Raider, with a rather incredible 33% boost to the frame rate, and Nvidia also boasts that Hitman will get a similarly chunky 23% improvement.

Gears of War 4 will be boosted to the tune of 10%, Ashes of the Singularity by 9%, and Tom Clancy’s The Division will get a more modest increase of 4%. Still, every extra bit of smoothness is welcome, as ever.

Ansel antics

Better performance is the key point with this new driver, but those who like to take screenshots of their games will also be interested to learn that Nvidia Ansel support has arrived for another Tom Clancy game, namely Ghost Recon Wildlands.

Ansel (pictured above) is essentially a screen-grabber on steroids. It pauses the game at the moment you wish to capture, then lets you enter the game-world and move the camera around in 3D, zoom or reposition it, and remove the HUD or any other interface distractions to hopefully procure yourself a cracking image.

The system also makes it possible to save out super-high-resolution screenshots, and to polish them up with post-processing effects – plus you can capture 360-degree panoramic shots to gawk at using a VR headset, should you own one.

Ansel is currently supported in the likes of Dishonored 2, Mirror’s Edge Catalyst, Watch Dogs 2, The Witcher 3: Wild Hunt and The Witness, and it’ll be coming to other big titles, most notably in the near-future Mass Effect: Andromeda (which is out in a couple of weeks).

Speaking of the latter, yesterday saw Nvidia show off some rather spectacular-looking 4K HDR screenshots taken with Ansel, which you might want to have a gander at here.

Go to Source

Nvidia GeForce GTX 1080 Ti 11GB Review

Nvidia’s GeForce GTX 1080 Ti is now the fastest graphics card available, and it’s $500 cheaper than the previous champ! Should you buy now, or wait for AMD’s Vega?

Nobody was surprised when Nvidia introduced its GeForce GTX 1080 Ti at this year’s Game Developers Conference. What really got gamers buzzing was the card’s $700 price tag.

Based on its specifications, GeForce GTX 1080 Ti should be every bit as fast as Titan X (Pascal), or even a bit quicker. So why shave off so much of the flagship’s premium? We don’t really have a great answer, except that Nvidia must be anticipating AMD’s Radeon RX Vega and laying the groundwork for a battle at the high-end.

Why now? Because GeForce GTX 1080 Ti is ready today, Nvidia tells us. And because Vega is not, we’d snarkily add.

Turning A Zero Into A Hero

There are currently two graphics cards based on Nvidia’s GP102 processor: Titan X and Quadro P6000. The former uses a version of the GPU with two of its Streaming Multiprocessors disabled, while the latter employs a pristine GP102, without any defects at all.

We’re talking about a 12 billion transistor chip, though. Surely yields aren’t so good that they all bin into one of those two categories, right? Enter GeForce GTX 1080 Ti.

The 1080 Ti employs a Streaming Multiprocessor configuration similar to Titan X’s—28 of its 30 SMs are enabled, yielding 3584 CUDA cores and 224 texture units. Nvidia pushes the processor’s base clock rate up to 1480 MHz and cites a typical GPU Boost frequency of 1582 MHz. In comparison, Titan X runs at 1417 MHz and has a Boost spec of 1531 MHz.

Where the new GeForce differs is its back-end. Both Titan X and Quadro P6000 utilize all 12 of GP102’s 32-bit memory controllers, ROP clusters, and slices of L2 cache. This leaves no room for the foundry to make a mistake. Rather than tossing the imperfect GPUs, then, Nvidia turns them into 1080 Tis by disabling one memory controller, one ROP partition, and 256KB of L2. The result looks a little wonky on a spec sheet, but it’s perfectly viable nonetheless. As such, we get a card with an aggregate 352-bit memory interface, 88 ROPs, and 2816KB of L2 cache, down from Titan X’s 384-bit path, 96 ROPs, and 3MB L2.
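
The cut-down figures fall straight out of GP102’s partition layout. Here’s a quick sketch of the arithmetic; the per-partition counts (32 bits, 8 ROPs, 256KB of L2) are derived from Titan X’s totals of a 384-bit bus, 96 ROPs, and 3MB of L2 across 12 partitions:

```python
# Each of GP102's 12 back-end partitions pairs a 32-bit memory controller
# with 8 ROPs and 256KB of L2 (per-partition figures derived from Titan X's
# totals: 384-bit bus / 96 ROPs / 3MB of L2 across 12 partitions).
FULL_PARTITIONS = 12
ENABLED = FULL_PARTITIONS - 1      # one partition fused off on the 1080 Ti

bus_width_bits = ENABLED * 32      # aggregate memory interface
rops = ENABLED * 8                 # render output units
l2_kb = ENABLED * 256              # L2 cache in KB

print(bus_width_bits, rops, l2_kb)  # 352 88 2816
```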

Left alone, that’d put GeForce GTX 1080 Ti at a slight disadvantage. But in the months between 1080’s launch and now, Micron introduced 11 Gb/s (and 12 Gb/s, according to its datasheet) GDDR5X memories. The higher data rate more than compensates for the narrower memory bus: on paper, GeForce GTX 1080 Ti offers a theoretical 484 GB/s to Titan X’s 480 GB/s.
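
Those on-paper figures are easy to verify: peak bandwidth is the bus width in bytes multiplied by the per-pin data rate. A minimal sketch (Titan X’s 10 Gb/s rate is inferred here from its quoted 480 GB/s over a 384-bit bus):

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(352, 11))  # GeForce GTX 1080 Ti: 484.0
print(peak_bandwidth_gbs(384, 10))  # Titan X (Pascal):    480.0
```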

Of course, eliminating one memory channel affects the card’s capacity. Stepping down from 12GB to 11GB isn’t particularly alarming when we’re testing against a 4GB Radeon R9 Fury X that works just fine at 4K, though. Losing capacity is also preferable to repeating the problem Nvidia had with GeForce GTX 970, where it removed an ROP/L2 partition, but kept the memory, causing slower access to the orphaned 512MB segment. In this case, all 11GB of GDDR5X communicates at full speed.


MORE: Best Graphics Cards

MORE: Desktop GPU Performance Hierarchy Table

MORE: All Graphics Content

Meet The GeForce GTX 1080 Ti Founders Edition

During its presentation, Nvidia announced that its Founders Edition cooler was improved compared to Titan X’s. Looking at this card head-on, though, you wouldn’t know it: it looks identical except for the model name. The combination of materials (namely cast aluminum and acrylic) is also the same, as is the commanding presence of its 62mm radial cooler.

Nvidia’s reference GeForce GTX 1080 Ti is the same size as its Titan X (Pascal). The distance from the slot cover to the end of the cooler spans 26.9cm. From the top of the motherboard slot to the top of the cooler, the card stands 10.5cm tall. And with a depth of 3.5cm, it fits nicely in a dual-slot form factor. We weighed the 1080 Ti and found that it’s a little heavier than the Titan X at 1039g.

The top of the card looks just as familiar as its front, sporting a green back-lit logo, along with one eight- and one six-pin power connector. The bottom is even less interesting; there’s really nothing to say about its plain cover.

Some hot air may escape into the case through the open vent at the end of the card, but the way Nvidia designed its thermal solution ensures most of the waste heat is exhausted out the rear bracket.

Nvidia improves airflow through the cooler by removing its DVI output. You get three DisplayPort connectors and one HDMI 2.0 port, while a bundled DP-to-DVI dongle covers anyone still using that interface.

Cooler Design

We had to dig deep in our tool box because Nvidia primarily uses thin 0.5mm screws, which fit into the mating threads of special piggyback screws that sit below the backplate. These uncommon M2.5 hex bolts also attach the card’s cover to its circuit board.

One improvement to the GeForce GTX 1080 Ti became apparent as we started taking the card apart: Nvidia mated the PWM controller on the back of its PCA with part of the backplate using thick thermal fleece. This is a material we don’t see used very often, and it’s meant to augment heat dissipation. The move would have been even more effective if Nvidia cut a hole into the plastic sheet covering the backplate in this area.

The exposed board reveals two areas on the left labeled THERMAL PAD 1 and THERMAL PAD 2. However, these do not actually host any thermal pads. We don’t know if Nvidia’s engineers deemed them unnecessary or if its accountants decided they were too expensive. Our measurements will tell.

The cooler’s massive bottom plate sports thermal pads for the voltage converters and memory modules, as well as several of the thermal fleece strips we mentioned previously. Those strips connect other on-board components to the cooler’s bottom plate.

Similar to its other high-end Founders Edition cards, Nvidia uses a vapor chamber for cooling the GPU. It’s attached to the board with four spring bolts.

Board Design & Components

Physically, the first thing you might notice about the PCA is its full complement of voltage regulators. Nvidia’s Titan X (Pascal) had the same layout, but not all of its emplacements were populated. The Quadro P6000, on the other hand, uses this board design. That card’s eight-pin power connector points toward the back, and you can see the holes for it on the 1080 Ti’s PCB.

The opposite holds true for memory: compared to Titan X (Pascal), one of GeForce GTX 1080 Ti’s modules is missing.

A total of 11 Micron MT58K256M321JA-110 GDDR5X modules are arranged around the GP102 processor. They operate at 11 Gb/s data rates, which helps compensate for the missing 32-bit memory controller compared to Titan X. We asked Micron why Nvidia didn’t use the 12 Gb/s MT58K256M321JA-120 modules advertised in its datasheet, and the company said they aren’t widely available yet, despite appearing in its catalog.

Nvidia sticks with the uP9511 we’ve seen several times now, which makes sense because this PWM controller allows for the concurrent operation of seven phases, as opposed to just 6(+2). The same hardware is used for all seven of the GPU’s power phases, and they’re found on the back of the board.

The voltage converters’ design is interesting in that it’s quite simple: one buck converter, the LM53603, is responsible for the high side, and two (instead of one) Fairchild FD424 N-Channel MOSFETs operate on the low side. This setup spreads waste heat over twice the surface area, minimizing hot-spots.

For coils, Nvidia went with encapsulated ferrite chokes (roughly the same quality as Foxconn’s Magic coils). They can be installed by machines and aren’t push-through. Thermally, the back of the board is a good place for them, though we find it interesting that Nvidia doesn’t do more to help cool these components. Stranger still, the capacitors right next to them receive cooling consideration.

The memory gets two power phases run in parallel by a single uP1685. The high side uses the FD424 mentioned above, whereas the low side sports two dual N-Channel Logic Level PowerTrench E6930 MOSFETs in a parallel configuration. Because the two phases are simpler, their coils are correspondingly smaller.

So, what’s the verdict on Nvidia’s improved thermal solution? Based on what we found under the hood, it’d be safer to call this a cooling reconceptualization. Switching out active components and using additional thermal pads to more efficiently move waste heat are the most readily apparent updates. The cooler itself should perform identically to cards we’ve seen in the past.

MORE: Nvidia GeForce GTX 1080 Roundup

MORE: Nvidia GeForce GTX 1070 Roundup

How We Tested Nvidia’s GeForce GTX 1080 Ti

Nvidia’s latest and greatest will no doubt be found in high-end platforms, some of which may include Broadwell-E-based systems. However, we’re sticking with our MSI Z170 Gaming M7 motherboard, recently upgraded to host a Core i7-7700K CPU. The new processor is complemented by G.Skill’s F4-3000C15Q-16GRR memory kit. Intel’s Skylake architecture remains the company’s most effective per clock cycle, and the chip’s stock 4.2 GHz frequency is higher than that of the models with more cores. Crucial’s MX200 SSD remains, as does the Noctua NH-U12S cooler and be quiet! Dark Power Pro 10 850W power supply.

As far as competition goes, the GeForce GTX 1080 Ti is rivaled only by the $1200 Titan X (Pascal). The only other comparisons that make sense are Nvidia’s GeForce GTX 1080, the lower-end 1070, and AMD’s flagship Radeon R9 Fury X. We add a GeForce GTX 980 Ti to the mix to show what the 1080 Ti can do versus its predecessor.

Our benchmark selection now includes Ashes of the Singularity, Battlefield 1, Civilization VI, Doom, Grand Theft Auto V, Hitman, Metro: Last Light, Rise of the Tomb Raider, Tom Clancy’s The Division, Tom Clancy’s Ghost Recon Wildlands, and The Witcher 3. That substantial list drops Battlefield 4 and Project CARS, but adds several others.

The testing methodology we’re using comes from PresentMon: Performance In DirectX, OpenGL, And Vulkan. In short, all of these games are evaluated using a combination of OCAT and our own in-house GUI for PresentMon, with logging via AIDA64. If you want to know more about our charts (particularly the unevenness index), we recommend reading that story.
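
To illustrate the kind of numbers this frame-time logging produces, here is a minimal sketch computing an average frame rate and a crude frame-to-frame variation metric from made-up present intervals. The simple mean-absolute-delta metric below is a stand-in illustration, not the article’s actual unevenness index:

```python
# Made-up per-frame present intervals in milliseconds, the kind of data a
# PresentMon-style capture records; the 33.2ms entry represents a stutter.
frame_times_ms = [16.6, 16.8, 17.1, 16.5, 33.2, 16.7, 16.9, 16.4]

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000.0 / avg_ms

# A crude smoothness proxy (hypothetical, not the article's metric):
# mean absolute change between consecutive frame times.
deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
unevenness_ms = sum(deltas) / len(deltas)

print(round(avg_fps, 1), round(unevenness_ms, 2))
```

A single 33ms hitch barely moves the average frame rate, but it dominates the frame-to-frame delta metric, which is why reviewers track smoothness separately from raw fps.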

All of the numbers you see in today’s piece are fresh, using updated drivers. For Nvidia, we’re using build 378.78. AMD’s card utilizes Crimson ReLive Edition 17.2.1, which was the latest at test time.

Go to Source

Stranger Things 2 latest rumours – release date and trailer

Netflix exclusive Stranger Things was a big hit in 2016, and is set to make a comeback in 2017. Read the latest rumours on the Stranger Things 2 trailer and UK launch date.

Love Stranger Things? Then you’ll be mega-excited about Stranger Things 2 – coming in 2017


Stranger Things, a Netflix exclusive, was one of the hit shows of 2016. So when is Stranger Things 2 coming out? And how can you watch Stranger Things today? We reveal all, including the new Stranger Things Season 2 release date, trailers and episode list. Also see: The 82 best films to watch on Netflix

Stranger Things stars Winona Ryder, David Harbour and Matthew Modine. Netflix describes it thus: “When a young boy vanishes, a small town uncovers a mystery involving secret experiments, terrifying supernatural forces and one strange little girl.”

Read our list of the 10 best sci-fi films

When is the Stranger Things 2 release date? 

Stranger Things 2 release date: Halloween 2017 

Thanks to the above Stranger Things 2 trailer shown as an ad during Super Bowl 2017, we now know that the Stranger Things 2 release date is Halloween 2017. Whether this means season 2 will arrive on 31 October or just near the date is unclear at the moment.

We also know that there will be nine episodes in Stranger Things 2 and you can see the full episode list below.

  • Episode 1 – “Madmax”
  • Episode 2 – “The Boy Who Came Back to Life”
  • Episode 3 – “The Pumpkin Patch”
  • Episode 4 – “The Palace”
  • Episode 5 – “The Storm”
  • Episode 6 – “The Pollywog”
  • Episode 7 – “The Secret Cabin”
  • Episode 8 – “The Brain”
  • Episode 9 – “The Lost Brother”

How to watch Stranger Things 

Stranger Things is a Netflix exclusive, which means you’ll either need to subscribe to Netflix or find a friend who already has. Alternatively, if you’re prepared to watch the first series quickly enough, you can simply sign up for a month’s free trial at Netflix.com.

(Do bear in mind, of course, that if you like Stranger Things you won’t be able to get a second free trial when it returns to Netflix next year.) 

If you do decide to subscribe, one of the great things about Netflix is you can cancel at any time. Netflix charges a monthly subscription, the cheapest of which is £5.99. You can pay extra to enable Netflix streaming on more than one device at a time (you can watch on your laptop, PC, TV, tablet or phone) and to unlock HD and Ultra-HD content. Also see: How to watch US Netflix and How to download Netflix video

The entire first season of Stranger Things (eight episodes) is available to view on Netflix, so simply log in and use the Search function or look under Netflix Originals for Stranger Things, then tap to play. 

Read next: The wonderful Stranger Things poster – and the 80s cult films that inspired it 

Stranger Things 2 trailers

Prior to the Super Bowl 2017 ad (top of page), a teaser for the second season of Stranger Things was released. It mostly featured letters making up the title, but also some hints at what will happen.

Also check out this video of the kids from the cast reacting to the Super Bowl advert for Stranger Things 2.

Follow Marie Brewis on Twitter.

Go to Source

Modular Epic Gear Morpha X Mouse Gets Last-Minute Tweaks, On Sale Mid-March

The modular Morpha X gaming mouse from Epic Gear was supposed to hit stores in February. The company bumped availability back to March 14, but in between when we saw the mouse at CES 2017 and now, Epic Gear added a couple of final details.

Primary among those concerns the RGB lighting. As of January, it was unclear how the lighting would be implemented and whether or not it would have a software component to give you customization options. Now we know that Epic Gear will indeed include configuration software for the lighting, as well as for features such as “angle-snapping, lift-off distance, button assignment, DPI, profiles, and USB report rate, just to name a few,” according to the company.

One unique lighting feature is Away-From-Mouse (AFM) ambient lighting. Epic Gear said that when the mouse is motionless for 20 seconds, the lights under the scroll wheel will start glowing; they’ll shut off (or at least return to whatever state you’ve programmed) when you grab the mouse.

Otherwise, the specifications and feature set appear to be unchanged from what we’ve previously seen.

The Morpha X is unique among mice in that you can swap out the sensor for a different one. Epic Gear ships the mouse with an optical sensor cartridge and a laser sensor cartridge, and you can pop in one or the other depending on your preference. It also has modular left and right switches–the EG Orange (medium weight) or EG Purple (heavier)–and, again, you can insert the module of your choice. Finally, there’s an adjustable 20g weight system (4 x 5g weights).

The package will contain the Morpha X mouse in gray, with a white replacement shell; the two sensor cartridges; the Orange and Purple switch cartridges; a switch puller; the four weights; and documentation. And it comes in the “Iron Box,” a metal box designed to help you keep track of all those parts.

You can pick up a Morpha X from Amazon starting March 14 for $130.

Epic Gear Morpha X Gaming Mouse

Sensor/DPI: 12,000 DPI IR LED (up to 250ips tracking speed, 50G acceleration); 8,200 DPI laser (up to 150ips tracking speed, 30G acceleration)
Ambidextrous: Yes
Switches: Omron; EG Orange (medium) or EG Purple (pro [heavier])
Onboard Storage: Unknown, but likely
Polling Rate: 125-1,000Hz
Lighting: RGB, configurable via software; AFM ambient lighting mode
Buttons: 7 total, 6 programmable (L/R click, 2x DPI, 2x left-side nav, click wheel)
Software: Yes
Cable: 1.8m x-braided with gold connector
Dimensions: 126.5 x 66.5 x 40mm
Weight: 110g without cable or weights; four removable 5g weights (130g total)
Misc.: “Ultra swift” large PTFE feet; 5 gaming profiles with dedicated LED colors; 15 macro sets (configurable in software); lock-down function; adjustable lift-off distance (with auto calibration) and angle snapping
System Requirements: USB port, 50MB free storage space, Windows 7/8/10
Warranty: 2 years
Price: $130 (on sale March 14, 2017)

Go to Source

IBM stores data on a single atom

What good is a single atom these days?

Well, aside from being essential for, I dunno, most everything, you can now store data on one. That is right, store data on a single atom. But how did researchers achieve that?

In IT Blogwatch, we jump on the miniaturization bandwagon.

What is going on? Mike Wehner has some background:

IBM…announced…that it…successfully managed to store data on a single atom,…an achievement that could potentially change the way storage devices are developed in the future…modern hard drives utilize roughly 100,000 atoms to store a single bit, so shrinking things down to the size of just one atom is obviously a massive achievement.

Remind us what a bit of data exactly is again? Michael Irving has that info:

For those who don’t pay…attention to the wizardry going on inside their computer, hard disk drives store data magnetically, as a series of tiny magnetic dots on a sheet of metal. Each dot represents one bit of data: a demagnetized dot represents a zero…if it’s magnetized, it’s a one.

And they managed to get that on a single atom? How did they even do that? Mike Murphy is in the know:

IBM’s researchers found a way to magnetize individual atoms of the rare earth element holmium and use the two poles of magnetism…as stand-ins for the 1s and 0s. The holmium atoms are attached to a surface of…magnesium oxide, which holds them in place, at a chilly 5 kelvin (-450°F). Using essentially what is a very…small needle, the researchers can pass an electrical current through the holmium atoms, which causes their north and south poles to flip, replicating the process of writing information to a traditional magnetic hard drive. The atoms stay in whatever state they’ve been flipped into, and by measuring the magnetism of the atoms at a later point, the scientists can see what state the atom is, mirroring the way a computer reads information it’s stored on a hard drive…IBM says the researchers used a single iron atom to measure the magnetic field of the holmium atoms.

What does this mean for the future? Tas Bindi tells us:

IBM…demonstrated that two magnetic atoms could be written and read independently even when they were separated by just 1 nanometre, which could culminate in a magnetic storage system…1,000 times denser than today’s hard disk drives and solid state memory chips. Additionally…such a system could…store significantly more data which could pave the way for smaller datacentres, computers, and mobile devices.

So is this something we are going to start seeing around? Stephen Lawson can answer that:

Don’t expect to see a phone the size of your little finger anytime soon. This project is pure research…For one thing, their experiment required conditions that aren’t practical for most devices. It needed an ultra-high vacuum, low vibration, and liquid helium for a super-low temperature.

The team just wanted to achieve the maximum possible density…Now researchers can use what IBM learned to develop new high-density storage that works outside a lab, probably using a small number of atoms that can help each other remain stable at room temperature.

So what are people saying about all this? Darryn Ten sums it up nicely:

Oh my that’s impressive.

To express your thoughts on Computerworld content, visit Computerworld’s Facebook page, LinkedIn page and Twitter stream.

Go to Source

iOS 11 latest rumours – release date and features

iOS 11 latest rumours – release date and features

Apple has announced the dates for WWDC 2017, which is when we’ll get the Developer Preview of iOS 11. Read the latest rumours on the iOS 11 features and UK launch date.

iOS 11 is expected to launch with the iPhone 8, but you’ve only to wait until June to get a preview


For almost as long as we can remember (and we do remember when you had to pay for iOS updates!) Apple has released the new version of iOS in September each year along with the new iPhone(s). See also: Best new phones coming in 2017

In 2017 we can’t see the company deviating from its established routine. And 2017 is set to be a belter for Apple – and its fans – since it’s the 10th anniversary of the original iPhone.

The rumours and leaks for the iPhone 8 continue unabated, but while this year’s hardware could well include some impressive new tech, what’s the story for iOS 11?

When is the iOS 11 release date?

Although the final public release of iOS 11 will be on the day new iPhones go on sale, we expect Apple will show off the new OS on 5 June at the opening keynote to its WWDC event for developers. While the event is meant to get app developers excited about the new features, it also serves to whip up a public appetite for both the software and – naturally – the new hardware.

In the past couple of years, Apple has allowed anyone who wants to sign up for the beta programme and install the early test version of iOS on their iPhone or iPad. This usually starts in July, and you can update to subsequent beta versions until the ‘final’ version becomes available in September.

It’s never the final version of course, because iOS is updated regularly throughout the year.

What are the rumoured new features?

Nothing is certain about iOS 11 as it’s too early: Apple won’t release any details until at least June. But there are plenty of rumours about what might be on the horizon, and we have a few features on our own wish list. Here are some of them:

Dark Mode

There has been a long-running rumour that Apple will introduce a ‘dark mode’ which will provide a dark theme or background in apps and throughout iOS. This should help reduce brightness when using your phone in total darkness. Currently, the light colours iOS uses are still too bright in some environments even with the brightness slider set to minimum.

However, this could be implemented before iOS 11 as there are more rumours of a ‘theatre mode’ in iOS 10.3 which is due to be released in March 2017 along with new iPads.

VR support

Another rumour, which makes sense, is that Apple will add VR support to iOS 11. Android has it in the form of Daydream, and we’re sure Apple won’t want to ignore VR entirely. It may be that only the latest iPhone(s) will support VR because of the processing power and low-latency response times required, though.

Better Siri

Virtual assistants have been leapfrogging each other since they appeared, but Siri is due for an upgrade. It’s fairly capable, but not all that intelligent and is now lagging behind Google’s Assistant.

Apple is reportedly working on making Siri sound more natural, and now that it’s open to third parties, we expect many more apps to start using it (even before iOS 11 comes out).

Customisable Control Centre

This isn’t a rumour, but it’s been on our wish list since iOS 8. The Control Centre is great, but if you’ve ever used an Android phone, you’ll appreciate how nice it is to be able to customise the shortcuts so you get quicker access to the features and tools you use most.

In iOS – including the latest version of iOS 10 – you can’t customise the Control Centre at all.

Easier access to video settings

We’ve moaned about this in our iOS reviews for a couple of years now, but it’s so frustrating that the video recording settings aren’t in the Camera app. If you want to switch between 1080p at 60fps and 4K at 30fps (for example) you have to spend about 20 seconds doing it through the Settings app.

All we want is a simple cycle through the few modes in the Camera app itself. Is that so much to ask?

For more on what we want to see, check out our full iOS 11 wish list.

Which iPhones and iPads will get iOS 11?

That’s another question we can’t yet answer definitively. Recently, Apple has kept more older devices up to date than we were expecting, but the oldest tend to miss out on most of the best new features.

The iPhone 5 runs iOS 10, for example, but it misses out on the brilliant Memories feature in Photos (among other absentees).

iOS 10 new features and release date

It wouldn’t surprise us if only the iPhone 5 and 5C drop off the upgrade cycle this year, along with the iPad 4 and iPad mini 2.

That would leave the following models which – in theory – should get the update to iOS 11:

  • iPhone 7
  • iPhone 7 Plus
  • iPhone 6s
  • iPhone 6s Plus
  • iPhone 6
  • iPhone 6 Plus
  • iPhone SE
  • iPhone 5s
  • iPad Pro 12.9-inch
  • iPad Pro 9.7-inch
  • iPad Air 2
  • iPad Air
  • iPad mini 4
  • iPod Touch 6th gen

Go to Source

Sinclair ZX Vega+ funding campaign halted by Indiegogo

Crowdfunding platform Indiegogo intervened to stop a handheld retro computer console campaign from acquiring further funding, the BBC has learned.

The Spectrum ZX Vega+, backed by Sir Clive Sinclair, had achieved its original crowdfunding target.

But then Indiegogo halted further fundraising because of delivery delays and a lack of communication with backers.

The project’s organisers had asked the BBC not to reveal the development.

The BBC understands no consoles have been delivered to backers, despite a pledge last month that they would “ship after 20 Feb 2017”.

And the company behind the project – Retro Computers Limited – suggested these details might put its team at risk.

“Following a credible threat of violence against personnel of Retro Computers Limited, including threats made as recently as last night, we asked [technology desk editor] Leo Kelion and the BBC to refrain from publishing a story we believe to be factually inaccurate and might put people at risk of physical harm, alarm and distress,” Retro Computers Limited founder David Levy said in a statement on Wednesday.

“Since December 2016 the BBC have formally been on notice that this is a police matter, and we ask that the BBC and Mr Kelion do not compromise the police investigation.”

The BBC delayed publication of this report to give RCL managing director Suzanne Martin time to provide evidence of the threats, but she did not do so.

In the meantime, the Gizmodo news site also published and then deleted an article about the matter because it too was told of threats.

Refund requests

RCL had already received more than £513,000 ($624,000) from Indiegogo crowdfunders for the Vega+.

And before the fundraising campaign was halted, the project had been listed as “in demand” to allow new people to become backers, despite having already reached its funding target.

But in recent weeks, many backers have expressed anger that they still have not received their console and claimed their requests for more information were going unanswered by the company.

However, Indiegogo is clear in its terms and conditions that those who back a project are supporting an idea rather than buying a product – and that hardware in particular tends to be more difficult to deliver.

In 2015, RCL brought a different Sinclair computer to fruition after a smaller campaign.

Lawyers’ letter

RCL originally said the new Spectrum ZX Vega+ was due to go into production in the summer of 2016 and it might even “be able to improve on this delivery date”.

But in December 2016, after the BBC contacted RCL to ask about the status of the Vega+, the broadcaster was threatened with legal action.

“Our clients are concerned that the BBC is in fact supporting and participating in a malicious campaign intended to denigrate our clients’ reputation,” wrote lawyers Michelmores LLP in a letter to the broadcaster.

They went on to request that the BBC show them its report at least 48 hours ahead of publication so they could identify any false information, which the BBC refused to do.

Ms Martin then apologised to backers for the delays and said there had been unexpected issues with the console buttons.

“In November, we identified an improvement we believed was essential to the Vega+ gaming experience,” she said at the time.

“An improvement that would make the feel of the product far better, including a correction in the design of one of the buttons, making it more robust and able to withstand the rigours of extended game-play.

“We also wanted to make sure we did justice to the Sinclair legacy.

“This change has caused a brief delay, and we are truly sorry about that, but we needed this time to improve the product, and we have now completed the necessary revisions, and we are delighted to announce that we will ship the first units in February 2017.”

Since then, RCL has suggested it had been unable to respond to some backers’ requests because of a business dispute with two former directors.

And in its last public update, 11 days ago, the company released some technical details about software used by the device.

‘Last chance’

Many recent comments left by backers on RCL’s Indiegogo page, which remains live but has stopped taking funds, are requests for refunds.

“I don’t expect a response. I’m just being polite in letting them know this is their last chance before they have to deal with small claims court,” wrote a backer called Paul Brookfield.

“Please receive this email as written notice of cancellation of my pledge and a request for a refund,” wrote Drew Miller.

“I no longer believe you are capable of providing the product I pledged for in April, considering the drastic number of delays and your lack of communication toward fellow backers.”

Go to Source