What is AMD’s Vega battle plan? To fight on multiple fronts

AMD’s history in high-end gaming is shaky. Tellingly, the company’s only place among our picks of the best graphics cards is an entry-level spot with the Radeon RX 460.

With that in mind, it’s hard to believe that the upcoming AMD Vega, or Radeon RX Vega, cards will hold much of a competitive edge over, say, Nvidia’s GeForce GTX 1080 Ti. A whopping 11GB of VRAM is a tough nut to crack for a company best known for budget-friendly offerings.

AMD tried its damnedest at its Capsaicin & Cream 2017 event at the Game Developers Conference (GDC) this past Tuesday to keep details of its Vega graphics architecture as vague as possible. While the company revealed an unforeseen partnership with Fallout publisher Bethesda Softworks, it kept mum on concrete Vega specs and launch details.

This was intentional, of course, as the company had to know its long-time rival would make a major reveal the same night. Nevertheless, we managed to squeeze as much info on Vega and this odd partnership as we could from AMD technical marketing manager Scott Watson.

Below, Watson explains Vega’s upcoming high-bandwidth cache controller and AMD’s foray into cloud gaming, and addresses concerns about the company’s vague announcement of its new deal with Bethesda.

Vega’s official branding

TechRadar: What was the biggest announcement you made today?

Scott Watson: The biggest story is – I’m going to spill the beans here – the last thing that we announced, which was a partnership with Bethesda Softworks. They’re a game publisher, they publish under their various studios. Doom, Quake, Wolfenstein, Skyrim, Elder Scrolls, Fallout, Dishonored, Prey – I mean they have a huge range of titles. 

We’ve worked with developers one-off, on individual games, or we’ll provide engineering assistance and help optimize a game and maybe do some co-marketing around it. But what we’re doing with Bethesda is a multi-title, deep collaboration in engineering across all of their titles for a good period of time. So it’s a different approach. The idea is that we’ll help them optimize for Ryzen, and Radeon, both.

Do you think that future games in this partnership are going to have significant advantages with Ryzen and Vega?

So, we’ll obviously be looking to help optimize the traditional things to do in graphics. I think there will be some Vulkan optimizations, low-level APIs, but then also helping with core scaling across Ryzen, because now we have eight cores and 16 threads at very affordable price points. We want games to be able to take advantage of that.

Why Bethesda?

A number of reasons. They have great PC games, and they have good tech, and we already have a nice template that we can work from. If you look at what happened with Doom and Vulkan – we worked with them on that, and they were able to achieve really nice performance with a new API. They have some really smart folks working there, so that’s a good company to work with going forward.

Bethesda and AMD shaking on their partnership on stage

Are you concerned that, by working with specific game developers, you’re dividing AMD and Nvidia or Intel fans further when certain games run better on AMD hardware than on competing hardware?

We’re the only company that provides both those CPUs and GPUs, and we know our users. And many of them have one of our CPUs and the competitor’s graphics card, or vice versa. So we want to benefit PC gamers generally. So it’s in our interest to do good optimization that will work for whoever has our hardware, even if they’re mixing it with the competition.

Regarding the Vega-powered, LiquidSky game streaming service, how is that going to differ from Nvidia’s GeForce Now and other streaming services?

My understanding is that it’s a lot more economical price-wise. They have a new model, because you install the games you own on their service. The piece that we announced, though, is that they’ll be using our Vega GPUs for their servers.

One thing Vega does is it has hardware support for virtualization, not software. It’s actually hardware partitioning of GPU resources, so they can guarantee customers a slice of a GPU that’s consistent, with consistent performance. So, [the game itself] is partitioned on hardware. You’ll have a better user experience because you’re getting a piece of the GPU consistently.

There’s another piece of it that we announced today too, which is that the Vega GPUs can virtualize their video encoding hardware as well.

What does that mean?

We have a video encoding block, the VCE, in all Radeons. Virtualization allows us to slice up the GPU across multiple user sessions, keeping each session secure and splitting the resources between them.

We can now do that with the video encoder in Vega, which means that when they’re serving a gaming session and then they have to send it downstream to a user, they can encode it on the hardware encoder on Vega for multiple gaming sessions at one time.

AMD’s use of forward rendering in VR illustrated

About AMD’s take on forward rendering for VR, what advantages does this pose over traditional deferred rendering?

Deferred rendering is actually newer than forward, but it’s widely used now because it works well, especially on the last generation of consoles. It has some fixed performance costs up front, but then deferred can offer lots of lights and reflections and other features, and they’re nice to have.

However, in VR you have to be at 90 frames per second, and so there’s no reason to use deferred rendering because you can’t take advantage of the extra features while hitting the performance standard that you have to meet. So it’s just not always a good fit for VR. The other thing is that deferred doesn’t work well with multi-sampling, so you’re typically stuck with post-process anti-aliasing, like FXAA.

That isn’t good when you have two eyes with two different views, you know what I mean? We want smart, subpixel AA, and that’s multi-sampling. So if you switch back to forward rendering, it starts up quickly and it’s a good fit for VR’s use case. It’s a performance uplift, and there’s an image quality improvement with multi-sampling.

So what we’ve done is we’ve helped enable a version of Unreal Engine 4 – version 4.15 – that has a forward rendering path. And we showed on stage multiple VR games, including Epic’s Robo Recall, using the forward path to perform better and provide better anti-aliasing.
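For developers who want to try the same path, forward shading in Unreal is a per-project setting. A minimal, hypothetical DefaultEngine.ini sketch (console variable names as of UE 4.15; check your engine version’s documentation before relying on them):

```ini
[/Script/Engine.RendererSettings]
; Enable the forward shading path added for VR work in UE 4.15
r.ForwardShading=True
; MSAA is compatible with forward rendering; 4x is a common VR choice
r.MSAACount=4
; Render both eyes in a single pass to cut draw-call overhead
vr.InstancedStereo=True
```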

Getting to the elephant in the room, what is Vega’s high-bandwidth cache controller and, more importantly, how does it make our games better?

The idea with the high-bandwidth cache and the high-bandwidth cache controller is that the traditional sort of VRAM you had in the past can now really act as a cache for a larger pool of memory drawn from across the whole system.

Ultimately, hopefully, game developers will build bigger, more complex worlds and gain the ability to not worry so much about overrunning the memory budget, while still delivering beautiful images that flow smoothly.

So, what we did was demonstrate the feature naturally. We took a current game, Deus Ex: Mankind Divided, built for 4GB, and we artificially constrained two Vega cards to just two gigs of RAM as a test case.

And then we showed it running without the high-bandwidth cache controller – it was slow and it stuttered – and then we turned on the high-bandwidth cache controller to increase the effective memory size by mapping some of it into system RAM and being smart about what we kept in the 2GB of local memory.

It gets 50% higher average frames per second and 100% higher minimum fps than the non-high-bandwidth cache controller case. You can imagine future games: developers can build very complex worlds with very high memory requirements on this new architecture.
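To make those percentages concrete, here is a quick sketch. The baseline numbers are hypothetical – AMD quoted only the relative uplifts:

```python
# Hypothetical baseline fps from the constrained 2GB run (AMD quoted only percentages)
baseline_avg_fps = 30.0
baseline_min_fps = 10.0

# AMD's claims: +50% average fps, +100% (i.e. doubled) minimum fps with HBCC on
hbcc_avg_fps = baseline_avg_fps * 1.50
hbcc_min_fps = baseline_min_fps * 2.00

print(f"avg: {baseline_avg_fps} -> {hbcc_avg_fps}")  # avg: 30.0 -> 45.0
print(f"min: {baseline_min_fps} -> {hbcc_min_fps}")  # min: 10.0 -> 20.0
```

Note that the doubled minimum is the more meaningful figure: minimum fps is what you actually feel as stutter.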

AMD demonstrating the benefits of AMD CrossFire’s alternate frame rendering in Vega

So, effectively, people will be able to use lower spec hardware for higher spec games?

Potentially. What we really want is for developers to not have to worry about it, and just go build what they want – make it beautiful, and then we can be smart about the amount of memory that goes on the card.


VR Shooter 'Robo Recall' Is Now Free On Oculus Rift

Have you ever wanted to shoot a bunch of angry robots with two pistols without having to worry about the consequences? Then you should give Robo Recall, a VR first-person shooter from Epic Games that allows you to do just that, a try now that it’s available as a free download for the Oculus Rift.

Robo Recall has players teleport around, aim with the Touch motion controllers, and fire away at the objects of their not-so-virtual fury. We got to play a demo of Robo Recall at the Oculus Connect 3 developer conference in October 2016. The game was similar to Bullet Train, a tech demo that stole the show at Oculus Connect 2 in 2015, but this time it took the form of a full-fledged game instead of just an itty-bitty experience used to show off the idea’s potential.

Robo Recall – Evolution from Bullet Train

Epic said at the time that Robo Recall would be a free download in Q1 2017 and, well, here we are. But that isn’t all–the company also revealed the Robo Recall Mod Kit to let anyone natively mod the game with new weapons, maps, and characters at no charge. The company said in a press release:

With the Robo Recall Mod Kit, players are invited to bring all-new experiences to Recallers everywhere. Modders can build new RoboReady-approved products, including top notch weapons for dispatching rogue robots, or change up the fight by creating their own opponents in need of recalling and creative decimation. The Mod Kit also allows players to create and share new levels for everyone to explore and build new versions of the Robo Recall maps with custom gameplay.

Robo Recall was made with Unreal Engine 4. We get the feeling that Epic is using the game as a showcase for the engine’s capabilities–a trailer released at GDC 2017 was careful to point out UE’s potential as a VR development tool, and it just so happens that the footage used to demonstrate UE’s support for high-end VR at 90fps came from Robo Recall. The game will prove a worthwhile investment if it convinces other devs to make VR titles with UE.

You can download Robo Recall from the Oculus Store and learn more about the Robo Recall Mod Kit at the game’s website.

Name: Robo Recall
Type: VR, FPS
Publisher: Oculus
Developer: Epic Games
Platforms: Oculus Rift
Where To Buy: Oculus Store
Release Date: March 1


High-End VR Just Got A Whole Lot Cheaper: Oculus Slashed The Price Of The Rift

It’s been a little less than a year since Oculus started shipping the Rift CV1. After years of anticipation, hype, and Oculus executives telling us the Rift would be an affordable (sub-$400) device, Oculus dropped a bombshell on its fans and released its VR headset – without motion controllers – for $600.

The initial sticker shock undoubtedly stopped some people from buying Rift HMDs, but Oculus didn’t have any trouble selling out its production capacity – though it certainly didn’t help that production problems shortly after the Rift’s launch caused shipment delays that lasted for several months.

On top of dealing with a problematic hardware launch, Oculus was busy trying to bring the Oculus Touch controllers to market. The controllers were supposed to ship in Q2 of 2016, but in December 2015, Oculus revealed that it would be delaying the release until later in the year. Oculus spent most of 2016 working on the Touch controllers and curating a lineup of content to complement them. The company finally launched Touch in early December, but again, the price was a bit higher than people had hoped.

The Touch controller package launched at $199, which gives you a pair of tracked controllers and an extra Constellation camera to track them. Considering the price of a single Xbox One controller or PlayStation DualShock 4, the price of the Touch controllers shouldn’t be a surprise, but it was enough to deter some people from upgrading.

Now that the Rift has an install base of at least a couple hundred thousand units (no official numbers have been released, but analyst estimates peg the Rift at north of 200,000 units sold) and the Touch controllers have been available for three months, Oculus is moving to make buying into VR more affordable.

You can now buy a Rift headset with Touch controllers for the same price that just a Rift would have set you back yesterday. Oculus is now selling the Rift + Touch bundle for $600. The company also slashed the price of the standalone Touch controllers in half to $99 and dropped the price of the extra Constellation cameras from $79 to $59.  
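The arithmetic on that, using the prices quoted above:

```python
# Prices before the cut (as quoted in this article)
old_rift = 600      # Rift headset alone
old_touch = 199     # Touch controller package
old_camera = 79     # extra Constellation camera

# Prices after the cut
new_bundle = 600    # Rift + Touch together
new_touch = 99      # standalone Touch, cut in half
new_camera = 59     # extra Constellation camera

# Buying a Rift and Touch yesterday vs. today
saving = (old_rift + old_touch) - new_bundle
print(saving)  # 199
```

In other words, the full Rift + Touch setup is now roughly $200 cheaper than it was the day before.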

Curiously, Oculus appears to have dropped the standalone Rift as an option. Instead of dropping the price of entry by $100, the company is offering a better package for the same price. Last year, Oculus stood by the merits of seated VR experiences played with a gamepad, and now it almost feels like the company is moving towards a future of motion control games.

“We know this from responses to hundreds of thousands of surveys taken at our retail demo locations, as well as from empirical evidence before us: Console VR is less expensive and currently outselling PC VR, and even less expensive Mobile VR headsets, like our Gear VR device, are outselling Console VR,” said Jason Rubin, Oculus VP of Content. “Bringing the higher quality of PC VR toward these lower price points is an obvious win for both consumers and PC VR. This price drop was as inevitable as it is beneficial. This is how the technology business works.”

We agree with Rubin, at least to some extent. Tom’s Hardware did a survey of our readers last year to determine what, if anything, was holding people back from investing in VR. Overwhelmingly, the results indicated that price was a primary factor keeping people from joining the VR revolution. And he’s not wrong about console VR. The PSVR hit the market in October 2016, and its software lineup is sparse, but that didn’t stop Sony from selling almost a million units already. The price of entry undoubtedly played a considerable role in the PSVR’s early success. 

Gabe Newell disagrees, though. He believes that content is the key factor holding people back from buying a VR system. “If you took the existing VR systems and made them 80% cheaper, there’s still not a huge market,” Newell said in a recent interview. “There’s still not a compelling reason for people to spend 20 hours a day in VR.”

Fortunately, Oculus is looking at the software side of the equation too. The company is doubling down on quality content for 2017. Oculus plans to launch new in-house developed titles from Oculus Studios on almost a monthly basis.

The Oculus platform already has several excellent games, such as Superhot, The Climb, Chronos, and The Unspoken. And let’s not forget about Arizona Sunshine. But you would be hard pressed to argue that VR’s killer app is here already.

“I can’t say for sure that this year’s line-up is going to have VR’s World of Warcraft or GTA, but with every new release, and with every new discovery, VR gets closer to finding its killer app,” said Rubin.

Building a AAA game takes time, and most VR developers haven’t had enough time to make that kind of game. VR developers are still trying to figure out what works and what doesn’t, and VR locomotion is not yet a solved problem (not for lack of trying, mind). Thankfully, solutions to those problems are becoming clearer all the time.

“We have to remember that as of this GDC, our developer community has had dev kits in their hands for less than two years and has only been able to get feedback from consumers about what they’re doing for a year,” said Rubin. “With that frontier style development behind us, and with second-generation development and informed design taking place, the sweet spot for developers to create breakout hits opens. Some of these titles will become perpetually loved VR series that are with us for generations.”

Things are looking up in the VR industry. Price cuts and better content can only be good for everyone. Who’s ready to join me in the metaverse?


Fujitsu SP-1425

There was a time, many years ago, when a flatbed scanner was a must-buy peripheral alongside a printer. Then the all-in-one arrived, which combines scanning and printing functionality along with faxing and photocopying.

Demand – and interest – for that new category of devices effectively killed off standalone scanners, as shown by the below Google Trends graph from the past 13 years pitting flatbed scanners (in blue) against multi-function printers, or MFPs (in red).

However, vendors like Canon, HP, and Panasonic still believe in the future of this fundamental digitisation tool, even in the face of adversity.

Fujitsu unveiled the SP-1425 a few weeks ago. It’s a flatbed scanner that promises to “deliver simple operation and reliable performance for professional use,” especially for the SMB market. It is positioned as the top-of-the-range model and is the only flatbed scanner in the SP series.

At the time of writing and until the end of March 2017, the SP-1425 will come with a £100 cashback. PCWB sells it for £441 excluding VAT (for a total of £529 with VAT – that’s around $650, or AU$850).
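As a sanity check on those figures, the standard UK VAT rate is 20%, which squares the two quoted prices:

```python
ex_vat = 441       # PCWB price in GBP, excluding VAT
vat_rate = 0.20    # standard UK VAT rate

inc_vat = ex_vat * (1 + vat_rate)
print(round(inc_vat))  # 529 -- matches the quoted VAT-inclusive price
```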

At 454 x 331 x 129mm and weighing a mere 4.3kg, the scanner is smaller than expected with a compact footprint and a lower height compared to some rivals. Clearly, that makes it more appealing to small businesses where real-estate (i.e. desk space) can be a limited commodity.

Fujitsu designed the scanner in such a way to minimise any potential human error. There’s no display here and only two buttons (Scan/Stop and Power), a far cry from the likes of the N7100 scanner with its 7-inch touch panel.

Such scarcity of controls on the scanner itself means that you must rely on the host device (the computer) to operate the hardware. More on that later.

The chassis felt solid and sturdy despite the scanner’s lightweight nature. It’s rather bland looking, but this is a functional device for scanning, and not designed to be admired on your desk.

The SP-1425 has a quoted scanning speed of 25ppm (mono and colour) in simplex; in duplex it captures both sides of each sheet for an impressive claimed 50 images per minute (going by the ISO standard measures).

It uses an ultrasonic sensor to detect double feeds before they cause paper jams, and the maximum page size supported is 8.5 x 14 inches. The automatic document feeder/paper chute capacity is a mere 50 pages, though, with a rated duty cycle of 1,500 sheets a day and an optical resolution of 600dpi.

Like the overwhelming majority of scanners and MFPs on the market, this one uses the tried-and-trusted USB Type-B connector.

There’s no other connector besides this, meaning there’s no Ethernet port and no wireless connectivity, and you might well expect more on a half-a-grand standalone scanner.

There’s already an online update for the scanner, one which targets Fujitsu’s own PaperStream IP application. A DVD of bundled applications comes with the SP-1425, including PaperStream Capture Lite 1.0, Presto PageManager 9, ABBYY FineReader 12 Sprint and Scanner Central Admin 4.6.

We ran into some issues with Capture Lite as it wouldn’t detect the scanner, even though the popular image viewer IrfanView could see it (note that Capture Lite is a 32-bit application and must be paired with 32-bit drivers).

We ended up using Windows 10’s own Windows Fax and Scan desktop application, a rudimentary but capable tool when it comes to digitising content rapidly. We didn’t install PageManager or FineReader; just bear in mind that neither are the latest versions, with the former being a shocking seven years old (and the latter a more reasonable three years old).

The scanner managed to convert 35 A4 sheets (that’s 70 pages) in just over four minutes, or an average scanning speed of 17.5 pages per minute, well under the rated speed (although bear in mind that WFS doesn’t allow scans at less than 300dpi in JPG format).
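The throughput figure works out as follows (a duplex scan captures two pages per sheet; we take “just over four minutes” as four for the estimate):

```python
sheets = 35
pages = sheets * 2    # duplex: two pages per A4 sheet
minutes = 4.0         # nominal scan time from our test

pages_per_minute = pages / minutes
print(pages_per_minute)  # 17.5 -- well under the rated 50 images per minute
```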

Make sure the paper tray is properly adjusted, as the SP-1425 is very sensitive to misaligned paper sheets, which can cause paper jams. To its credit, the scanner handled various types of printed paper (folded, thicker, thinner, and sheets slightly bigger than A4) very well.

The quality of the scanning was more than acceptable in an office environment, even for smaller font sizes.

Early verdict

Fujitsu’s flatbed offering looks like a perfect match for the SMB market. It is reasonably quick and accurate, and the bundled software is on par with the rest of the competition. It’s a great choice if you’re short on space and you wish to digitise a mixture of different types of media.

Look at the rest of the competition, though, and it is abundantly clear that the SP-1425 is an expensive model even given its rated scanning speed. A comparable scanner, the HP ScanJet Pro 3500 F1, may well be bigger but it has a faster USB 3.0 port and costs about half the price.

If you can live with a device that has a bigger footprint, then consider the Brother MFC-8950DW which has a rated scanning speed of 40ppm. It is an MFP so can print, fax and copy as well, plus it has a longer warranty, wireless connectivity, Gigabit LAN and costs far less than the SP-1425.

Should you need to scan loose sheets, then sheet-fed scanners like the Panasonic KV-S1027C should be at the top of your list given that they are even smaller and usually come packed with features. The aforementioned model, for example, has a rated duplex speed of 90 images per minute and a 100-sheet automatic document feeder.


Real-Time Cinematic VR Rendering With Epic Games And The Mill

They call it the Blackbird. It looks like a dune buggy with a fancy camera rig on top, and in truth, that’s kind of what it is, but that’s not what it’s for. It’s every car you can imagine, or at least that’s the idea.

The Blackbird is the brainchild of The Mill, a visual effects and content creation studio. It’s a modular vehicle, meaning you can add various components to it depending on what you want to do with it.

At a GDC event co-hosted by The Mill, Epic Games, and Chevrolet, they played a trailer for a fake movie called The Human Race wherein a hotshot race car driver agrees to race against an AI driver. It’s man versus machine, a John Henry story for the new age. (Spoiler: The human wins–or does he?) In the trailer, two cars race, but both of them were fake. They were Blackbirds in disguise, wearing rendered car skins that looked as real as you can imagine.

[Applause], nice work Epic Games and The Mill, that’s very cool. But there was a twist: The race cars were rendered in real-time. [More applause], WOW, we didn’t see that coming.

Then came the second, and frankly more impressive, twist. A Chevy executive, Sam Russell, took the stage and picked up a Lenovo Phab 2 Pro smartphone (which has Tango on board). He fired up an app that lets you customize the new Chevy Camaro ZL1 – paint colors, trim colors, and so on. Because it was a Tango app, he could move the phone around and look at the car from different angles. They had the phone’s display mirrored on the giant middle screen in the presentation hall. On the two enormous screens flanking the middle one, they let the race car trailer play on a loop. When Russell changed the color of the car on his phone, the colors of the car in the video also changed, in real-time.

[Applause]

How It Works

We pinned down some of the guys from Epic Games and The Mill to understand how exactly they were able to accomplish this feat.

The Blackbird’s four-camera rig shoots 360-degree, 3D video at a 6K resolution. It measures the depth of the terrain using Lidar, and marries that data with the captured images. (This is more or less how the Mars Rover maps the surface of Mars.) It’s a multi-camera setup, but there’s a PC mounted in a box on the back of the vehicle that does all the stitching right there in real-time. This is The Mill’s proprietary “Mill Cyclops” virtual reality toolkit, which includes both hardware and software components. 


There’s a second “hero” camera that a filmmaker would use to shoot the Blackbird in action. That is, an actual driver zips around pell-mell in the car, and a filmmaker shoots it, as one would.  

So then, at this point you have the capture from the Blackbird’s camera array as well as the framed footage from the hero camera.

The next step is an Epic one, if you will. Blackbird is covered in markers, and using those for tracking, Unreal Engine can paint essentially anything–in this case, pretty much any car–onto the Blackbird. Because those capabilities are already in UE, it can work on any platform. This demo relies on UE’s Sequencer to seamlessly line up all the footage and renders. Part of the magic is that the software is able to use the data from the 360-degree/3D capture to figure out where all the reflections should go, and this is why it can paint any car into the scene.

That’s Unreal

The PC that Epic had running the demos at the event had an Nvidia Quadro GPU and relied on an Intel PCIe SSD that could handle 1.8GBps of data.

The Unreal Engine technologies involved with this project are:

  • Multiple streams of uncompressed EXR images (1.3GBps)
  • Dynamic Skylight IBL (image-based lighting) for lighting the cars
  • Multi-element compositing, with edge wrap, separate BG and FG motion blur
  • Fast Fourier Transform (FFT) blooms for that extra bling
  • PCSS shadows with directional blur settings
  • Bent Normal AO with reflection occlusion
  • Prototype for next generation Niagara particles and FX
  • Compatibility with Nvidia Quadro graphics card
  • Support for Google Tango-enabled devices (currently Lenovo Phab 2 Pro)

Epic said that these capabilities will come “later this year.”

There are some tricks involved. The “movie” itself is just a 24fps HD video, so by itself, it’s not exactly a resource hog. And because you don’t have to render the entire scene–just the car–that’s also not terribly resource intensive. Rendering in real-time, of course, does require lots and lots of horsepower; but you can get away with it because it’s just the car that gets rendered.
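Those numbers fit together: at the quoted 1.3GBps of uncompressed EXR data and 24 frames per second, each frame’s worth of plates is roughly 54MB, which the 1.8GBps SSD can stream with headroom to spare. A rough back-of-the-envelope sketch:

```python
stream_rate = 1.3e9   # uncompressed EXR streams, bytes per second (as quoted)
ssd_rate = 1.8e9      # Intel PCIe SSD throughput, bytes per second (as quoted)
fps = 24              # playback rate of the "movie"

bytes_per_frame = stream_rate / fps
print(round(bytes_per_frame / 1e6))       # ~54 MB of image data per frame
print(round(ssd_rate / stream_rate, 2))   # ~1.38x headroom on the SSD
```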

Epic Games and The Mill hope to use this mixed reality technique for filmmakers, who can shoot scenes that require VFX and see the finished product right in front of them, in real-time.


Xbox Live Creators Program Opens Up Xbox And Windows 10 Game Development

Microsoft announced at GDC 2017 that anyone can release games for the Xbox One and Windows 10 PCs via the new Xbox Live Creators Program. The company also highlighted several other features, such as the Beam streaming tool and the performance-boosting Game Mode, that should debut soon.

Xbox Live Creators Program

The new program opens up Xbox One and Windows 10 game development and distribution to basically anyone with a hankering to make something. The Xbox Live Creators Program will allow development on retail Xbox One consoles instead of requiring a dev kit, for example, and developers will be able to publish to a dedicated Creators section of the Xbox Games Store. (Games will also be made available for Windows 10 PCs via the Windows Store.)

Here’s what Microsoft said about the program in a blog post:

With the Creators Program, anyone can integrate Xbox Live sign-in, presence, and social features into their UWP games, then publish their game to Xbox One and Windows 10. This means their title can see exposure to every Xbox One owner across the Xbox One family of devices, including Project Scorpio this holiday, as well as hundreds of millions of Windows 10 PCs, and millions of folks using the Xbox app on mobile platforms.

One notable omission from the Xbox Live Creators Program is support for multiplayer gameplay and achievements; those features will be limited to members of the more exclusive ID@Xbox program. More information about the Xbox Live Creators Program can be found on Microsoft’s website. The company said that sign-ups will be limited at first so it can fine-tune the program, but it plans to open up the program to everyone later on.

Microsoft’s decision to open up Xbox and Windows 10 game development comes at an interesting time: Valve recently changed the rules for its Steam marketplace to be more restrictive. The Xbox Live Creators Program is likely meant to capitalize on those new limitations and encourage devs to create Universal Windows Platform (UWP) games. Time will tell if opening the floodgates will cause the same problems for Microsoft that it did for Valve.

Beam, Game Mode, And More

The rest of Microsoft’s announcement focused on small updates to features and services we already knew about. The Xbox Game Pass service – which was revealed earlier this week and offers unlimited access to Xbox One and “backwards compatible” Xbox 360 titles for $10 per month – got a shout-out. So did Beam, which was announced in October 2016 and will offer “fast, low-latency, interactive game broadcasting.” Here’s the big news about Beam:

Building on this, today we shared preliminary details of the Beam Interactivity 2.0 SDK with developers – coming in March, this SDK will make developing interactive streaming features much easier and enable a wide range of new design scenarios. More details about this SDK will be published on the Beam blog and detailed in the Beam-specific GDC talk tomorrow.

Microsoft also drew attention to Game Mode, a feature coming in the Windows 10 Creators Update this spring that promises to boost game performance, and to new additions to Xbox Play Anywhere, the platform that allows Xbox One and Windows 10 PC owners to play games on either platform at will. The company said that 10 games already support Xbox Play Anywhere and 16 more are going to embrace the platform in the near future.
