Nintendo Finally Responds To Switch's Joy-Con Issues

Nintendo’s Switch console made its way to consumers today. We have one in house, and already, we can confirm that the Joy-Con controllers have connectivity problems. Our left Joy-Con, specifically, frequently and consistently delays or drops the connection to the Switch console.

This isn’t an unknown problem. Several reviewers complained about this issue, and the general consensus was that people with large hands might obstruct the signal, but otherwise the Joy-Con was hunky-dory. It’s not. Nintendo finally acknowledged the problem today with an article on its support website, advising consumers with faulty Joy-Con controllers to limit potential sources of interference. That list of problem devices includes:

  • Cell phones, laptops, tablets, etc.
  • Wireless headsets
  • Wireless printers
  • Microwaves
  • Wireless speakers
  • Cordless phones
  • USB 3.0-compatible devices such as hard drives, thumb drives, LAN adapters, etc.

Nintendo advises Switch owners to move those devices three to four feet away from the console. If that doesn’t work, the company asks its customers to “please power these devices off while using the Nintendo Switch console,” which is about as useful as having a doctor tell you to stop moving your arm if your elbow’s making a weird clicking noise. But that’s not all; Nintendo also said to make sure the Switch is not:

  • Behind a TV
  • Near an aquarium
  • Placed in or under a metal object
  • Pressed against a large amount of wires and cords
  • Within three to four feet of another wireless device, such as a wireless speaker or a wireless access point.

That’s already proven to be a problem for us. Using the Switch in TV mode requires putting the device in a dock, and the easiest place to put that dock is behind the TV to which it’s connected. The alternative would be to either place it horizontally, which is worrisome because the inside of the dock isn’t padded to protect the console’s display, or to find somewhere else on the entertainment center to put the device. There simply isn’t any room for that.

Even if there were, the Switch is probably not going to be anyone’s only console. The PlayStation 4 and Xbox One have been out for years. Combine those with set-top boxes, wireless routers, speakers, and the like, and it’s hard to imagine a scenario in which the Joy-Con and the Switch will be in perfect harmony. And that’s assuming Nintendo’s explanation for the problem is accurate–a few tests with our console raise questions about that.

Our Joy-Con controller doesn’t experience problems all the time. Issues occur only when a specific part of the controller is touched. Moving closer to the Switch, or providing line-of-sight between the two instead of allowing the TV to sit in between them, does not alleviate this problem. Avoiding the particular spot on the Joy-Con requires holding the controller in an uncomfortable position that many people (including us) won’t naturally assume.

It’s good to see Nintendo respond to the Joy-Con problem, but the response itself feels a lot like Apple co-founder Steve Jobs’ admonishment that people experiencing cellular network problems with the iPhone 4 were just holding the device wrong. Switch itself is promising, but the Joy-Con issue somewhat mars the experience, and Nintendo’s unwillingness to admit fault raises serious questions about the company’s respect for its customers.

Go to Source

EVGA (Barely) Teases GeForce GTX 1080 Ti FTW3 With iCX

Manufacturers haven’t waited to announce new GeForce GTX 1080 Ti cards. Nvidia announced the 1080 Ti Founders Edition on February 28. Just three days later, EVGA teased what it called the EVGA GeForce GTX 1080 Ti FTW3 with iCX Technology, albeit with nothing more than a name and an image.

The 1080 Ti FTW3 iCX is the heir apparent to the all-too-similarly-named 1080 FTW2 iCX. Both use the company’s (sorta) new iCX cooler technology, which features a redesigned cooler shroud and PCB with nine different temperature sensors: three for memory, five for PWM, and one for the GPU. The iCX tech was inspired by problems with the ACX 3.0 cooling used by the company’s GeForce GTX 1080/1070 FTW cards; EVGA released free thermal pad kits and BIOS updates to help affected consumers address the issue. It also made iCX and, as the name implies, built it into the 1080 FTW2 iCX.

We examined the 1080 FTW2 iCX in February and concluded that EVGA’s efforts were worthwhile even if the card isn’t perfect:

With more sensors, the thermal pads we wanted to see added, and lots of features, EVGA is trying to get you to forget about its previous cooler. If the card also had a larger heat sink with integrated cooling for the memory and VRMs, it would be even better. It’s always good to know everything is in the green when it comes to temperatures, especially if you don’t have easy access to thermal imaging cameras and holes drilled in your backplate to verify thermal readings.

It’s probably a safe bet that the 1080 Ti FTW3 iCX will be much like its older sibling. The usual suspects will be different–the 1080 Ti boasts a Pascal GP102 GPU with 3,584 CUDA cores that boost up to 1,600MHz and 11GB of 11Gbps GDDR5X memory, which promises to make it faster than the 1080–but the overall goal of making people forget about ACX 3.0’s problems is likely the same.

EVGA didn’t reveal a release date, price, or full specs for the 1080 Ti FTW3 iCX; we expect to learn more about the card soon. In the meantime, Inno3D was the first manufacturer to announce custom GeForce GTX 1080 Ti cards, and it actually provided more information about the two products.


SMI Eye Tracking In A Vive HMD, Hands-On

After an announcement earlier this week, we were anxious to get some hands-on time with a Vive headset that had SMI’s eye-tracking technology on board. At GDC, we met with the company in Valve’s showcase area and got a chance to see what the companies built together.

Embedded, Not Bolted On

It’s important to understand how SMI’s eye trackers have been implemented into the Vive. They haven’t been bolted on or awkwardly attached in some other kludgy way; they’re embedded inside the headset, just as Tobii’s EyeChip has been.

Further, we were frankly astonished to learn that the SMI hardware inside is exactly the same thing we saw a year ago at Mobile World Congress. It’s still that little PCB and two skinny wires with tiny cameras at the ends, it can still offer 240Hz, and it still costs under $10 to implement.

Same as it ever was

We were not permitted to take photographs of the inside of the HMD, but we can confirm that the lenses were ringed with illuminators, which eye trackers need so they can get a clear look into your eyes.

We saw a similar ring of illuminators on the Vive HMD that had Tobii’s eye tracking, but we can’t confirm at this time whether SMI’s version uses the same illuminators or merely similar ones.

Vive With Tobii Eye Tracking: Note illuminator ring around the lenses

Our educated guess is that they’re the same. It makes sense that Valve or HTC would help build one illuminator ring and leave it at that. It’s really just about lighting and cameras; the lights illuminate the subject (in this case, your eyeballs) so the cameras can take better shots. Once the subject is well-lit, it doesn’t matter much which camera is taking the picture–in this case, either SMI’s or Tobii’s.


I should note that although calibrating SMI’s eye tracking is designed to be quick and easy, I had some issues getting it to “take.” All you have to do is follow a floating dot with your eyes for about five seconds. However, after my first calibration, the pink dot that tells you where your eyes are looking was jumping all over the place. We tried it a couple more times to no avail.

The problem, it turned out, was my glasses. In VR, glasses-wearing is a constant issue for those of us with crummy vision. They really don’t fit well in any HMD, but they fit well enough most of the time. It’s kind of uncomfortable, but the alternative is that you have to either switch to contacts or just accept that (depending on your prescription) parts of the VR experience will be a little bit blurry.

I generally just have to pick an inconvenience and roll with it. Apparently, wearing glasses confuses SMI’s eye tracker. I pulled off the headset, removed my glasses, put the headset back on, ran the calibration, and voila, it worked.

Input And Social Experiences

Once I donned the SMI-equipped Vive, I was taken through several demos. They were designed to demonstrate how you can use eye tracking as a selection tool, how it can enhance social interactions in VR, and how it can enable foveated rendering.

In the first demo, I was put into a virtual room with a couple of lamps and a desk littered with parts for a CPU cooler. I was tasked with assembling the CPU cooler, which was easier than it sounds. The parts were comically large, and the “assembly” was really just a matter of picking each part up in turn and making them touch one another, similar to the video game mechanics we’re all accustomed to seeing. But what was cool about the demo is that I just looked at each part to highlight it, which made it more intuitive to reach out and grab the item in question.

There were some Easter eggs in the room, too. I could look at one of the lamps and click a button on the controller to turn it on or off. This is a delightful method of input, and wonderfully intuitive. You look at something, it gets “marked” or “highlighted” because the eye tracker knows what you’re looking at, and you can interact–whether that’s a button click, a locomotion, or what have you.  
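The look-to-highlight mechanic boils down to finding which scene object sits closest to the gaze ray. Here is a minimal Python sketch of that idea; the object names, vectors, and 10-degree selection cone are made-up illustrations, not anything from SMI’s actual SDK:

```python
import math

def gaze_pick(gaze_dir, objects, max_angle_deg=10.0):
    """Return the name of the object nearest the gaze ray, or None if
    nothing falls inside the selection cone. `objects` maps names to
    unit direction vectors from the eye (illustrative values only)."""
    def angle_deg(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    best = min(objects, key=lambda name: angle_deg(gaze_dir, objects[name]))
    return best if angle_deg(gaze_dir, objects[best]) <= max_angle_deg else None

scene = {"lamp": (0.0, 0.0, 1.0), "fan": (1.0, 0.0, 0.0)}
# Looking almost straight ahead picks the lamp.
print(gaze_pick((0.05, 0.0, 0.999), scene))  # lamp
```

A real implementation would also ray-cast against object bounds, but angular distance to the gaze direction is the core of the trick.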

While in that particular demo, two other people joined me. We were all represented as cartoonish avatars, and we were standing around the desk. However, the combination of eye tracking, head tracking, and hand tracking (via the controllers) gave each of us some uniqueness.

It was further a reminder of how powerful VR can be for putting people who are not physically close to one another into the same virtual space. One of the two people who joined me was physically just a few feet away; the other was physically in Seattle. There were zero clues as to which person was where–no lag or latency, no degraded image, nothing.

We were able to make virtual eye contact because of the SMI technology. It’s hard to overstate how important that is within social VR; it’s one thing to use an avatar to get a sense of another person’s hand and head movements, but it’s quite another when they can make eye contact.

Think about all your real social interactions: Eye contact tells you a great deal. Some people make no eye contact, some people make too much, some people are a little shifty with their eyes, some people are looking over your shoulder for someone better to talk to, and so on. With eye tracking in a headset and a social VR setting, you get all of that.

I saw even more of that in the next demo, which put me into a room with those same two fellows, where we sat around a table to play cards. In this environment, you could stand or sit; I was escorted to a virtual chair, and someone in the real world behind me gave me a physical chair to sit down on. (That process was a little weird, to be honest.)

Once seated, they showed me how you can blink, raise your eyebrows, cross your eyes, and more; the avatars reflected all of those actions.

Foveated Rendering

In the final demo, I stood in a dark room and was greeted by a grid of translucent, rainbow-colored floating cubes that extended infinitely in all directions. Someone manning the demo toggled foveated rendering on and off. I tried to trick the system by focusing on what was happening in my peripheral vision, because foveated rendering doesn’t fully render the image there, so the edges would be the giveaway.

But I couldn’t sneak a peek, because, well, that’s exactly how eye tracking works. Conclusion: SMI’s foveated rendering within the Vive works as advertised, at least in this demo.
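Conceptually, foveated rendering just maps angular distance from the gaze point to a resolution scale: full detail in the fovea, coarser shading farther out. A toy sketch follows; the thresholds and rates are illustrative assumptions, not SMI’s actual parameters:

```python
def shading_rate(pixel_angle_deg):
    """Fraction of full resolution to render at, given the angular
    distance (degrees) between a pixel and the gaze point.
    Thresholds are made up for illustration."""
    if pixel_angle_deg <= 5.0:      # foveal region: full detail
        return 1.0
    elif pixel_angle_deg <= 20.0:   # parafoveal: half resolution
        return 0.5
    else:                           # periphery: quarter resolution
        return 0.25

print(shading_rate(2.0))   # 1.0 right where you're looking
print(shading_rate(45.0))  # 0.25 out in the periphery
```

Eye tracking is what makes the scheme invisible: the full-detail region follows your gaze faster than you can catch it slipping.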

SMI has come a long way in a year. When we first met the company and its technology at Mobile World Congress, it had a small table in a big ballroom where smaller outfits without a booth presence in the main convention halls gathered to show their wares to the media.

At GDC, SMI was in Valve’s exclusive showcase area showing off the tech in a specially modified Vive. We don’t know when we’ll see a shipping headset with SMI’s eye trackers, but it will probably be toward the end of this year.


Bungie Talks 'Destiny' Sequel, Final Live Event

It’s been two-and-a-half years since Bungie released Destiny, the online FPS with which it followed up the Halo franchise, and now the company has shared in a blog post some tidbits about what to expect from the game’s final live event as well as some information about its upcoming sequel.

The final live event is dubbed “Age of Triumph.” Bungie said it wants the event to be a “fun and memorable celebration” during which it will “look back upon the three incredible years we’ve shared as a community, and look ahead to some final challenges and rewards that await you in the weeks ahead.” The company will reveal more about the event on March 8, introduce the weekly rituals on March 15, and release the sandbox update on March 22.

Bungie was a little more forthcoming about Destiny 2–or whatever the sequel will be called–and how it will relate to the first game. The company said that “power, possessions, and Eververse-related items and currency will not carry forward” between titles. This is likely to rankle some players, many of whom have sunk countless hours into collecting everything Destiny has to offer, but Bungie said it believes this approach is the best option:

We believe this is the best path forward. It allows us to introduce the major advancements and improvements that all of us expect from a sequel, ensuring it will be the best game we can create, unencumbered by the past. We’re looking forward to sharing more details with you later this year for how we will honor your legacy in the future.

Not all will be lost in the transition from Destiny to its sequel: players will be able to carry over their character’s appearance between the games. That’s going to save people the frustration of having to recreate their favorite character, at least in the aesthetic sense, if not in a gameplay-related fashion. Here’s what the company said about bringing a character from Destiny to Destiny the Second:

We know that, just like us, you have grown fond of the Guardians you’ve created, so we do plan to preserve your character personalization. We are going to recognize the dedication and passion you’ve shown for this world. Specifically, the class, race, gender, face, hair, and marking selections for all characters that have achieved Level 20 and completed the Black Garden story mission will carry forward. We also plan to award those veteran accounts with honors that reflect your Destiny 1 accomplishments.

Many eyes will be on Bungie as Destiny‘s sequel nears release. The first game made $500 million on its first day; even new entries in established franchises rarely sell that well. Players have spent a lot of time in Destiny since that launch day, and Bungie has released several expansions that, by all accounts, have delivered on the company’s promises for Destiny better than the base game did. Following up on those successes won’t be a walk in the park.

Bungie said in February that it plans to release the Destiny sequel at some point in 2017.


What’s next for VR? Affordability and trade-offs, says Intel

When it comes to VR, you might say that Intel maintains a behind-the-scenes approach compared to Nvidia and AMD’s chest beating. 

However, high-end processing power is just as important as graphics for the demanding requirements of the Oculus Rift and HTC Vive on PC. For that reason, Intel is far from an underdog in the VR space, although the company is well aware of the dangers of VR’s starkly divided audience.

On the high-end, a VR headset will set you back upwards of $600 (about £489, AU$792). That’s without factoring in the $1,000+ PC required to use it. Meanwhile, low-end VR is littered with mobile smartphone solutions, offering bite-sized games and apps to those curious about the technology, but not nearly curious enough to drop a more substantial chunk of cash. 

Save for perhaps Sony’s PlayStation VR, there is hardly anything occupying the middle ground. 

At the 2017 Game Developers Conference (GDC) this week, we spoke with Kim Pallister, director of the Intel VR Center of Excellence, about the virtual reality improvements the industry needs to see next. During our conversation, Pallister stressed that VR is lacking when it comes to support for mid-range devices such as lightweight Intel Kaby Lake-powered notebooks.

“We really think that for [VR] to hit the mainstream – for it to get to a critical mass where developers can make money – there needs to be good/better/best solutions,” he said. “There needs to be a range of price and things that play in the mainstream space.”

As such, Intel has been working with its partners to make VR devices cheaper and more accessible without stripping them of high-quality components. A major part of this puzzle was a collaboration with Microsoft, revealed back in October of last year: a lineup of Windows 10-compatible VR HMDs.

These come from a variety of manufacturers, including Acer, Dell, and HP, featuring various styles and designs but with one thing in common: low PC spec requirements.

“As much as everyone loves the high-end enthusiast stuff, that alone is not enough to sustain a sizable market with a wide variety of content,” Pallister asserted. “We’ve been focused on the hard problem of getting a mainstream-type solution to market and Microsoft is one of the primary partners we’ve been working with there.”

To migrate VR over to more conservative PCs, however, there are clearly sacrifices to be made, and Intel is well aware of them. The bigger challenge is determining where users won’t mind the compromise. From Pallister’s perspective, a lot of pixels are already being wasted in VR, leaving plenty of opportunities to make virtual reality less reliant on high-end discrete GPUs.

“There are a lot of problems in the VR space where people just said, ‘Just throw more GPU at it, you’ve got a high-end gamer GPU in there, you’ve got a $400 GPU, just solve this problem with more GPU,’” Pallister told us.

He says that developers have a tendency to render everything at the highest detail, even in situations where it isn’t needed. Because VR is 360 degrees of visual stimulation, there’s a lot being depicted in-game while the player only sees a narrow cone. 

As Pallister puts it, you have “all of these other pixels that are way too much detail for what you can’t see out there, you just throw all that away.”

For mobile VR, high-end PC VR and everything in between, the trick is to “not waste time putting detail where you don’t need it.”
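A quick back-of-envelope calculation shows how narrow that visible cone really is. A circular field of view with half-angle θ subtends a solid angle of 2π(1 − cos θ) out of the sphere’s 4π, so a roughly Vive-class 110-degree FOV covers only about a fifth of the full sphere:

```python
import math

def visible_fraction(fov_deg):
    """Fraction of the full sphere covered by a circular cone with the
    given total field of view: 2*pi*(1 - cos(theta)) over 4*pi."""
    half_angle = math.radians(fov_deg / 2)
    return (1 - math.cos(half_angle)) / 2

print(round(visible_fraction(110), 3))  # 0.213 -- about a fifth of the sphere
```

Real headsets have rectangular, per-eye frusta rather than a clean cone, so this is only an approximation, but it makes Pallister’s point: most of a uniformly detailed 360-degree scene is outside your gaze at any moment.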

“I think that’s something you’ll see the whole industry focus on over the next couple years,” he continued, “which will be a piece of how you get down to more affordable price points, VR that works with thin and light notebooks [and] that doesn’t require a big, beefy desktop.”

Pallister believes there’s an audience for VR at the mid-range, even if it’s not gaining the traction he contends it deserves. 

“Part of the benefit of an open platform like PC is the market will find ways to fill that spectrum in,” he said. “There are people who like VR that don’t think the phone is good enough that can’t afford a Vive, so let’s aim for a solution there.”

Luckily, according to Pallister, we’re already on the right path to seeing VR work on a wider range of devices, though it’s a far cry from where we could be. As he reminded us, Oculus took its system requirements down a peg last year. Letting in users with CPUs as modest as an Intel Core i5-4590 yields benefits for everyone with a virtual reality HMD.

“You saw this even with Oculus when they introduced asynchronous space warp and their scalability features last year,” he said. “With one software feature, they were able to drop their graphics performance requirements by 40%.”

“Is there a trade-off in quality? Sure, but that’s okay. That’s the way PC gaming works: people can come in at different points and then decide if they want to dial things up or down.”
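The arithmetic behind a feature like asynchronous spacewarp is straightforward: if the GPU renders every other frame and the runtime synthesizes the in-between frames from motion data, the per-frame rendering budget doubles. A toy model with illustrative numbers (it ignores the warp’s own overhead, which is why the real-world requirement drop was closer to 40% than 50%):

```python
def frame_budget_ms(refresh_hz, rendered_every_n=1):
    """GPU time available per *rendered* frame, in milliseconds, when
    the runtime synthesizes the skipped frames. Illustrative model."""
    return 1000.0 / refresh_hz * rendered_every_n

print(round(frame_budget_ms(90), 1))     # 11.1 ms at native 90 Hz
print(round(frame_budget_ms(90, 2), 1))  # 22.2 ms when warping every other frame
```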


AMD's AM4 Ryzen Chipsets

Ryzen is here, and with it comes a new generation of motherboard chipsets from AMD. We covered the X370 in some detail in our Ryzen review and touched lightly on the others before, but we’ll compare their features more closely in this article.

Common Features

Before we discuss what sets each chipset apart, we should outline what they have in common. All of AMD’s AM4 chipsets support SATA-III and SATA Express ports. As SATA Express never really got off the ground, however, each SATA Express connection can instead be used to support two SATA-III ports.

All AM4 chipsets also have two PCI-E 3.0 lanes dedicated to NVMe storage devices. The connection can be extended to an x4 NVMe link at the cost of two SATA-III ports, or the vendor can remove NVMe support entirely and reuse the PCI-E 3.0 lanes for something else.
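That trade-off can be sketched as a simple lane budget. The numbers below follow the article’s description (two dedicated NVMe lanes, expandable to x4 at the cost of two SATA-III ports); the function itself is a purely illustrative model, not anything from AMD’s documentation:

```python
def storage_config(nvme_lanes, base_sata=4):
    """Illustrative AM4 storage budget: growing the NVMe link from x2
    to x4 borrows the lanes behind two SATA-III ports."""
    if nvme_lanes not in (0, 2, 4):
        raise ValueError("NVMe link must be x0, x2, or x4")
    sata_ports = base_sata - (2 if nvme_lanes == 4 else 0)
    return {"nvme_lanes": nvme_lanes, "sata_ports": sata_ports}

print(storage_config(2))  # keep all SATA ports with an x2 NVMe link
print(storage_config(4))  # an x4 NVMe link costs two SATA-III ports
```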

AMD’s AM4 chipsets also do not support RAID 5. This feature is crucial for users who need to store lots of data securely, and its absence could hamper AMD’s AM4 sales to small businesses.
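For context, RAID 5 stripes data across three or more drives and stores one drive’s worth of XOR parity, so any single failed drive can be rebuilt from the survivors while losing only one drive’s capacity. A minimal sketch of that reconstruction:

```python
from functools import reduce

def xor_parity(blocks):
    """Byte-wise XOR across equally sized blocks. XOR-ing any n-1
    blocks with the parity block reproduces the missing one."""
    return bytes(reduce(lambda a, b: a ^ b, byte_tuple)
                 for byte_tuple in zip(*blocks))

# Three data blocks striped across drives, plus one parity block.
data = [b"disk0", b"disk1", b"disk2"]
parity = xor_parity(data)

# Lose drive 1; XOR the survivors with the parity block to rebuild it.
rebuilt = xor_parity([data[0], data[2], parity])
print(rebuilt == data[1])  # True
```

RAID 10 (which the AM4 chipsets do offer) also tolerates drive failure, but it mirrors everything and therefore sacrifices half the total capacity, which is exactly why small businesses with lots of data tend to want RAID 5.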

All AM4 chipsets connect to the CPU with a PCI-E 3.0 x4 connection, which is essentially equivalent to Intel’s DMI 3.0 connection.
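The equivalence is easy to check with back-of-envelope math: PCI-E 3.0 signals at 8 GT/s per lane with 128b/130b encoding, so an x4 link carries roughly 3.94 GB/s in each direction, the same figure DMI 3.0 provides:

```python
def pcie3_bandwidth_gbs(lanes):
    """Usable PCI-E 3.0 bandwidth in GB/s per direction:
    8 GT/s per lane, 128b/130b encoding, 8 bits per byte."""
    return lanes * 8e9 * (128 / 130) / 8 / 1e9

print(round(pcie3_bandwidth_gbs(4), 2))  # 3.94 GB/s, matching DMI 3.0
```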

The X370 Flagship

AMD currently has five AM4 chipsets in the works. The X370 chipset is at the top of the stack, and it features more connectivity support than its counterparts. We can roughly compare it to Intel’s Z270 chipset because it supports overclocking, and it can also split the CPU’s PCI-E lanes between two GPU slots.

Although this arrangement is technically inferior to Intel’s Z270 chipset, which can split the CPU’s PCI-E lanes into an x8/x4/x4 configuration, the difference will likely not matter to most users. Multi-GPU configurations containing more than two graphics cards are uncommon. This point is driven home by the fact that Nvidia ended support for SLI beyond a two-GPU configuration with its 10-series graphics cards.

AMD Desktop AM4 Chipsets

| | X370 | B350 | A320 | X300 | A300 |
| --- | --- | --- | --- | --- | --- |
| Form Factor | — | — | — | Small Form Factor | Small Form Factor |
| CPU PCI-E 3.0 Config Support | 1×16 or 2×8 | 1×16 | 1×16 | 1×16 or 2×8 | 1×16 |
| Memory Support (Channels/DIMMs Per Channel) | DDR4 2667MHz (2/2) | DDR4 2667MHz (2/2) | DDR4 2667MHz (2/2) | DDR4 2667MHz (2/2) | DDR4 2667MHz (2/2) |
| CPU Overclocking Support | Yes | Yes | No | Yes | No |
| RAID Support | 0/1/10 | 0/1/10 | 0/1/10 | 0/1 only | 0/1 only |
| Chipset Maximum PCI-E Lanes | 8 PCI-E 2.0 lanes | 6 PCI-E 2.0 lanes | 4 PCI-E 2.0 lanes | 4 PCI-E 3.0 lanes | 4 PCI-E 3.0 lanes |
| NVMe Support | x2 (x4 optional) | x2 (x4 optional) | x2 (x4 optional) | x2 (x4 optional) | x2 (x4 optional) |
| USB Support (2.0/3.0/3.1 Gen2) | 6/6/2 | 6/2/2 | 6/2/1 | — | — |
| SATA-III (6Gbps) Ports | 4 | 2 | 2 | — | — |
| SATA Express | 2 | 2 | 2 | — | — |
The Mainstream And Essential Products

Although there are five AM4 chipsets, AMD primarily relies on X370, B350, and A320 to address the consumer desktop market.

The B350 solution has a unique position; like its main competitor, Intel’s H270 PCH, B350 does not officially support SLI or CrossFire. Unlike H270, however, B350 allows you to overclock unlocked CPUs. Depending on how AMD’s board partners price B350 motherboards, this could lead to a strong advantage in the budget overclocking market.

AMD’s A320 SKU, however, lacks overclocking support, so it’s probably best compared to Intel’s B250 and H110 PCHs.

B350 also loses two SATA-III ports and two PCI-E 2.0 lanes compared to the X370 chipset, and it provides four fewer USB 3.0 ports. A320 drops an additional two PCI-E lanes and one USB 3.1 Gen 2 port.

The SFF Solutions

AMD designed the X300 and A300 AM4 chipsets for compact, minimalist systems. Both feature limited connectivity options, but this is somewhat mitigated by the Ryzen CPU’s built-in SATA and NVMe controllers. Although Ryzen CPUs have evolved to the point of essentially being SoCs, they have not quite reached the point that you can use them without an accompanying chipset. As the SFF chipsets have a relatively small footprint, however, OEMs can use them to build fairly compact systems.

The X300 chipset, which AMD designed as an enthusiast SFF solution, can overclock unlocked CPUs and split the CPU’s PCI-E lanes between multiple GPUs. The A300, however, lacks these abilities.


Inno3D Is Cooking Up Two 1080 Ti With iChill Coolers

Inno3D announced that in addition to the Founders Edition 1080 Ti, which launches next week, the company plans to launch two variants with the company’s iChill three- and four-fan coolers.

On February 28, Nvidia founder and CEO Jen-Hsun Huang took the stage at GDC to reveal the long-rumored GeForce GTX 1080 Ti. The new card boasts a Pascal GP102 GPU with 3,584 CUDA cores that boost up to 1,600MHz. It also comes equipped with 11GB of 11Gbps GDDR5X memory.

On paper, these specs are comparable to the mighty Titan X Pascal, and Nvidia claimed the 1080 Ti outperforms the Titan X. What’s more, board partners were not permitted to change any part of the Titan X, whereas they are free to customize and improve the 1080 Ti. If the Founders Edition 1080 Ti is faster than a Titan X, custom cards with advanced cooling solutions should prove even faster still.

Inno3D is the first graphics card manufacturer to announce a custom GeForce 1080 Ti card–and it has two in the pipeline. Inno3D revealed the GeForce GTX 1080 Ti iChiLL X3 and GeForce GTX 1080 Ti iChiLL X4 graphics cards, which include triple- and quad-fan cooling systems, respectively.

We don’t know much about the two cards yet. We know that both cards feature Inno3D’s iChill HerculeZ backplate and the company’s Air Boss fan shroud. The X3 features three large fans to cool a large heatsink; the X4 model adds an extra fan to the top of the card to draw air upwards when the card is vertical.

Inno3D didn’t reveal the base or boost clock speeds, and it didn’t say anything about the power draw. But we can infer from the Inno3D GeForce GTX 1080 iChiLL X3 and X4 models that the company may have tweaked the power delivery system, and both cards will almost certainly come overclocked from the factory.

Inno3D hasn’t yet announced the price or availability date for the GeForce GTX 1080 Ti iChiLL X3 and X4 graphics cards. Nvidia’s Founders Edition cards ship next week. We imagine Inno3D’s custom cards aren’t far behind.
