Mechs In VR: A Stroke Of Locomotion Genius In 'Archangel'

VR presents numerous challenges for game developers, not the least of which involves how to frame scenes and help you navigate them. In Archangel, an upcoming VR title, developer Skydance Interactive used mechs to near perfection.

Mech Pilot In A Desert City

In the demo we saw at GDC, you enter the world of Archangel as a mech pilot. You start out in the cockpit of your mechanical beast, what Skydance Interactive calls a “six-story-high war machine.” You can see a facsimile of your hands courtesy of the controllers you’re holding (in our case, the Vive controllers), and when you move your hands, your mech’s giant robot hands move as well.

You’re on a planet that may or may not be Earth; the setting is a futuristic-looking city, with tall buildings rising out of a desert floor.

A voice in your ear–the AI that shepherds your team through battle–has you run a “systems check” to get you accustomed to the game controls. Simply put, your right arm switches between a machine gun and a rail gun; your left hand fires single missiles or “paints” multiple bogies with a targeting system before blasting them all out of the sky; and you can toggle on a shield for either arm.

That’s pretty much it. Your mech stalks along a predetermined route, but you can look all around from the cockpit, just like you would if you were actually piloting one. (We should be so lucky.)

You have a team with you, and they fly around in hover vehicles as you all encounter waves of flying space jets as well as tanks slogging over the sandy terrain. Eventually, you face soldiers on the ground as well as beefier airships that are more difficult to take down.

As you make your way through the violent city streets, your comms are interrupted a couple of times by the enemy–some kind of rogue colonel, it seems. He offers you a devil’s bargain; your teammates urge you not to pay him any mind.

By the end of the demo, you’re in the firefight of your life, and just as you face off against a powerful airship, a giant metal…something…clamps down on your mech from above.

Everything goes dark. Fin.

Ideal Locomotion

Achieving quality locomotion in VR games is difficult, to say the least, but Skydance Interactive found a near-perfect solution in the mech concept.

You have a delightful perch–six stories high according to the developer, remember–and you’re positioned close to the front of it, so you have a nice wide field of view. You never really lose that heart-pounding sense of height.

The mech walks, but you don’t. Normally, that’s an issue that can make you nauseous, but the developers used a few clever tricks to make it work. First of all, you’re moving rather slowly, with big, deliberate stomps, as a giant robot killing machine is wont to do. That means the scenery doesn’t move too fast for you.

Second, you can see just enough of the cockpit to remember that you’re essentially inside a vehicle. Humans are accustomed to moving quickly inside vehicles without getting nauseous–the occasional bout of car or air sickness aside–and by adding those reference points, Skydance Interactive gives you that same sense. (In that way it reminds us of CCP Games’ Gunjack.) It helps that this is a game you can play while seated.
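
For the technically curious, here’s a minimal sketch of how those two tricks might fit together in an engine’s per-frame update. This is our own illustration with hypothetical object names, not Skydance Interactive’s code:

```python
# Our own illustration of the comfort tricks described above; every object
# and attribute name here is hypothetical, not Skydance Interactive's code.

STOMP_SPEED = 1.4  # meters per second; a hypothetical, deliberately slow pace


def update_locomotion(mech, cockpit, headset, dt):
    """Per-frame update: slow, path-locked movement with a cockpit reference frame."""
    # Trick 1: the mech advances slowly along its predetermined route, so the
    # optical flow the player sees stays gentle.
    mech.position += mech.path_direction() * STOMP_SPEED * dt

    # Trick 2: the cockpit is parented to the mech, not to the player's head.
    # Its visible struts and canopy give the brain a stable "vehicle" frame,
    # the same cue that keeps most people comfortable in cars and planes.
    cockpit.position = mech.position + mech.cockpit_offset

    # The player's head moves freely inside the cockpit; the game only reads
    # head pose from the headset and never forces camera rotation.
    camera_position = cockpit.position + headset.local_position
    camera_rotation = headset.orientation
    return camera_position, camera_rotation
```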

Butter-Smooth Gameplay

It’s hard to describe how satisfying it is to move your physical arms and see giant robot arms move in concert with them. You feel powerful. Dangerous. Which is what you are when you’re piloting your mech.

You plod along a preset path, periodically stopping to fight waves of enemies. All the while, your NPC teammates are chattering over the comms, calling out the attacks as they come and delivering key info about what’s happening so you can react accordingly.

As we mentioned earlier, you have two weapon options on each arm, plus a shield for each arm that you can engage with a button press. You don’t have unlimited ammo, but you do have fairly deep magazines, so you don’t have to reload every few seconds. The different enemies go down faster with different weapons, so there’s a bit of a dance to rotating through your weaponry.

Your shields are the same: They last for quite a while, but not forever. At times there are a lot of bullets flying at you, so a strategy we quickly adopted involved pulling up a shield with one hand while blasting away with the other. When the shield expires, you switch hands. That’s perhaps a little obvious, but because each weapon is ideal for taking out a certain type of enemy, you have to be disciplined about what you target with the weapons on a given hand.

Put another way: We suspect this will turn into one of those minute-to-learn, lifetime-to-master kinds of games.

Throughout the demo, the gameplay was crystal clear and butter smooth. All of the actions, even in the middle of an intense firefight, evinced no lag or issues of any kind.

With fun, engaging gameplay, attractive visuals, clever nausea-free locomotion, and a seated design, Archangel is the type of VR title you can lose yourself in for long stretches of time.

The game will arrive in July 2017 for the HTC Vive, Oculus Rift, and PSVR, and on Steam.


Uber uses 'secret program' Greyball to hide from regulators

Uber has been using a secret program to prevent undercover regulators from shutting down the taxi-hailing service in cities around the world.

The software, called Greyball, was developed to help protect the company from “violations of terms of service”.

But data collected through Uber’s phone app has been used to identify officials monitoring its drivers.

Uber has acknowledged that Greyball has been used in multiple countries, the New York Times reports.

The tool has enabled the company to monitor users’ habits and pinpoint regulators posing as ordinary passengers while they try to collect evidence that the service is breaking local taxi laws.

The software works by collecting geolocation data and credit card information to determine whether the user is linked to an institution or law enforcement authority.

A “fake” version of the app would then allow those individuals suspected of attempting to entrap drivers to hail a cab, only to have their booking cancelled.
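
Based purely on that description, a greyballing heuristic could look something like the sketch below. Every signal, field name, and threshold here is our own hypothetical illustration, not Uber’s implementation:

```python
# Hypothetical reconstruction based only on the reporting above; none of
# these signals, field names, or thresholds come from Uber.

INSTITUTIONAL_CARD_BINS = {"401234", "510987"}  # invented example values


def looks_like_regulator(user, enforcement_offices):
    """Score an account against the two signals the report describes."""
    signals = 0

    # Geolocation: does the account repeatedly open the app near known
    # enforcement or government offices?
    if any(user.frequents(office.location) for office in enforcement_offices):
        signals += 1

    # Payment data: is the card's issuing number linked to an institution?
    if user.card_bin in INSTITUTIONAL_CARD_BINS:
        signals += 1

    return signals >= 2


def handle_ride_request(user, dispatcher, enforcement_offices):
    if looks_like_regulator(user, enforcement_offices):
        # The "fake" app view: accept the booking so everything looks normal,
        # then cancel it so no real driver ever arrives.
        booking = dispatcher.accept(user)
        dispatcher.cancel(booking)
    else:
        dispatcher.assign_driver(user)
```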

The existence of the Greyball program was revealed in an article published in the New York Times on Friday, which attributed the information to four current and former Uber employees, who were not named.

“This program denies ride requests to fraudulent users who are violating our terms of service,” Uber said in a statement.

“Whether that’s people aiming to physically harm drivers, competitors looking to disrupt our operations, or opponents who collude with officials on secret ‘stings’ meant to entrap drivers,” it added.

It comes in the same week that the chief executive of Uber, Travis Kalanick, was forced to apologise after a video emerged of him swearing at one of the company’s drivers. Just two weeks earlier he apologised for “abhorrent” sexism at the company.


Impulse Gear Confirms 'Farpoint' Features Co-Op, Launches Mid-May With PSVR Aim Controller

When Impulse Gear’s Farpoint launches in May, you won’t have to explore the alien world you’re lost on alone. The developer revealed the game has co-operative gameplay.

Last June, Sony announced that a new independent development studio called Impulse Gear was working on an exclusive PlayStation VR first-person shooter called Farpoint. Back then, not much was known about the game–and, truth be told, we still don’t know a whole lot about it.

We do know this: Farpoint is set on an alien world filled with hostile creatures. The spacecraft in which you were traveling crashed on the planet, and now you must fight to survive. Sony introduced Farpoint as a single-player experience, which, honestly, sounds terrifying. Who wants to be alone on an alien planet with no one to talk to and no one to lend a helping hand? Not this guy, that’s for sure. But let me bring a buddy along, and we’ll tear those aliens a new one.

Fortunately, when Farpoint launches I’ll be able to do that, and so will you. During GDC, Impulse Gear revealed that Farpoint includes an online co-operative gameplay mode, so you won’t have to face the solitude of being the only human on the planet.

The developers built the game around a special peripheral called the PSVR Aim Controller. Impulse Gear worked with Sony to build the gun peripheral, which features a tracking ball like the one on the Move controllers and a thumbstick to enable locomotion. The Aim controller launches alongside Farpoint, which is so far the only title that supports it, but we expect to see more titles add support for the gun-like controller in the future.

Impulse Gear said that Farpoint launches on May 16. It is currently slated as a PSVR exclusive, but we’re not sure if the release is a timed exclusive or not.


Nintendo Finally Responds To Switch's Joy-Con Issues

Nintendo’s Switch console made its way to consumers today. We have one in house, and already, we can confirm that the Joy-Con controllers have connectivity problems. Our left Joy-Con, specifically, frequently and consistently delays or drops the connection to the Switch console.

This isn’t an unknown problem. Several reviewers complained about this issue, and the general consensus was that people with large hands might obstruct the signal, but otherwise the Joy-Con was hunky-dory. It’s not. Nintendo finally acknowledged the problem today with an article on its support website, advising consumers with faulty Joy-Con controllers to limit potential sources of interference. That list of problem devices includes:

  • Cell phones, laptops, tablets, etc.
  • Wireless headsets
  • Wireless printers
  • Microwaves
  • Wireless speakers
  • Cordless phones
  • USB 3.0-compatible devices such as hard drives, thumb drives, LAN adapters, etc.

Nintendo advises Switch owners to move those devices three to four feet away from the console. If the problem persists, the company asks customers to “please power these devices off while using the Nintendo Switch console,” which is about as useful as having a doctor tell you to stop moving your arm if your elbow’s making a weird clicking noise. But that’s not all; Nintendo also said to make sure the Switch is not:

  • Behind a TV
  • Near an aquarium
  • Placed in or under a metal object
  • Pressed against a large amount of wires and cords
  • Within three to four feet of another wireless device, such as a wireless speaker or a wireless access point

That’s already proven to be a problem for us. Using the Switch in TV mode requires putting the device in a dock, and the easiest place to put that dock is behind the TV to which it’s connected. The alternative would be to either place it horizontally, which is worrisome because the inside of the dock isn’t padded to protect the console’s display, or to find somewhere else on the entertainment center to put the device. There simply isn’t any room for that.

Even if there were, the Switch is probably not going to be anyone’s only console. The PlayStation 4 and Xbox One have been out for years. Combine those with set-top boxes, wireless routers, speakers, and the like, and it’s hard to imagine a scenario in which the Joy-Con and the Switch will be in perfect harmony. And that’s assuming Nintendo’s explanation for the problem is accurate–a few tests with our console raise questions about that.

Our Joy-Con controller doesn’t experience problems all the time. Issues occur only when a specific part of the controller is touched. Moving closer to the Switch, or providing line-of-sight between the two instead of allowing the TV to sit in between them, does not alleviate this problem. Avoiding the particular spot on the Joy-Con requires holding the controller in an uncomfortable position that many people (including us) won’t naturally assume.

It’s good to see Nintendo respond to the Joy-Con problem, but the response itself feels a lot like Apple co-founder Steve Jobs’ admonishment that people experiencing cellular network problems with the iPhone 4 were just holding the device wrong. Switch itself is promising, but the Joy-Con issue somewhat mars the experience, and Nintendo’s unwillingness to admit fault raises serious questions about the company’s respect for its customers.


EVGA (Barely) Teases GeForce GTX 1080 Ti FTW3 With iCX

Manufacturers wasted no time announcing new GeForce GTX 1080 Ti cards. Nvidia announced the 1080 Ti Founders Edition on February 28; just three days later, EVGA teased what it called the EVGA GeForce GTX 1080 Ti FTW3 with iCX Technology, albeit with nothing more than a name and an image.

The 1080 Ti FTW3 iCX is the heir apparent to the all-too-similarly-named 1080 FTW2 iCX. Both use the company’s (sorta) new iCX cooler technology, which features a redesigned cooler shroud and PCB with nine different temperature sensors: three for memory, five for PWM, and one for the GPU. The iCX tech was inspired by problems with the ACX 3.0 cooling used by the company’s GeForce GTX 1080/1070 FTW cards; EVGA released free thermal pad kits and BIOS updates to help affected consumers address the issue. It also made iCX and, as the name implies, built it into the 1080 FTW2 iCX.


We examined the 1080 FTW2 iCX in February and concluded that EVGA’s efforts were worthwhile even if the card isn’t perfect:

With more sensors, the thermal pads we wanted to see added, and lots of features, EVGA is trying to get you to forget about its previous cooler. If the card also had a larger heat sink with integrated cooling for the memory and VRMs, it would be even better. It’s always good to know everything is in the green when it comes to temperatures, especially if you don’t have easy access to thermal imaging cameras and holes drilled in your backplate to verify thermal readings.

It’s probably a safe bet that the 1080 Ti FTW3 iCX will be much like its older sibling. The usual suspects will be different–the 1080 Ti boasts a Pascal GP102 GPU with 3,584 CUDA cores that boost up to 1,600MHz, plus 11GB of 11Gbps GDDR5X memory, which promises to make it faster than the 1080–but the overall goal of making people forget about ACX 3.0’s problems is likely the same.

EVGA didn’t reveal a release date, price, or full specs for the 1080 Ti FTW3 iCX; we expect to learn more about the card soon. In the meantime, Inno3D was the first manufacturer to announce custom GeForce GTX 1080 Ti cards, and it actually provided more information about the two products.


SMI Eye Tracking In A Vive HMD, Hands-On

After an announcement earlier this week, we were anxious to get some hands-on time with a Vive headset that had SMI’s eye-tracking technology on board. At GDC, we met with the company in Valve’s showcase area and got a chance to see what the companies built together.

Embedded, Not Bolted On

It’s important to understand how SMI’s eye trackers have been implemented into the Vive. They haven’t been bolted on or awkwardly attached in some other kludgy way; they’re embedded inside the headset, just as Tobii’s EyeChip has been.

Further, we were frankly astonished to learn that the SMI hardware inside is exactly the same thing we saw a year ago at Mobile World Congress. It’s still that little PCB and two skinny wires with tiny cameras at the ends, it can still offer 240Hz, and it still costs under $10 to implement.

Same as it ever was

We were not permitted to take photographs of the inside of the HMD, but we can confirm that the lenses were ringed with illuminators, which eye trackers need so they can get a clear look into your eyes.

We saw a similar ring of illuminators on the Vive HMD that had Tobii’s eye tracking, but we can’t confirm at this time whether SMI’s version uses the very same illuminators or merely similar ones.

Vive With Tobii Eye Tracking: Note illuminator ring around the lenses

Our educated guess is that they’re the same. It would make sense for Valve or HTC to help build one illuminator ring and leave it at that. It’s really just about lighting and cameras; the lights illuminate the subject (in this case, your eyeballs) so the cameras can take better shots. Once the subject is well lit, it doesn’t much matter which camera is taking the picture–in this case, either SMI’s or Tobii’s.


Calibration

I should note that although calibrating SMI’s eye tracking is designed to be quick and easy, I had some issues getting it to “take.” All you have to do is follow a floating dot with your eyes for about five seconds. However, after my first calibration, the pink dot that tells you where your eyes are looking was jumping all over the place. We tried it a couple more times to no avail.

The problem, it turned out, was my glasses. In VR, glasses-wearing is a constant issue for those of us with crummy vision. They really don’t fit well in any HMD, but they fit well enough most of the time. It’s kind of uncomfortable, but the alternative is that you have to either switch to contacts or just accept that (depending on your prescription) parts of the VR experience will be a little bit blurry.

I generally just have to pick an inconvenience and roll with it. Apparently, wearing glasses confuses SMI’s eye tracker. I pulled off the headset, removed my glasses, put the headset back on, ran the calibration, and voila, it worked.
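
Follow-the-dot calibration generally works by fitting a mapping from raw pupil coordinates to the known positions of the moving dot; stray reflections and refraction from eyeglass lenses can corrupt those raw samples, which would explain the behavior I saw. Here’s a generic least-squares sketch of such a fit; it’s our illustration, not SMI’s actual algorithm:

```python
# A generic follow-the-dot calibration fit, not SMI's actual algorithm:
# solve a least-squares mapping from raw pupil coordinates to the known
# on-screen positions of the moving dot.
import numpy as np


def fit_calibration(pupil_samples, dot_positions):
    """pupil_samples: (N, 2) raw tracker readings captured while the user
    follows the dot; dot_positions: (N, 2) where the dot actually was."""
    # Append a bias column so the fit includes an offset term.
    X = np.hstack([pupil_samples, np.ones((len(pupil_samples), 1))])
    # Least-squares solve of X @ W ~= dot_positions.
    W, _, _, _ = np.linalg.lstsq(X, dot_positions, rcond=None)
    return W


def estimate_gaze(raw_pupil, W):
    """Map a live pupil reading to an estimated gaze position."""
    x, y = raw_pupil
    return np.array([x, y, 1.0]) @ W
```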

Input And Social Experiences

Once I donned the SMI-equipped Vive, I was taken through several demos. They were designed to demonstrate how you can use eye tracking as a selection tool, how it can enhance social interactions in VR, and how it can enable foveated rendering.

In the first demo, I was put into a virtual room with a couple of lamps and a desk littered with parts for a CPU cooler. I was tasked with assembling the CPU cooler, which was easier than it sounds. The parts were comically large, and the “assembly” was really just a matter of picking each part up in turn and making them touch one another, similar to the video game mechanics we’re all accustomed to seeing. But what was cool about the demo was that I just looked at each part to highlight it, which made it more intuitive to reach out and grab the item in question.

There were some Easter eggs in the room, too. I could look at one of the lamps and click a button on the controller to turn it on or off. This is a delightful and wonderfully intuitive method of input. You look at something, it gets “marked” or “highlighted” because the eye tracker knows what you’re looking at, and then you can interact–whether that’s a button click, a locomotion command, or what have you.
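
Under the hood, this kind of gaze selection boils down to raycasting along the tracked gaze direction each frame. Here’s a minimal sketch with a hypothetical scene and controller API; it is not SMI’s actual SDK:

```python
# Minimal sketch of gaze-driven highlighting and selection; the scene and
# controller APIs are hypothetical stand-ins, not SMI's SDK.


def update_gaze_interaction(eye_tracker, scene, controller):
    # Cast a ray from the headset along the tracked gaze direction.
    origin, direction = eye_tracker.gaze_ray()
    target = scene.raycast(origin, direction)

    # Highlight whatever the player is looking at, so reaching out
    # (or pressing a button) feels natural.
    scene.clear_highlights()
    if target is not None:
        target.set_highlighted(True)

        # Gaze picks the object; a button press acts on it: toggling a lamp,
        # grabbing a cooler part, triggering locomotion, and so on.
        if controller.button_pressed("trigger"):
            target.interact()
```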

While in that particular demo, two other people joined me. We were all represented as cartoonish avatars, and we were standing around the desk. However, the combination of eye tracking, head tracking, and hand tracking (via the controllers) gave each of us some uniqueness.

It was a further reminder of how powerful VR can be for putting people who are not physically close to one another into the same virtual space. One of the two people who joined me was physically just a few feet away; the other was in Seattle. There were zero clues as to which person was where–no lag or latency, no degraded image, nothing.

We were able to make virtual eye contact because of the SMI technology. It’s hard to overstate how important that is within social VR; it’s one thing to use an avatar to get a sense of another person’s hand and head movements, but it’s quite another when they can make eye contact.

Think about all your real social interactions: Eye contact tells you a great deal. Some people make no eye contact, some people make too much, some people are a little shifty with their eyes, some people are looking over your shoulder for someone better to talk to, and so on. With eye tracking in a headset and a social VR setting, you get all of that.

I saw even more of that in the next demo, which put me into a room with those same two fellow participants, where we sat around a table to play cards. In this environment, you could stand or sit; I was escorted to a virtual chair, and someone in the real world behind me gave me a physical chair to sit down on. (That process was a little weird, to be honest.)

Once seated, they showed me how you can blink, raise your eyebrows, cross your eyes, and more; the avatars reflected all of those actions.

Foveated Rendering

In the final demo, I stood in a dark room and was greeted by a grid of floating cubes that extended infinitely in all directions. The cubes were translucent and rainbow-colored. Someone manning the demo toggled foveated rendering on and off. I tried to trick the system by focusing on what was happening in my peripheral vision; because foveated rendering doesn’t fully render the image there, the edges would be the giveaway.

But I couldn’t sneak a peek, because, well, that’s exactly how eye tracking works. Conclusion: SMI’s foveated rendering within the Vive works as advertised, at least in this demo.
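
Conceptually, eye-tracked foveated rendering reduces to drawing a small, gaze-centered region at full detail and everything else more cheaply; because that region follows your eye, the low-detail periphery can never land where you’re actually looking. Here’s a rough sketch with hypothetical APIs; shipping systems implement this on the GPU:

```python
# Conceptual sketch only; these renderer and tracker APIs are hypothetical,
# and real implementations work at the GPU level (e.g., variable-rate shading).

FOVEA_RADIUS_DEG = 5.0   # roughly the span of sharp human vision
PERIPHERY_SCALE = 0.4    # hypothetical: shade the periphery at 40% resolution


def render_frame(renderer, eye_tracker, scene):
    # SMI's tracker reports gaze at up to 240Hz, comfortably once per frame.
    gaze = eye_tracker.gaze_point_on_display()

    # Full-detail pass: a small region centered on the gaze point.
    fovea = renderer.region_around(gaze, radius_deg=FOVEA_RADIUS_DEG)
    renderer.draw(scene, region=fovea, resolution_scale=1.0)

    # Cheap pass: everything else at reduced resolution. Because this region
    # moves with the eye, the player cannot catch the blurry periphery by
    # looking at it, which is exactly what the demo showed.
    periphery = renderer.everything_except(fovea)
    renderer.draw(scene, region=periphery, resolution_scale=PERIPHERY_SCALE)
```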

SMI has come a long way in a year. When we first met the company and its technology at Mobile World Congress, it had a small table in a big ballroom where smaller outfits without a booth presence in the main convention halls gathered to show their wares to the media.

At GDC, SMI was in Valve’s exclusive showcase area showing off the tech in a specially modified Vive. We don’t know when we’ll see a shipping headset with SMI’s eye trackers, but it will probably be towards the end of this year.


Bungie Talks 'Destiny' Sequel, Final Live Event

It’s been two-and-a-half years since Bungie released Destiny, the online FPS with which it followed up the Halo franchise, and now the company has shared in a blog post some tidbits about what to expect from the game’s final live event as well as some information about its upcoming sequel.

The final live event is dubbed “Age of Triumph.” Bungie said it wants the event to be a “fun and memorable celebration” during which it will “look back upon the three incredible years we’ve shared as a community, and look ahead to some final challenges and rewards that await you in the weeks ahead.” The company will reveal more about the event on March 8, introduce the weekly rituals on March 15, and release the sandbox update on March 22.

Bungie was a little more forthcoming about Destiny 2–or whatever the sequel will be called–and how it will relate to the first game. The company said that “power, possessions, and Eververse-related items and currency will not carry forward” between titles. This is likely to rankle some players, many of whom have sunk countless hours into collecting everything Destiny has to offer, but Bungie said it believes this approach is the best option:

We believe this is the best path forward. It allows us to introduce the major advancements and improvements that all of us expect from a sequel, ensuring it will be the best game we can create, unencumbered by the past. We’re looking forward to sharing more details with you later this year for how we will honor your legacy in the future.

Not all will be lost in the transition from Destiny to its sequel: players will be able to carry over their character’s appearance between the games. That’s going to save people the frustration of having to recreate their favorite character, at least in the aesthetic sense, if not in a gameplay-related fashion. Here’s what the company said about bringing a character from Destiny to Destiny the Second:

We know that, just like us, you have grown fond of the Guardians you’ve created, so we do plan to preserve your character personalization. We are going to recognize the dedication and passion you’ve shown for this world. Specifically, the class, race, gender, face, hair, and marking selections for all characters that have achieved Level 20 and completed the Black Garden story mission will carry forward. We also plan to award those veteran accounts with honors that reflect your Destiny 1 accomplishments.
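
The quoted criteria amount to a simple eligibility filter. Here’s a small sketch that restates them in code; the field names are our own invention, not Bungie’s API:

```python
# A restatement of Bungie's stated carryover rule; all names here are our
# own invention, not Bungie's actual data model.
from dataclasses import dataclass, field

COSMETIC_FIELDS = ("character_class", "race", "gender", "face", "hair", "markings")


@dataclass
class Guardian:
    character_class: str
    race: str
    gender: str
    face: int
    hair: int
    markings: int
    level: int
    completed_story_missions: set = field(default_factory=set)


def carryover_profile(guardian):
    """Return the cosmetic selections that transfer to the sequel, or None."""
    eligible = (guardian.level >= 20
                and "The Black Garden" in guardian.completed_story_missions)
    if not eligible:
        return None
    # Power, possessions, and Eververse items stay behind; only appearance
    # (plus veteran honors, which Bungie says it will award separately) moves.
    return {name: getattr(guardian, name) for name in COSMETIC_FIELDS}
```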

Many eyes will be on Bungie as Destiny’s sequel nears release. The first game made $500 million on its first day, a figure even new entries in established franchises rarely match. Players have spent a lot of time in Destiny since that launch day, and Bungie has released several expansions that, by all accounts, have delivered on its promises for Destiny better than the base game did. Following up on those successes won’t be a walk in the park.

Bungie said in February that it plans to release the Destiny sequel at some point in 2017.
