Next-Gen SteamVR Hardware Will Process Tracking Data On Board

Lattice Semiconductor Corporation revealed that Valve chose Lattice’s iCE40 FPGA to improve the SteamVR Tracking platform. Lattice’s iCE40 FPGA reduces position tracking overhead by combining the tracking data from multiple sensors before sending the signal to the host processor.

The first generation of Valve’s SteamVR Tracking system (formerly known as Lighthouse tracking) offers sub-millimeter tracking accuracy and industry-leading tracking volume, but Valve isn’t satisfied with its solution. The company is hard at work developing the second generation of the SteamVR Tracking system, which increases tracking volumes, reduces power consumption, and lowers manufacturing costs.

However, Valve is slowly trickling out information about this second-gen SteamVR Tracking system. Early this year, Valve confirmed that it’s developing new SteamVR Base Stations that offer reduced internal complexity. They won’t include moving parts like the current Base Stations do, which should improve reliability and reduce manufacturing defects. The new Base Stations also feature a wider field of view and the ability to pair the headset with more than two units.

In early June, Valve revealed that it partnered with Triad Semiconductor to develop sensor ASICs for the next-generation SteamVR Tracking system, and it implored hardware developers to adopt the new sensors immediately because of compatibility concerns. The new sensors support both the current Base Stations and the upcoming second-generation Base Station design, whereas the old sensors will not support the new Base Stations.

Valve is also working with Lattice, though, and adopted its iCE40 FPGA to help improve the efficiency of the SteamVR Tracking system. The iCE40 FPGA pulls in tracking data from the Triad TS4231 sensors embedded in the trackable device and feeds the pre-processed data to the host computer. Lattice Semiconductor said that processing the data on the device before sending it to the host PC “reduces EMI emission, PCB congestion and improves signal integrity.”

“Our low power, low cost, and small form factor iCE40 FPGA allows each sensor to have an independent interface for low latency data capture and parallel processing,” said Ying Chen, senior business development manager at Lattice Semiconductor. “Our FPGA enables the host to accurately process simultaneous sensor events by providing the MCU with metadata such as time stamping. This capability can also be useful in other movement or flow related analysis where low latency and concurrency are required.”
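
To make that data flow concrete, here is a purely illustrative sketch, not based on any published Valve, Triad, or Lattice interface, of the kind of timestamped sensor-event records an on-device aggregator might hand to the host MCU:

```typescript
// Purely illustrative: field names are invented for this example and are not
// taken from Valve, Triad, or Lattice documentation.
interface SensorHitEvent {
  sensorId: number;        // which photodiode registered the base station sweep
  timestampTicks: number;  // capture time from a shared on-device clock
  pulseWidthTicks: number; // duration of the detected pulse
}

// Combine hits from many sensors into one time-ordered packet for the host,
// so simultaneous events arrive with the timing data needed to reconstruct them.
function packFrame(hits: SensorHitEvent[]): SensorHitEvent[] {
  return [...hits].sort((a, b) => a.timestampTicks - b.timestampTicks);
}
```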

Lattice Semiconductor said the iCE40 FPGAs could benefit other VR- and AR-related devices, including wireless HMD upgrade kits, multi-camera 360-degree rigs, and MIPI bridging for micro displays (such as the ones that Kopin is developing for mobile VR HMDs).

Valve hasn’t yet announced any products that are being built for SteamVR Tracking 2.0, but the company has more than 500 SteamVR Tracking licensees developing technology and hardware for Valve’s tracking system, so you can expect product announcements in the future.

Go to Source

Kailh Laptop Switches With Light Pipe Design, Pictured

The grab bag of new low-profile mechanical switches from Kaihua that we saw at Computex 2017 continues to emerge into the real world. Just a week or so after the company made its Kailh Choc PG1232 switches official, Kaihua released pictures of its intriguing PG1442 switches.

The 1442 switches have a scissor design, which is similar to, but not identical to, the (as yet unreleased) 1425 series. (You’re reading that correctly: Kaihua is working on two separate low-profile mechanical scissor switches.) That’s unique enough, but the other compelling part of the design is the light pipe.

On most mechanical switches, you’ll find the LEDs at the top of the switch housing. This is why some keycap legends are positioned close to the top of the caps instead of in the center, and it’s also why you often see slightly uneven underglow. Another limitation of that placement is that secondary keycap legends are usually backlit unevenly. This is true even of “RGB” switches that have clear housings.

It’s generally accepted that a centered, through-stem lighting design is superior, because the light is easier to control and is less messy. Perhaps oddly, though, it’s rare to see it. One major vendor that uses such a design is Logitech, on its Romer-G switches, but major switch makers like Cherry and Kaihua have eschewed this design.

Also note that, obviously, these switches will not accept Cherry MX-compatible keycaps. Instead, they’re meant for chiclet-style caps, which you can see in the images.

We do not yet have any specifications on these switches, unfortunately.

These 1442-series switches, then, are notable for multiple reasons: They’re low profile (apparently a growing trend), have a scissor design, and use a light pipe. The fact that prototypes now exist (at Computex, remember, there were none to be seen on the show floor) means that Kaihua has taken at least one step closer to turning them into a Real Thing.

Go to Source

Is There Another PSVR Controller Coming?

Last year, Teotl Studios released its sci-fi thriller The Solus Project for Xbox One and PC, including support for the Oculus Rift and HTC Vive VR platforms, but Sony’s PlayStation 4 and PSVR were left out in the cold. Today, The Solus Project made its debut on Sony’s console and VR platform, and the description page may reveal a new input configuration for PSVR players.

Navigation Conundrum

The Solus Project is a free-roaming first-person experience that requires joystick input to move around, and it also supports the PlayStation Move motion controllers to add hand-tracked input to the game. If you’re not intimately familiar with the PSVR platform and its limitations, you likely won’t notice the conundrum that combining those two input methods poses.

Presently, you can play PSVR games with a DualShock 4 controller or a pair of PlayStation Move controllers. Games that offer scripted locomotion, such as Until Dawn: Rush of Blood, work well with PlayStation Move controllers to give you tracked hands within the game. Games that require input for locomotion, such as Resident Evil 7, aren’t compatible with the Move controllers, though, because the Move controllers lack a joystick or d-pad.

When the PlayStation Move controllers were in demand for PS3 games, Sony developed a solution for locomotion input: the PlayStation Navigation Controller, which resembled a Move controller without the colored ball for motion tracking. The Navigation Controller included a thumbstick, though, so you could play games that required locomotion input.

Sony’s Navigation Controller didn’t live long on the PS3 platform, but it appears the old device is coming out of retirement. The Solus Project’s product page lists the Navigation controller as a compatible input device. We don’t know much about the Navigation Controller implementation yet, and we aren’t certain that it would work with the PSVR version of the game, but it would make a lot of sense if it did. The HTC Vive, Oculus Rift, and upcoming Windows MR platforms all support motion control with thumbstick input, so adding Navigation Controller support to the PSVR platform would make it easier for developers to bring content from other VR platforms to PSVR.

Something Altogether New?

Of course, it’s entirely possible that the Navigation Controller that Teotl Studios supports isn’t the same Navigation Controller that we’re familiar with. Sony could be preparing to announce a new peripheral, and we’ve found one developer reference that reinforces that notion. VR Visio Games is building a VR first-person shooter for HTC Vive, Oculus Rift, and PSVR called Special Forces VR, which also features free locomotion. The developers said that Special Forces VR would support Sony’s PlayStation Aim Controller, which includes thumbsticks for movement, but then they posted a message on their Facebook page last week suggesting they recently received a secret PSVR-related device.

It’s possible that the secret device is a new version of the Navigation Controller that features both motion tracking and joystick/d-pad input.

A motion controller with locomotion input would be a great boon for the PSVR platform. Currently, every other premium VR system offers both features at once, and as a result, PSVR can’t support all of the content that you can get on a PC-connected VR system. Navigation Controller support would be a step in the right direction, but we’d prefer to see a new Navigation Controller with a tracking ball attached to it.

Sony’s PlayStation Experience conference is coming up in early December. Perhaps Sony will have new hardware to announce.

Go to Source

New Expansion For 'Black Desert Online' Arrives September 27, And It's Free

New content is coming for Black Desert Online players later this month. In addition to more gameplay, the first part of the “Kamasylvia” expansion will feature a new location, and Kakao Games provided an early look at the area.

The new Kamasylvia area consists mainly of sprawling woodlands inhabited primarily by elves. The studio said that its landmass is the same size as the neighboring region of Calpheon, which hosts the game’s largest trading area. In Kamasylvia, the main hub is a village known as the Old Wisdom Tree. Even though anyone can travel to the new region at launch, Kakao Games said that high-level characters will benefit the most from the new content. In addition, group play is recommended, so bring some comrades with you.


The expansion will introduce numerous main and side missions that provide more details on the Ranger and the Dark Knight, the two player classes native to the area. If you’re on the hunt for more items to add to your inventory, you can try out two new events, “The Alchemist of Kamasylvia” and “Growth of a Kamasylve Tree.”

For an additional challenge, you can work with other players to visit a new location in the area called The Altar of Training. It includes a survival mode where your group must defend a sacred treasure from multiple waves of increasingly difficult enemies. If you manage to make it past the fifth wave, a powerful enemy called the Ancient Puturum appears, and defeating it will provide you with powerful items and gear.

Access to the expansion also gives you the chance to get a new Tier 9 mount called Arduanatt. The “Tier 9” designation means that it’s one of the fastest mounts you can have in the game. The studio said that it also includes an ability called Wings of the Wind that allows it to glide in the air after a double jump.

The new expansion arrives on September 27, and it will be available to all players for free. The studio also noted that this is only the first half of the expansion, and more information on the latter portion of “Kamasylvia” is coming soon.

Name: Black Desert Online
Type: MMORPG
Developer: Pearl Abyss
Publisher: Kakao Games
Where To Buy:
Release Date: March 3, 2016

Go to Source

What is FinTech (and how has it evolved)?

When you use PayPal, Apple Pay, Google Wallet or simply your credit card to make an online purchase, you, the consumer, the ecommerce retailer and the banks behind the money exchange are all using FinTech.

When Charles Schwab, TD Ameritrade or Fidelity Investments purchase stocks and the banks settle the securities transactions, that’s FinTech.

And when you go online to find the best mortgage rates for that dream home or to refinance the one you’re in, that’s FinTech.

FinTech defined

Broadly speaking, FinTech (financial technology) refers to any application of technology in financial services, or its use to help companies manage the financial aspects of their business, including new software and applications, processes and business models.

Once considered more of a back-end, data center processing platform, FinTech has in recent years come to be known as the basis for end-to-end processing of transactions over the Internet via cloud services.

FinTech is not new. It’s been around in one form or another virtually as long as the financial services industry itself. Since the global financial crisis of 2008, however, FinTech has evolved to disrupt and reshape commerce, payments, investment, asset management, insurance, clearance and settlement of securities, and even money itself with cryptocurrencies such as Bitcoin.

“When you think about banks today, they’re really technology companies if you look at where they spend their money,” Eric Piscini, a principal in the technology and banking practices at Deloitte Consulting, said.

In just a few short years, the companies that provide FinTech have defined the direction, shape, and pace of change across almost every financial services subsector, according to Deloitte Consulting.

“Customers now expect seamless digital onboarding, rapid loan approvals, and free person-to-person payments – all innovations that FinTechs made popular. And while they may not dominate the industry today, FinTechs have succeeded as both standalone businesses and vital links in the financial services value chain,” Deloitte said in a recent industry report.

How FinTech can be disruptive

According to Deloitte, disruptive forces that have reshaped the FinTech industry include, but are certainly not limited to:

  • The growth of online shopping, which is expanding quickly at the expense of in-person shopping, leading to the dominance of online, cashless solutions for transactions.
  • A shifting balance of power that swings from banks and other financial services firms to those who own the customer experience. Banks are eliminating in-person services and looking to FinTech and large technology companies for other ways to engage customers.
  • New trading platforms that are collecting data to create an aggregated market view and using analytics to uncover trends.
  • Insurance products, which are becoming more tailored to customers who, in turn, are demanding coverage for specific locations, uses and timeframes. That’s driving insurers to collect and analyze additional data about their clients.
  • Artificial intelligence, which now plays a role in differentiating financial services products as it replaces complex human activities.
  • Transaction process improvement and middleware, both of which remain expensive. This is pushing traditional financial services firms to consider partnerships with marketplace lenders for FinTech solutions that don’t require a full infrastructure overhaul.

A new world of regulations

After the 2007-2009 financial crisis, regulators turned up the heat on the larger players in the financial services industry, enabling smaller and more agile firms and upstarts to gain traction. For example, the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010 created a number of new oversight agencies and represented the largest set of regulatory oversight changes in the financial services industry since the Great Depression.

In addition, companies that provided integration technology, services, data and analytics for banks saw a significant increase in the use of their hosted services, according to Jason Deleeuw, a vice president at Piper Jaffray covering financial and business services companies.

After spending billions of dollars and thousands of hours to comply with that new regulatory landscape, the financial services marketplace turned its collective attention to rolling out new products and services. In some cases, banks became the technology developers. But in most cases, the financial services sector found it far simpler to outsource the technology for electronic payments or onboarding of customers rather than build it in-house, Deleeuw said.

For example, online mortgage servicing platforms saw a surge in adoption by banks for processing client accounts.

“They [the banks] are dealing with more regulatory issues around servicing mortgages, so it’s becoming more costly to do this with an internal system,” Deleeuw said. “I think it’s helped drive banks more toward outsourced solutions because of the cost and reduced regulatory risk involved in trying to manage their own internal systems.”

With increased interest in service-based systems, the technology grew more robust even as the costs of implementing it fell, enabling even further proliferation, Deleeuw added.

The explosion of ecommerce has created a healthy ecosystem of start-up tech suppliers for the financial services, retail and other industries. While cautious, banks in particular have been quick to adopt technology that can create new revenue streams or bring efficiencies, and they have sought help integrating new technologies, such as peer-to-peer payments, into their massive legacy infrastructure.

Over the past decade, the FinTech supplier ecosystem has grown from 10 or so key players to more than 10,000 companies, according to Piscini. That, in turn, has spawned a new service known as ecosystem relationship management, or ERM.

“The way you manage 10,000 suppliers is completely different from the way you managed 10 technology partners,” Piscini said. “That’s a big challenge for large organizations: how do you manage your 10,000-supplier ecosystem versus the 10 relationships you had before. For them, it’s not as much about technology but what kind of innovation can I source and how do I do that in an ecosystem that’s much more fragmented than it used to be?”

Banks as tech providers

Banks have also become technology providers, competing with the likes of PayPal or Square and sometimes collaborating on rolling out shared platforms to enable services.

For example, earlier this year Early Warning Services LLC, a technology provider owned by Bank of America, BB&T, Capital One, JPMorgan Chase and Wells Fargo, unveiled its new Zelle person-to-person payments service. The platform is expected to be supported by more than 30 banks this year and will let 86 million U.S. mobile banking customers send and receive payments as an alternative to cash and checks.

“So now the FinTech [firms], who were disrupting the banking industry are now being disrupted by the banking industry, which is an interesting spin of events,” Piscini said. “It’s a good example of the disruptors being disrupted.”

Go to Source

G-Core

Founded in 2011 to support online gaming, Luxembourg-based G-Core Labs now offers powerful managed hosting packages and a very capable, enterprise-level CDN.

G-Core’s network covers more than 40 locations across four continents, significantly more than many competitors. The company hasn’t just crammed most of them into Europe and North America, either: Russia and CIS countries are covered by fifteen points of presence (PoPs), there are five PoPs in Asia, and others in Brazil, Israel, Dubai and Australia.

The core service is a pull CDN, where G-Core automatically grabs files from your servers as they’re required. A prefetching feature enables loading files before they’re first requested, ensuring users always get the best possible performance. There’s optional origin push support (G-Core stores all your content, reducing the load on your server) at extra cost.

Security features include free shared SSL, and the option to use your own custom SSL certificate at no extra cost.

Elsewhere, a strong feature set includes support for HTTP/2 and IPv6. Custom HTTP rules allow you to define exactly how and when content is cached and served, an ‘Instant Purge’ feature (which actually takes ‘several minutes’) clears the cache whenever you like, and there’s a REST API to help automate any complex tasks.
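
To show what scripting a purge against a REST API might look like, here is a hypothetical sketch. The endpoint, payload and authentication scheme below are invented for the example, not taken from G-Core’s actual API, so check the service’s own documentation before relying on any of it.

```typescript
// Hypothetical sketch only: the base URL, path and payload shape are invented
// to show the general pattern of scripting a cache purge over a REST API.
// G-Core's real endpoints, fields and auth scheme may differ.
const API_BASE = "https://api.example-cdn.invalid"; // placeholder host

async function purgePaths(resourceId: number, paths: string[], token: string): Promise<void> {
  const response = await fetch(`${API_BASE}/resources/${resourceId}/purge`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`, // auth scheme assumed for this sketch
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ paths }),
  });
  if (!response.ok) {
    throw new Error(`Purge request failed with status ${response.status}`);
  }
}

// Example: clear two stale assets after a deployment.
// await purgePaths(12345, ["/css/site.css", "/js/app.js"], process.env.CDN_TOKEN ?? "");
```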

Pricing

G-Core is aimed very much at high traffic sites. Even the baseline Startup plan has a minimum charge of $250 (£200) a month, which gets you a bandwidth allowance of up to 5TB.

Still, if you crunch the numbers that’s better value than most. It works out at $0.05 per GB, similar to many budget providers, and maybe half the price of Amazon Cloudfront and other big-name services.

There are no annoying hidden catches. G-Core doesn’t charge extra for requests, or HTTPS, or transfers from premium regions. Overage rates are the same $0.05 per GB. There are no artificial limits on the number of zones or resources you can create, either – one account can be used to support as many domains as you like.

For us, G-Core’s real stand-out feature is its transparency. A Demo Account allows browsing the CDN Control Panel to see the type of reports you’ll get, and the settings you can apply. A status page shows you recent network issues. And if that’s not enough, there’s a free 14-day trial available, no payment details required.

Setup

Getting started with G-Core is unusually easy. Hand over your name, company and email address, click the Confirm link in the following email and you’re taken directly to G-Core’s web console: no hassles at all.

The Control Panel is more intuitive than most, too. It prompts you to create a new CDN resource, default settings are well chosen, and you could start by entering just your origin server (the source domain or IP address).

There’s also plenty of control available for more experienced users. The CDN can pull content from a single source or a group, optionally with a custom port. It can connect only via HTTP, only via HTTPS, or choose automatically. You’re also able to use a shared SSL certificate, or add a wildcard SSL certificate of your own.

G-Core allows specifying a CNAME, or alternate domain (cdn.mydomain.com). Unusually, you can extend this with a custom origin directory. For instance, you can set this to /images/ and a client request to ‘cdn.mydomain.com/pic.jpg’ is redirected to ‘cdn.mydomain.com/images/pic.jpg’. It’s a simple idea, but could be a convenient way to keep your files more organized.
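
A minimal sketch of that mapping, assuming the CDN simply prepends the configured origin directory to the requested path (the helper name is ours, not part of G-Core’s product):

```typescript
// Minimal sketch of the custom origin directory idea described above: the
// configured directory is prepended to the requested path, so one CDN alias
// can serve files kept under a single folder.
function mapToOriginPath(requestPath: string, originDirectory: string): string {
  const dir = originDirectory.replace(/\/+$/, "");                    // drop trailing slashes
  const path = requestPath.startsWith("/") ? requestPath : `/${requestPath}`;
  return `${dir}${path}`;
}

// mapToOriginPath("/pic.jpg", "/images/")  ->  "/images/pic.jpg"
```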

The well-designed interface doesn’t assume you’re a CDN guru. Jargon is kept to the service basics, text boxes initially display example entries, and there’s a caption under each setting to explain exactly what it does. It’s both more powerful and usable than most of the competition.

If your needs are simple you might not have much more to do. Tapping ‘Setup Instructions’ displays your origin and CDN names, and explains how you must update your website code to have objects pulled from the CDN (use ‘cdn.mydomain.com/file.ext’ instead of ‘www.mydomain.com/file.ext’). The same page points you to support articles for common issues, and gives you an email link for personal support.
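
In practice, updating your site can be as simple as pointing asset references at the CDN alias. A small helper using the placeholder domain from the setup instructions above might look like this:

```typescript
// Small helper for rewriting asset references so they're pulled from the CDN,
// following the pattern in the setup instructions ('cdn.mydomain.com/file.ext'
// instead of 'www.mydomain.com/file.ext'). The host is the placeholder example
// used above, not a real domain.
const CDN_HOST = "cdn.mydomain.com";

function cdnUrl(assetPath: string): string {
  const path = assetPath.startsWith("/") ? assetPath : `/${assetPath}`;
  return `https://${CDN_HOST}${path}`;
}

// cdnUrl("/img/logo.png") -> "https://cdn.mydomain.com/img/logo.png"
```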

There are more configuration options if you need them. These start with origin shielding, where you can specify a CDN server to handle requests in the event of a cache miss (a simple way to reduce the load on your origin server).

G-Core also provides an extremely powerful Rules dialog. This enables matching URLs with literal text or regular expressions, applying new caching rules, configuring access in various ways (country, IP address, referrer, secure token, user agent, HTTP method and more), adding custom headers, even applying GZIP compression or optimizing the service to deliver large files.

This is by far the most complex set of functions G-Core has to offer. Rule creation isn’t fully described in the support pages, and you’ll need plenty of CDN and HTTP header experience to understand what’s possible. But if that’s not an issue, you’ll find an enormous amount of power and flexibility available here, which tramples effortlessly over most other services.

Once you’re up and running, G-Core’s Dashboard keeps you up to date with attractive reports on CDN traffic, bandwidth, response codes, cache hit ratio and requests per second. You can filter by time to view anything from the last hour to a full month, or view aggregated data over the last year.

Advanced Analytics takes reporting even further, with details on cache use by location, directories, devices, browsers, operating systems and more.

Support is available 24/7, including by live chat, email and ticket. We tried the chat service and had accurate and helpful replies to our questions within a couple of minutes, much better than we typically see elsewhere.

Overall, it’s a very impressive setup – but you don’t have to take our word for it. If you’re at all curious, go to the G-Core login page, click Demo Account and you can explore all the screens and reports we’ve described. There’s no need to register or provide any details, you’re free to browse as much as you’d like.

Performance

Comparing CDNs is a challenge, as there are so many variables to consider: the numbers and locations of your visitors, the size and type of files, how often they’re updated, the web applications you’re using, and more. Change any element and you could get a different result.

One simple approach is to just look at average CDN response time. It’s only a single figure and can’t begin to tell you the whole story, but it’s still a useful metric which gives a basic idea of how fast a service may appear.

As we write, CDNPerf ranks G-Core as 13th out of 23 for worldwide response times. While that might seem very ordinary, it’s only fractionally behind some big-name services (Edgecast, Cloudfront and Fastly are ranked 10, 11 and 12) and just ahead of a few others (KeyCDN, CacheFly, MaxCDN).

Drilling down to continent-level performance reveals similar mid-range scores. G-Core is rated 9th for response times in Europe, 11th in Asia, 17th in North America.

There is one major speed highlight, though. CDNPerf rates G-Core as the fastest responding CDN in Russia. The margin of victory can be huge – G-Core’s average response time is currently 27ms, Level3 is 80ms, Fastly 105ms – and the service also scores well in neighbouring countries.

Overall, G-Core offers decent performance which is in line with many other enterprise CDNs. It’s difficult to say precisely how it will work for you, but we would recommend taking the free trial to see for yourself.

Final verdict

G-Core is a high-quality service with a stack of features, a well-designed web dashboard and loads of detailed reports and CDN analytics. The $250 (£200) monthly minimum charge is hefty, but if you’ll use the 5TB transfer allowance, G-Core is a must for your CDN shortlist.

Go to Source

Microsoft confirms Outlook issues

Microsoft has confirmed that some users of its email service Outlook are unable to send email or access their accounts.

Hundreds of users from around Europe have commented on the website Downdetector that they have been affected by the problem, many since Monday morning.

One common issue seems to be that sent emails remain in the drafts folder and are not being delivered to recipients.

On its website, Microsoft says the service dropped “unexpectedly” and it is working on a fix.

Not all account holders are affected.

“Intermittent connectivity is affecting customers in some European countries, which we are working to resolve as soon as possible,” said a Microsoft representative.

Outlook incorporates Hotmail and Windows Live Hotmail accounts.

On its service status page, Microsoft is currently saying that an “alternate infrastructure” is being used while the service is restored.

“We’ve identified that a subset of infrastructure was unable to process requests as expected, which caused general service availability to drop unexpectedly,” it says.

“We’ve redirected requests to alternate infrastructure to restore service, and we’re monitoring the environment while connectivity recovers.

“Additionally, we’re investigating an issue in which users are unable to send email messages.”

Go to Source