Apple started using the High Efficiency Image Container (HEIC) format in iOS 11 and macOS High Sierra in 2017. The new format promises much smaller file sizes than JPEG or PNG, and it supports many other useful features. Microsoft and Google recently announced support for HEIC in the next versions of their respective operating systems.
HEIC And HEIF
HEIC is the container, or file extension, that holds HEIF images or sequences of images. HEIF borrows technology from the High Efficiency Video Coding (HEVC) codec, also known as H.265. Both HEVC and HEIF are proprietary technologies developed by the Moving Picture Experts Group (MPEG).
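Because HEIF containers follow the ISO Base Media File Format, software can recognize them from the leading `ftyp` box before attempting to decode anything. The sketch below is a minimal illustration of that idea in Python; the set of brand codes is a partial assumption, not an exhaustive list.

```python
# Minimal HEIC/HEIF sniffer. ISO BMFF files begin with a box whose
# type field (bytes 4-8) is b"ftyp"; the major brand (bytes 8-12)
# names the flavor of container. Brand list is illustrative only.
HEIF_BRANDS = {b"heic", b"heix", b"hevc", b"mif1", b"msf1"}

def looks_like_heif(header: bytes) -> bool:
    """Return True if the first 12 bytes resemble an HEIF container."""
    return (
        len(header) >= 12
        and header[4:8] == b"ftyp"
        and header[8:12] in HEIF_BRANDS
    )
```

In practice you would pass in the first 12 bytes of a file, e.g. `looks_like_heif(open(path, "rb").read(12))`; a JPEG, which starts with `FF D8`, fails the check immediately.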
HEIF came into the mainstream when Apple made it the default format for its pictures on iOS 11 devices and macOS High Sierra. However, other operating systems and websites don’t yet support HEIF and its HEIC file extension, so Apple’s operating systems will automatically convert the images to JPEG when users want to share them with friends who don’t use Apple products.
Benefits Of HEIC
The biggest benefit by far of HEIC, and why anyone would even consider yet another image format on the web, is that HEIC images promise to be half the size or smaller compared to JPEG images, but with even better quality. HEIC can also replace file extensions that support image sequences, such as GIF and GIFV.
Additionally, HEIC files can store not just multiple individual images, but also their image properties, HDR data, alpha and depth maps, and even their thumbnails.
Cons Of HEIC
Because HEIC and HEIF are developed by MPEG and are based on HEVC technology, that means that HEIC could have some patent issues. Multiple patent groups outside of MPEG claim to have patents on HEVC technology, so we may see similar claims against HEIC. This could (potentially) impact not just larger platform owners, such as Apple, but also small websites that publish images in the HEIC format.
Another con of HEIC is quite obvious: It doesn’t yet have widespread adoption beyond iOS 11 and macOS High Sierra. Without use on other platforms, HEIC could remain just another technology that can be used only among Apple users.
The good news is that Windows 10 Build 17123 should support HEIC in its Photos app, and Google also promised support for this file format in Android P. Both of these operating systems should be released later this year. However, it will take some time before most people are on Windows 10 Build 17123 or newer, or on Android P or newer. Therefore, HEIC adoption still won’t happen overnight, even if all the patent issues are resolved.
How To Open HEIC Files?
Because Apple supports HEIC on its iOS 11 and macOS High Sierra devices, its users shouldn’t have any issues opening HEIC files. They also don’t have to worry about converting them before sending them to friends, because they will convert automatically to JPEG when they’re shared through Apple’s multi-purpose Share sheet.
If you’re on Windows, you can’t yet open HEIC files natively, but you can convert them using either one of the online HEIC to JPG converting services, or you can download software to convert them on your PC. You may also want to use one of these services if you want to view the images as JPG on an Android device or a Linux machine.
HEIC could enable users to take not just smaller photos, but also higher-quality photos of higher resolutions, with minimal impact on their device storage. However, HEIC will become truly useful only when most users can open HEIC images on any and all of the devices they use, without having to worry about first converting them.
Microsoft clearly looks to boost Cortana’s effectiveness and helpfulness with the next big Windows 10 update. Known internally as Redstone 4, this update introduces a new ‘Cortana Show Me’ feature that teaches users how to navigate key features of the operating system (OS).
The firm has released a test version of this feature through its Windows Insider Preview, publicly accessible to anyone interested in trying out a less-than-stable version of the OS. Specifically, this feature can be found in the Fast Ring of Windows Insider Preview Build 17128.
This feature update follows one released just earlier this week that adds profiles to the Cortana digital assistant, allowing it to provide insights and reminders before you even ask.
Windows 10 rookies no longer
The idea behind Cortana Show Me, which is available through the Microsoft Store within this preview build specifically, is to help newly minted Windows 10 users get acclimated more quickly and easily. To that end, the app currently provides detailed guides on several key OS functions and tasks, while voice activation will be added ‘soon’, according to a blog post announcing the feature.
So far, here’s what Cortana can help new users with through Cortana Show Me:
- Update Windows
- Check if an app is installed
- Uninstall an app
- Change your desktop background
- Use Airplane Mode
- Change your display brightness
- Add nearby printers or scanners
- Change your default programs
- Change your screen resolution
- Turn off Windows Defender Security Center
- Run a security scan
- Change Wi-Fi settings
These changes are particularly interesting, as it appears Microsoft is hell-bent on seeing Cortana win the digital assistant war against Amazon, Apple and Google – particularly with Amazon’s Alexa set to arrive on Windows 10 PCs this year.
Widely assumed to be known as the Spring Creators Update when it finally launches, we expect to see this major revision to Windows 10 available to all (in the most stable version of Windows 10) sometime in April.
Monitor tech has come a long, long way in the last few years. We’ve seen ultra-high refresh rates (like in Alienware’s 240Hz AW2518H). We’ve seen variable refresh for smoother gaming via Nvidia G-Sync (and AMD’s competing FreeSync tech). We’ve seen full-array backlighting and 1,000-nit brightness for excellent HDR output (in Dell’s UP2718Q, for example), among many other advances.
But while monitors have undoubtedly improved in myriad ways, the high-end models always seem to be prohibitively expensive—especially when compared to the gobs of screen real estate you can get by going with a similarly priced HDTV. That’s why, last summer, I swapped my 28-inch G-Sync-equipped Acer XB280HK monitor for a 49-inch Sony XBR49X900E TV.
Scandalous? Maybe to some PC gamers. But clearly I’m not alone in my desire for a really big PC screen, as evidenced by Nvidia’s announcement of its so-called “Big Format Gaming Displays” (BFGDs) back at CES 2018 in January. (We’re still awaiting availability and pricing on those, but I wouldn’t be surprised at all to see BFGDs sell for north of $3,000.)
BFGDs Aren’t TVs, and Most TVs Aren’t Great for Gaming
That said, the Nvidia displays (which will be sold by Asus, Acer, and HP) really are 65-inch monitors, and not TVs. They sport 120Hz refresh rates and G-Sync support, and they don’t have a built-in TV tuner. Many TVs are non-starters for serious gaming (due to input lag and refresh rates that are usually locked at 60Hz). But if you’re the type of gamer who wants to just enjoy your game—rather than amassing every hardware advantage to out-frag the quick-clicking teenagers of the world—a properly chosen TV can deliver a great gaming experience.
Plus, a TV can deliver the goods for other types of content better than any similarly priced monitor, because 4K HDR content just looks better on a big screen—especially when you aren’t sitting a foot in front of the panel and leaning in.
In my experience, a (very) big screen also makes for less eyestrain when working on productivity tasks. There is a whole lot of screen real estate available for documents, spreadsheets, and Web pages when your display is 4K resolution (3,840×2,160 pixels) and you don’t have to ramp up Windows scaling above 100 percent.
At that setting and resolution, a UHD screen is the same as stacking four 1080p screens in a 2×2 grid (minus, of course, the middle bezels). The same is in some ways true, of course, of smaller 4K monitors. But if the screen isn’t large enough to see things such as small text and UI details, all that extra real estate is useless—unless maybe you do your computing with the assistance of a loupe. The 27-inch 4K LG screen I’m writing this on, for instance, looks great. But Windows (rightly) suggests that scaling should be set to 150 percent. Dropping it to 100 percent makes onscreen text really tough to read unless I lean in close and squint—not great for my eyes, or my spine. I leave the scaling at 150 percent, and even then, things are sometimes smaller than I’d like.
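The real-estate and scaling trade-off described above is simple arithmetic. The sketch below (using the panel sizes mentioned in the text) confirms that a UHD grid holds exactly four 1080p grids' worth of pixels, and estimates pixel density for a 27-inch versus a 49-inch 4K panel:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch, from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 3,840x2,160 panel holds exactly four 1,920x1,080 panels' worth of pixels.
assert 3840 * 2160 == 4 * (1920 * 1080)

# Density falls as the same pixel grid is stretched across a bigger screen,
# which is why 100 percent scaling stays readable on a 49-inch 4K TV but
# not on a 27-inch 4K monitor.
print(round(ppi(3840, 2160, 27)))  # roughly 163 ppi
print(round(ppi(3840, 2160, 49)))  # roughly 90 ppi
```

At around 90 ppi, the 49-inch screen lands near the density of a typical 24-inch 1080p monitor, which is why no scaling is needed.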
A well-chosen 4K TV solves that text-size problem, and it excels in other areas, though you’ll of course need serious desk/wall space. My 49-inch Sony TV at home—which I’ve mounted to the wall for easy access to the rear ports and to move it back a bit from my desk—gets nearly as bright as current high-end HDR monitors. (Reviews peg the XBR49X900E at a bit under 1,000 nits.) HDR content, of course, looks great, and I don’t need a recent-model PC connected to the screen to watch protected content. Netflix and Google Play support are baked into the TV’s smarts—and I’ve recently added to that by plugging in an Amazon Fire TV.
I also personally have zero issues with gaming performance, as the input lag on the Sony XBR49X900E is quite good for a TV—Rtings.com pegs it at about 34ms in Game Mode. 2016’s Doom reboot is about as frenetic as my gaming gets, and it looks gorily great and plays just fine. And as I noted in my recent post about the excellent game They Are Billions, most of my gaming interests lean toward real-time strategy, with the occasional RPG thrown in when something grabs my interest. If you’re a competitive online gamer, you’re almost certainly going to want a dedicated monitor with speedier response times.
All-Around Performance and Value
Really, though, what sold me on the idea of using a UHDTV as a monitor—and the Sony XBR49X900E in particular—is its ability to do many things surprisingly well, and at a reasonable cost. At about $900 as I write this, my chosen TV-made-monitor is far from cheap. But it saves me the serious additional cost of having an expensive monitor and a pricey good-looking TV. It’s excellent for productivity work and surfing the Web—especially when I come home from work after staring at my “puny” 28-inch 4K office monitor all day.
And my Sony TV even works well for photo editing. I’m no professional photographer, but I do have a travel blog and accompanying Facebook page where I share my images and experiences—primarily from several trips to Northern Scotland. After a couple of calibration tweaks, I’ve been very happy editing photos on this screen—especially given I can catch up on Ash Vs. Evil Dead in a big window while working, or tinker with how an image looks in the Web interface on my site, while looking at alternatives at the same time.
So, while monitors continue to advance, and some like Samsung’s recent 49-inch ultra-wide CHG90 are also getting massive, I think I’ll stick with my big-screen TV. And given that TV prices continue to fall (and my eyes certainly aren’t getting any younger), my next monitor will probably be a TV, as well.
My only real complaint? TV companies don’t seem to make high-end TVs below the 49-inch class these days. I originally wanted a 43-inch TV for my monitor replacement because my home workspace is a bit cramped. But I couldn’t find a model in that size range last year with solid HDR abilities, low input lag, and accurate color. That’s okay, though. I’m extremely happy with the performance of the 49-incher I bought last year. I just push my chair back a few extra inches, pull my keyboard drawer forward, and get lost in the pixels.
Massdrop posted a svelte TKL keyboard with an aluminum frame and extensive RGB backlighting called the “CTRL.” It’s a fetching mechanical keyboard with a cool name, but there’s one problem: It looks every inch like Input Club’s K-Type, just dressed in black.
This wouldn’t be so surprising–after all, the Input Club did launch the K-Type on Massdrop last year–were it not for the public divorce the two companies recently went through. Read the article we linked to for a full breakdown of the situation, but in short, the two companies were at loggerheads over legal issues pertaining to certain IP, patents, and licensing agreements. That all came to light in September, and as far as our research can tell us, nothing has really been settled. Instead, the two parties seem to have simply moved on in lieu of protracted legal battles.
Or so we thought; the presence of this K-Type lookalike on Massdrop seems like…a statement. Of course, it could simply be that Massdrop 1) thinks the CTRL/K-Type is a great keyboard that 2) it can profit from and 3) that it has the rights to make and sell it. Even more noteworthy is the fact that you can order a CTRL with Cherry MX, Kailh, or Halo switches. Those Halo switches were at the heart of the disagreement between Massdrop and the Input Club. (The Input Club moved on and made its not-so-subtly named “Hako” switches.)
It is true that the Input Club open sourced the K-Type’s design, so in a sense Massdrop is just doing at scale what anyone could do on their own. However, the Input Club does retain royalty rights of some kind. (We aren’t certain of the royalty details because we don’t have access to those documents, but from our conversations with the parties involved, that is the case.) If Massdrop is not paying those royalties, it could have a fresh round of legal challenges on its hands.
Whatever is going on between Massdrop and Input Club, though, the fact remains that you can snag a black-metal twin of the K-Type from Massdrop. In addition to the color difference and switch options, the CTRL uses QMK for programming instead of the Input Club’s KLL. The cost is $200, with an estimated ship date of August 15. You have eight days left to join the drop.
Meanwhile, the Input Club is on to its latest creation, the Kira, which has a unique condensed 99-key layout (pictured below). The full details of the Kira are under wraps for now, but its campaign will go live on Tuesday, March 27.
Last year, we saw innovation from AMD and Intel that we hadn’t experienced in a long, long time. The Zen architecture made AMD competitive in segments of the CPU market it previously couldn’t touch, and Intel moved as quickly as possible to defend its incumbent position. We thoroughly enjoyed the back-and-forth as both companies jockeyed for enthusiasts’ adoration.
But even as new platforms were springing up with more PCIe connectivity than ever before, graphics-card availability dried up. Cryptocurrency miners bought up everything they could find to capitalize on rising valuations. Even today, you can’t find modern models anywhere near their suggested retail pricing. We’ve resorted to buying pre-built systems and scouring the forums for previous-generation cards, trying to score a bargain.
The best spread of CPU technology in ages, paired with sky-high GPU prices, is a recipe for confusion for PC builders. For the same amount of money, enthusiasts can afford less graphics performance than they could not long ago. That makes it easy to overspend on host processing, since balance is thrown out of whack. But you can also get a lot more CPU for your dollar than this time a year ago. How do you make sure you’re getting the most for your budget?
Well, we set out on a mission with 14 CPUs and three different GPUs to find the best combination in nine popular games.
Moving The Goalposts
For the last 11 years, Core i7 and Core i5 CPUs featured four cores. Coffee Lake changed this. Now, Core i3s sport four cores, Core i5s include six cores, and Core i7s boast six Hyper-Threaded cores. Intel also gave its low- and high-end models a makeover: Skylake-X stretches up to 18 cores/32 threads for high-end desktops, while Pentium processors have now gained Hyper-Threading technology.
Of course, AMD introduced its line-up of Ryzen 7, 5, and 3 models with copious core counts. Moreover, the Ryzen Threadripper series landed with up to 16 cores/32 threads and such friendly prices that Intel was forced to make its Skylake-X chips more affordable.
As you might imagine, the old rules of picking a CPU family to go with certain graphics cards changed as a result of these new processors. Thus, we decided to investigate using the best performers from each CPU class.
Representing AMD, we have the Ryzen 7, 5, and 3 models. We didn’t bother testing last-generation Bulldozer-based CPUs, but we did throw in the Ryzen Threadripper 1950X for good measure.
For Intel, we have K-series Core i7, i5, and i3 CPUs from the Coffee Lake and Kaby Lake generations. We also added the Core i9-7900X and Core i9-7980XE to cover high-end desktops. Out of curiosity (or because we’re gluttons for benchmarking punishment?), we couldn’t help but include two of the latest Pentium processors, too.
That gives us 14 processors spread across five test platforms. We paired these with the GeForce GTX 1080, GTX 1070, and GTX 1060 (6GB) graphics cards. Although the GTX 1080 is considered an extravagance these days, we have to imagine it’ll come down in price someday.
Finally, we selected nine games for testing. Some of the titles are new, while others are older. We did weigh the suite, though, more toward modern games. Some of them are CPU-dependent, others are decidedly graphics-bound, and a few are actually pretty well split down the middle. This allows us to explore bottlenecks from different angles.
Today’s tests are all run at 1920×1080. (We have more data coming at 2560×1440 and 3840×2160, too, so expect a follow-up story or stories to present our findings there.) To best represent the experience we’d want to have, all benchmarks were run with the highest graphics settings possible.
To avoid variance from GPU Boost as our GeForce GTX graphics cards heat up, we use multiple runs from each benchmark in quick succession. We select the median value from the last recordings.
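Our take-the-median approach can be sketched in a few lines; the run values below are made-up numbers purely for illustration:

```python
from statistics import median

def summarize_runs(fps_runs: list[float]) -> float:
    """Median average-FPS across repeated back-to-back benchmark runs.

    Unlike a plain mean, the median shrugs off a single outlier run,
    e.g. one inflated by GPU Boost clocks before the card heats up.
    """
    return median(fps_runs)

# Hypothetical runs: the first is inflated by cold-start boost clocks,
# and the median ignores it.
runs = [144.2, 138.7, 139.1]
print(summarize_runs(runs))  # 139.1
```

With an even number of runs, `statistics.median` returns the average of the two middle values, so an odd run count keeps the result tied to an actual recording.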
Test platforms: Intel LGA 1151 (Z370), AMD Socket AM4, Intel LGA 1151 (Z270), AMD Socket SP3 (TR4), and Intel LGA 2066, plus components common to all testbeds.
Entrepreneur Elon Musk has had the official Facebook pages for his Tesla and SpaceX companies deleted.
The #deletefacebook movement has grown after data firm Cambridge Analytica was accused of obtaining the personal information of about 50 million users.
Mr Musk had poked fun at speaker brand Sonos after it said it would suspend advertising on Facebook for one week.
His followers challenged him to have his own companies’ pages deleted, which he did within minutes.
Mr Musk said he “didn’t realise” that his SpaceX brand had a Facebook page. “Literally never seen it even once,” he wrote on Twitter. “Will be gone soon.”
Another follower pointed out that his battery firm Tesla also had a profile on the social network.
“Looks lame,” he replied. Both profiles disappeared within minutes of his posts.
The pages had more than 2.5 million followers each before they were deactivated.
In 2016, Facebook used SpaceX to launch a new communications satellite valued at more than $200m (£150m).
However, the rocket exploded on the launch pad and destroyed the satellite.
After a reporter tweeted that “@elonmusk blew up Mark Zuckerberg’s satellite”, Mr Musk replied: “Yeah, my fault for being an idiot. We did give them a free launch to make up for it and I think they had some insurance.”
He said he would continue to use Facebook-owned Instagram for the time being, but lamented “FB influence is slowly creeping in”.