AMD Ryzen 7 1800X CPU Review

The conundrum AMD currently faces started when it launched Bulldozer to lackluster reviews back in 2011. The following years found it trying to right the ship with Piledriver, Steamroller, and Excavator. But it’s safe to say the company’s host processing portfolio never regained its lost luster. Meanwhile, Intel dominated the mobile, desktop, and server markets with a seemingly insurmountable performance lead built on the excellent Sandy Bridge design and an unrelenting cadence of incremental improvements.

Heading into today’s review, there’s no arguing that AMD is far behind by comparison. Wouldn’t that make a comeback all the more impressive, though?

We started seeing Zen micro-architecture teasers last year. Company representatives told us its next generation would usher in incredible performance gains, matching or surpassing Intel’s best efforts on multiple fronts. Zen also promised to revitalize an aging platform through Socket AM4 and new core logic. And AMD says it has a clear path forward planned for future versions of Zen.

At some point, though, the rubber has to meet the road. A good first step was taking aim at a competitor. Intel’s $1000+ Core i7-6900K seemed like an ambitious choice, but early hand-picked benchmark results made AMD’s eight-core engineering samples look formidable. Then, announcing that the flagship model would sell for less than half of the -6900K’s price sent the masses into a frenzy. Most online vendors even sold out of their Ryzen 7 1800X allocation during pre-sales based on little more than AMD’s own endorsement.

Now it’s time for Ryzen to stand on its own and show us what it can do in the real world. We have several Ryzen SKUs in-house, spread across multiple Tom’s Hardware labs. We’ve identified a number of unexpected results that bear continued investigation. We’ll continue updating our coverage as answers materialize. But we want to start putting our findings in front of you so enthusiasts can make more informed buying decisions in the face of general availability, which begins today.

Finding Zen

Four years ago, AMD began its work on the Zen core, which is its first clean-sheet architecture since Bulldozer. AMD’s initial objective was to transition from the 28nm process used for its modern APUs to GlobalFoundries’ 14nm FinFET node, which offers increased performance and density within a similar power envelope. The company also set an ambitious goal to increase instruction-per-clock throughput by 40% over Excavator through a series of design choices that significantly boost performance. Notably, AMD deployed a new architecture and a lithography shrink simultaneously, which is a daunting challenge.

Last year, we published Everything Zen: AMD Presents New Microarchitecture At HotChips. In that story, we stepped through the composition of Zen, from front to back, right up to describing the CPU complex (CCX) responsible for housing four execution cores, each core’s 512KB L2 cache, and 8MB of shared L3 cache. If you aren’t already familiar with Zen and how it differs from prior-gen designs, check that piece out.

Moving forward, you need to know that the Zen core is Ryzen’s fundamental building block. All three SKUs we’re introducing today employ two quad-core CCXes, adding up to 4.8 billion transistors across the entire die. The company says its Infinity Fabric connects the CPU complexes, but remains vague about the interconnect’s quantifiable benefits.

As we established in our architectural deep-dive, AMD also arms Zen with simultaneous multi-threading support, allowing each physical core to operate on two threads in parallel. In theory, this improves the utilization of available hardware resources. A lot of our real-world benchmarks bear that out with phenomenal performance gains. But other workloads expose teething pains we’re still trying to diagnose.

How about the 40% IPC improvement goal AMD set for itself? Well, after factoring in the new micro-op cache (bypassing the L1 and L2 for frequently-accessed micro-ops), the better branch prediction engine, the 1.75x-larger instruction scheduler window, and faster caches, the company cites a +52% final tally compared to Excavator. Naturally, we have our own single-threaded workloads to run and will gladly make comparisons using CPUs from our lab.
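As a back-of-the-envelope check, single-threaded performance scales roughly as IPC times clock frequency, so AMD’s quoted +52% IPC uplift can be translated into an expected speedup at any given clock. A minimal sketch (the clock figures below are illustrative placeholders, not measured results):

```python
# Rough single-thread performance model: perf is proportional to IPC x clock.
# The +52% IPC figure is AMD's own claim vs. Excavator; the example clocks
# are illustrative, not benchmark results.

def perf_ratio(ipc_gain: float, old_clock_ghz: float, new_clock_ghz: float) -> float:
    """Relative single-thread performance of the new core vs. the old one."""
    return (1.0 + ipc_gain) * (new_clock_ghz / old_clock_ghz)

# At identical clocks, +52% IPC is simply a 1.52x speedup...
print(f"Same clock: {perf_ratio(0.52, 3.6, 3.6):.2f}x")
# ...and any clock increase compounds on top of the IPC gain.
print(f"3.6 -> 4.0 GHz: {perf_ratio(0.52, 3.6, 4.0):.2f}x")
```

This is why a large IPC jump matters more than a few hundred MHz: the IPC factor multiplies every clock tier.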

The Ryzen 7 Line-Up

AMD is splitting its newest CPUs into the eight-core Ryzen 7 family, a six-core Ryzen 5 series, and the quad-core Ryzen 3 line-up. Only the Ryzen 7 SKUs are shipping today, but it’s easy to see that AMD is targeting Intel’s Core i7, i5, and i3 portfolios with a similar naming scheme.

Aside from Intel’s eight- and 10-core i7s, the Ryzen 7s deliver higher core counts across the board. The AMD CPUs also blow Intel’s Broadwell-E prices out of the water, though four-core/eight-thread Kaby Lake is generally cheaper (albeit with half as many cores).

It’s not entirely clear what features AMD plans to roll out across the Ryzen 5 and 3 CPUs, but we do know the Ryzen 7s sport the SenseMI suite. We’ll go into more depth on SenseMI shortly. What’s important here, though, is that SKUs with an X suffix include the eXtended Frequency Range capability. XFR automatically increases clock rate beyond the factory-set Precision Boost frequency if you provide additional thermal headroom with an aggressive cooler. This extra bit of speed applies to two of the chip’s cores.

Ryzen 7 is solely a host processor, devoid of integrated graphics. All three models debuting today drop into the Socket AM4 interface, include eight physical cores, and boast 16MB of shared L3 cache. They also sport unlocked ratio multipliers, though you’ll need a motherboard based on the X370, B350, or X300 chipsets to overclock.

The Ryzen 7 1800X features a 3.6 GHz base frequency able to hit 4 GHz under lightly-threaded workloads via Precision Boost technology. Both of those specifications are higher than Intel’s eight-core Core i7-6900K. With Precision Boost enabled, all of the 1800X’s cores can operate at 3.7 GHz. And with enough thermal headroom, two cores jump as high as 4.1 GHz.
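Based on the frequencies AMD quotes for the 1800X, the boost behavior can be thought of as a simple decision ladder: an all-core Precision Boost tier, a higher two-core tier, and an extra XFR step when thermal headroom allows. Here is a hedged sketch of that logic; the selection rules are our reading of AMD’s description, not a documented algorithm:

```python
# Illustrative model of the Ryzen 7 1800X's clock tiers, using the figures
# quoted in this article. The selection logic is our interpretation of
# AMD's marketing description, not actual firmware behavior.

BASE_GHZ = 3.6        # guaranteed base frequency
ALL_CORE_BOOST = 3.7  # Precision Boost with all cores active
TWO_CORE_BOOST = 4.0  # Precision Boost under lightly-threaded workloads
XFR_GHZ = 4.1         # eXtended Frequency Range, needs thermal headroom

def effective_clock(active_cores: int, thermal_headroom: bool) -> float:
    """Approximate clock (GHz) for a given load on the 1800X."""
    if active_cores <= 2:
        return XFR_GHZ if thermal_headroom else TWO_CORE_BOOST
    return ALL_CORE_BOOST

print(effective_clock(8, False))  # heavy all-core load: 3.7
print(effective_clock(1, True))   # light load + aggressive cooler: 4.1
```

The real algorithm reacts to temperature and current in fine-grained steps, but the tiers above capture the published ceilings.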

Perhaps surprisingly, given the comparisons to Intel’s 140W Broadwell-E behemoths, 1800X bears a 95W TDP. If that’s not enough to make you believe AMD has a new lease on life, the $500 price tag should excite professional content creators especially. Of course, if you don’t regularly find yourself running heavily-threaded tasks, Ryzen 7’s value isn’t as pronounced. After all, Intel’s Kaby Lake-based Core i5s and i7s offer solid performance and generally sell for less than the top-end AMD chips. Ryzen 7’s performance in our benchmark suite will have to justify the premium.

The 95W Ryzen 7 1700X’s clock rates drop to 3.4 GHz base and 3.8 GHz under Precision Boost. Those frequencies compare favorably against the 140W Core i7-6800K, which tops out at 3.6 GHz in lightly-threaded tasks and only comes equipped with six cores. Worse, Intel charges $425 for the -6800K, while AMD is introducing Ryzen 7 1700X at $400. The Core i7-7700K also becomes relevant at this point, with its $350 price tag.

AMD’s Ryzen 7 1700 has a 65W TDP, making it the lowest-power eight-core desktop CPU available. A 3 GHz base clock rate and 3.7 GHz Precision Boost ceiling are significantly lower than the frequencies of Intel’s 91W Core i7-7700K. However, the company compensates with twice as many physical cores and a comparable price tag.
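Taking the prices and core counts cited in this review, a quick price-per-core comparison shows why the value argument favors AMD for heavily-threaded work. The figures below are the launch/list prices quoted here; street prices vary:

```python
# Price per physical core, using prices as cited in this article
# ($500 flagship, "$1000+" for the -6900K). Purely illustrative;
# street prices vary.
chips = {
    "Ryzen 7 1800X": (500, 8),
    "Ryzen 7 1700X": (400, 8),
    "Core i7-6900K": (1000, 8),
    "Core i7-6800K": (425, 6),
    "Core i7-7700K": (350, 4),
}

# Sort from cheapest to most expensive per core and print.
for name, (price, cores) in sorted(chips.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name}: ${price / cores:.0f}/core")
```

By this crude metric the 1700X lands at $50/core against $125/core for the -6900K, which is the gap AMD’s marketing leans on. It says nothing about per-core performance, of course, which is what the benchmarks ahead are for.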

Ryzen Memory Support

Configuration                        Speed (MT/s)
Dual-Channel/Dual-Rank/Four-DIMM     1866
Dual-Channel/Single-Rank/Four-DIMM   2133
Dual-Channel/Dual-Rank/Two-DIMM      2400
Dual-Channel/Single-Rank/Two-DIMM    2667
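In other words, the officially supported speed depends on how many DIMMs you populate and whether they are single- or dual-rank. A small lookup sketch of the table above (assuming the Single-Rank/Two-DIMM figure is the standard DDR4-2667 speed grade):

```python
# Max officially supported DDR4 speed (MT/s) for Ryzen at launch,
# keyed by (rank type, DIMMs populated). All configurations are
# dual-channel; the top entry assumes the standard DDR4-2667 grade.
MAX_SPEED = {
    ("dual", 4): 1866,
    ("single", 4): 2133,
    ("dual", 2): 2400,
    ("single", 2): 2667,
}

def max_memory_speed(rank: str, dimms: int) -> int:
    """Return the highest officially supported speed for a config."""
    return MAX_SPEED[(rank, dimms)]

print(max_memory_speed("single", 2))  # fastest supported config: 2667
```

The takeaway for buyers: two single-rank DIMMs get you the highest officially supported speed; fully populating all four slots with dual-rank modules drops support all the way to 1866.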

The six-core/12-thread Ryzen 5 family should surface in Q2, and include at least two models. The Ryzen 5 1600X will feature a 3.6 GHz base and 4 GHz Precision Boost ceiling, while the 1500X is expected to start at 3.5 GHz and ramp up to 3.7 GHz in lightly-threaded workloads. AMD hasn’t shared cache configurations yet for those models. Ryzen 3s are also in the queue, though those aren’t expected until the second half of 2017.

AMD geared its pricing structure to target the 99% of enthusiasts it says buy CPUs priced under $500. If Ryzen 7 is successful, the stage is set for even more disruption in the mid-range and low-end segments as well. So, does Ryzen begin its life on stronger footing than Bulldozer? Let’s find out.

MORE: Best CPUs

MORE: Intel & AMD Processor Hierarchy

MORE: Everything Zen: AMD Presents New Microarchitecture At HotChips

MORE: Intel Kaby Lake Core i7-7700K, i7-7700, i5-7600K, i5-7600 Review

MORE: Broadwell-E: Intel Core i7-6950X, 6900K, 6850K & 6800K Review

Go to Source

AMD’s official gaming benchmarks show just how fast Ryzen is against Intel

Fancy having a look at some comparative gaming benchmarks pitting AMD’s new Ryzen processors against Intel rivals? Then your luck’s in, because AMD has released a whole slew of official results to feast your peepers on.

In 4K gaming, AMD pitted its Ryzen 7 1800X against Intel’s Core i7-6900K, in a system which had 16GB of RAM and an Nvidia Titan X graphics card (the latest Pascal offering).

Average frame rate scores were, overall, very close across a range of games. AMD sneaked a win in Battlefield 4 with 72 fps (frames per second) to Intel’s 70 fps, and it was a similar story in Ashes of the Singularity (54 fps versus 53 fps). Doom (using Vulkan) was more of a victory for the AMD processor, which achieved 81 fps compared to 75 fps for the 6900K.

Intel’s processor just edged the GTA V benchmark, though, with 123 fps compared to 122 fps for AMD, and the Core i7 was a clear winner in Civilization VI, hitting 83 fps versus 69 fps.

Using 99th percentile frames per second benchmarks – i.e. the frame rate sustained for all but the slowest 1% of frames, which better reflects overall smoothness than a simple average – gave similar results, but tipped things a little more in AMD’s favor when it came to Battlefield 4 (54 fps beat Intel’s 44 fps).
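For readers unfamiliar with the metric: 99th percentile fps is derived from per-frame render times, reporting the frame rate the system sustains for all but the slowest 1% of frames. A minimal sketch of how it is computed (the frame times below are synthetic, purely for illustration):

```python
# Compute average and 99th percentile fps from per-frame render times.
# The sample frame times are synthetic, purely for illustration.

def percentile_fps(frame_times_ms, pct=99):
    """fps corresponding to the pct-th percentile (slowest) frame time."""
    times = sorted(frame_times_ms)
    # index of the frame time that pct% of frames render faster than
    idx = min(len(times) - 1, int(len(times) * pct / 100))
    return 1000.0 / times[idx]

# 99 smooth frames at 14ms plus one 25ms stutter frame
frames = [14.0] * 99 + [25.0]
avg_fps = 1000.0 / (sum(frames) / len(frames))
print(f"average: {avg_fps:.0f} fps")                  # ~71 fps
print(f"99th pct: {percentile_fps(frames):.0f} fps")  # 40 fps, exposing the stutter
```

This is why reviewers favor the percentile figure: one stutter barely moves the average but shows up clearly in the 99th percentile number.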

1440p prowess

As for 1440p gaming, AMD pitted the Ryzen 1700X against Intel’s Core i7-6800K in a system which had 16GB of RAM and an Nvidia GTX 1080 graphics card.

The Intel processor had a slight lead across most benchmarks, but again it was a very close run thing. In favor of Intel, the average frame rate in GTA V was 145 fps versus 139 fps, with Battlefield 4 witnessing 115 fps beat 111 fps, and Ashes of the Singularity was 56 fps edging out 55 fps.

AMD’s chip did take one benchmark though – Doom (Vulkan) hit 127 fps on the 1700X compared to 123 fps. 99th percentile fps benchmarks were a similar story, save for one exception: GTA V swung heavily in AMD’s favor with 71 fps leaving Intel’s 52 fps in the dust.

When the Ryzen 1700 was matched against the Core i7-7700K (in a PC with 16GB RAM and a GTX 1070) it was a similar story of the former pretty much keeping up with the latter. With average frame rates, both processors hit 44 fps in Ashes of the Singularity, although Intel had the clear lead in GTA V with 190 fps beating out 167 fps.

AMD outdid Intel in a couple of games, though. Battlefield 4, for example: 94 fps played 93 fps – and in Doom (Vulkan) AMD achieved a narrow victory by 101 fps to 98 fps.

Of course, we have to bear the pricing in mind here, particularly when it comes to AMD’s flagship 1800X, which retails at $499 (£489 in the UK, around AU$790). Compare that to Intel’s Core i7-6900K which it was benchmarked against – that chip runs to around the $1,000/£1,000 mark (around AU$1,600).

TechRadar’s tests

Here at TechRadar Towers, we’ve also been performing our own gaming benchmarks comparing the Ryzen 1800X with Intel’s Core i7-7700K. These tests were run at 1440p resolution with maximum detail levels, on a PC that consisted of an Asus Crosshair VI Hero AM4 motherboard, 16GB of RAM and an Nvidia GTX 1080.

In Total War: Attila, the Ryzen chip achieved 39 fps, only lagging a touch behind the Core i7-7700K which managed 41 fps. It was a similar story in Far Cry Primal, with the AMD processor racking up 75 fps compared to 77 fps for Intel.

In both Rise of the Tomb Raider and Deus Ex: Mankind Divided, the CPUs scored identically, hitting 43 fps and 13 fps respectively. (Note that Deus Ex is the exception here, being run only at ‘very high’ detail and suffering badly from poor optimization in Nvidia’s drivers.)

Compared to AMD’s 4K benchmarking of the 1800X versus the 6900K, this is a much more favorable comparison for Intel on the cost-effectiveness front, given that the Core i7-7700K is priced at around $340 or £300 in the UK (about AU$485).


Soundpeats Q16 review

When you’re working out and want to listen to your music without annoying those around you, a comfortable but secure pair of wireless earphones is a must. And that’s exactly what you get with the Soundpeats Q16. Also see: Best sport earphones 

Soundpeats isn’t one of the better-known brands in the headphone market, but that’s a good thing for you as it means they won’t cost you the earth. You can buy these wireless earbuds from Amazon UK for just £36.99 ($49.99 from Amazon US).

The sound quality from the Soundpeats Q16 is really very good. Audio is clear and easy to make sense of, with little interference from background noise. Wearing these wireless earphones we were able to switch off the outside world and concentrate on the task at hand.

We like the design, which is both sporty and stylish. The Q16’s ear hooks are flexible and, although plastic, don’t feel uncomfortable against the ear. In our experience the earphones remained securely in place throughout our workout – something that cannot be said for most wireless earphones on the market. The company’s logo is found on the front of each earpiece, but it doesn’t look offensive.

A small zip-up carry case comes in the box with the Soundpeats Q16, which helps you to keep together the individual earphones and charging cable. Each wireless earbud charges over Micro-USB separately, but a twin-pronged cable is supplied so you can charge both from a single USB charger. This is a flat and reasonably short cable, so you shouldn’t experience any issues keeping it free from tangles. Also see: Best budget headphones

As you might expect a range of silicone tips are included in the case, helping you to get the most comfortable fit for your ear canal.

You’ll find the controls on both earpieces, with two buttons for adjusting the volume or track up and down and a central multifunction button that does different things depending on how long you press it. This also doubles as a power button.

An LED is found at the bottom, alongside a mic and covered Micro-USB input.

Read next: Best headphones

Soundpeats Q16: Specs

  • Wireless sport earphones
  • Bluetooth 4.2
  • 33 feet working range
  • 6-hour battery life
  • 1-2-hour charging time
  • 48x40x32mm
  • 24g
  • 12-month warranty

OUR VERDICT

The Soundpeats Q16 are a great pair of wireless sport earphones. They come at an excellent price, yet offer decent comfort and audio quality.



Nintendo Switch cartridges 'taste so bad'

Cartridges for the Nintendo Switch console taste foul because of a “bittering agent” intended to prevent them from being accidentally swallowed.

The discovery was made after gamers noticed the repellent flavour.

“I can still taste it. Do not try this at home,” tweeted games writer Jeff Gerstmann last week.

However, other gamers have since posted videos online of their reactions to tasting the cartridges and Nintendo has confirmed the use of a chemical agent.

Cartridges for the Nintendo Switch, which is released worldwide on 3 March, measure 34mm by 23mm (about 1.3in by 0.9in).

Nintendo revealed a non-toxic bittering agent, denatonium benzoate, had been applied to the game card, in a statement to video games site Polygon.

This was “to avoid the possibility of accidental ingestion”, the statement added.

Denatonium benzoate has an especially bitter taste and is commonly added to products such as paint to deter people from consuming them.

However, news that the cartridges are intended to taste disgusting has not discouraged some from licking them.

“Oh, it’s so… God… it’s so awful,” said one YouTuber.

Readers are advised not to try tasting Nintendo Switch cartridges at home.


Mobile World Congress 2017 offered a touch of tech optimism

This year’s Mobile World Congress in Barcelona offered all the delights I’ve come to expect — the gracious hosts, waiters, cops, beautiful venues and a great subway system.

But something about the MWC exhibits and the international workers manning those booths this year was a little different, almost odd, when compared with my previous six visits. That oddness probably had a good bit to do with international politics and the young presidency of Donald Trump, which will inevitably affect the tech and wireless industries, if not every other business sector and worker.

“What’s the world of Trump like?” a woman asked me at a booth of telecom companies from Greece. “We know he hates us all, not just the Greeks.”

“I don’t want to talk about Trump,” a visitor from China told me sternly on my morning walk into a hilly park called Montjuic. I hadn’t even mentioned Trump, but the man from China knew I was American.

Almost every vendor I met on the show floor seemed to either avoid the Trump ascendency entirely or was deeply curious to know the pulse of America. I tried to stay busy and on schedule, so I shrugged off the Trump queries and got back to the business at hand: When will smart city tech catch on broadly? Why did Samsung wait to release its Galaxy S8 smartphone? How has Huawei of China grown to become the third-largest smartphone maker in the world in a few short years? Will 5G wireless be widely deployed in the U.S. by 2020?

Kevin Burden, an analyst at 451 Research and longtime friend, made an insightful comment near the end of the conference. He said 2016 wasn’t a great year for many telecom companies, so many of their booths featured exhibits on emerging technologies in an attempt to find new lines of business. Visitors saw smart city tech at Verizon and AT&T, for example, and even augmented reality and virtual reality displays, he said.

Both wireless carriers also showed how they are now content providers, as well. AOL news anchor Katie Couric talked on a large screen in a Verizon theater. Executives from both carriers boasted about their road to 5G, with its greater bandwidth. (I was assured it will be rolled out by 2020, but how widespread is another question.) As wireless competition in the U.S. has grown, wireless service revenues are not growing as fast as in years past, putting a premium on all manner of new technologies for growth.

The story of Huawei’s rise to global prominence was put dramatically on display when hundreds of journalists cursed and pushed at the start of the weeklong trade show to get inside a venue to see the company’s new, higher-priced P10 smartphone. Burden noted that Huawei got its start by selling low-cost phones, a strategy that seems to have been picked up by both Lenovo with its Moto phones and HMD with its Nokia phones.

But Huawei will be challenged in the U.S., Burden noted, partly because of the difficulties of taking on successful companies like Samsung and Apple that are willing to take on newcomers with lawsuits over patent infringement and other tactics.

Some of the most promising technologies I saw at MWC came from city government IT officials who have developed smart city concepts and pilots. The chief technology officer of Barcelona mentioned the city has a portal to allow citizens to report government corruption. The deputy CIO of Moscow said the city has a pilot project underway to use artificial intelligence to detect lung cancer with 97% accuracy through computer analysis of CT scans.

Some future technology envisioned by cities will eliminate jobs, some of the city officials admitted. It was reassuring to hear Barcelona CTO Francesca Bria speak of her city’s desire to include computer-based initiatives to help retrain the work force in coming years.

What is most gratifying about attending a show like MWC is how it demonstrates that technology can provide answers for many looming global concerns, including sufficient energy, water and food supplies. A great example of tech’s promise came from an interview with Thomas Engel, manager of John Deere’s enterprise innovation strategy.

Deere’s electric tractor

Deere demonstrated an electric tractor prototype at an agriculture trade show in Paris the same week as MWC in Barcelona. Engel also described how Deere has been making tractors and harvesters that operate with automatic guidance systems for more than a decade. Deere’s GPS and sensor-based technology is designed to make sowing and reaping of crops more efficient, partly by making each pass of a tractor more precise.

Over a large field on enormous farms in eastern Europe, Engel said, a tractor might travel 30 minutes in one direction before turning around to make another pass. On such a large field, inches can make a difference in crop yield. Deere is trying to eliminate errors that could become bigger when multiplied by many farms over many years, Engel said.
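To see why inches matter at that scale, consider the overlap arithmetic: any steering error forces adjacent passes to overlap slightly, and that double-treated strip repeats on every pass across the field. A rough worked example (the field dimensions, implement width, and error figure are all hypothetical):

```python
# Hypothetical illustration of how small guidance errors compound.
# A steering error forces adjacent passes to overlap by that margin,
# and the double-treated strip repeats on every pass.

def overlap_loss_hectares(field_width_m, pass_length_m,
                          implement_width_m, overlap_m):
    """Area double-treated per field due to pass-to-pass overlap."""
    passes = field_width_m / implement_width_m
    return passes * pass_length_m * overlap_m / 10_000  # m^2 -> hectares

# 1 km x 1 km field, 12 m implement, 10 cm of overlap per pass
loss = overlap_loss_hectares(1000, 1000, 12, 0.10)
print(f"{loss:.1f} ha double-treated per full coverage of the field")
```

Even a 10cm error wastes the better part of a hectare of seed and chemicals per square kilometer, every season, which is the inefficiency centimeter-level GPS guidance is meant to eliminate.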

Farmers are still riding in the cabs of large tractors and combines, even though they aren’t really needed for the guidance, he said. “An autonomous tractor with no driver and no cab is technically feasible,” he said, but there’s not been a strong case made that doing so would create significant savings.

Deere is also testing other technology, including robots to cultivate vegetable fields, Engel said. In some parts of the world, robots could be needed because field laborers are in short supply, he said.

Infrared sensors are also being used to detect protein levels in harvested grain to derive a more accurate count of a crop’s nutritional value.

All of Deere’s tech ideas stem from a set of forecasts about global population growth that will require food production to double by 2050, Engel said.

“It’s important to increase yields through precise farming,” he said. “Technology is a key to get there. It’s a race.”

That sounded optimistic in an unsettled world.

To express your thoughts on Computerworld content, visit Computerworld’s Facebook page, LinkedIn page and Twitter stream.


Samsung Galaxy Book vs iPad Pro

It’s been a while since we’ve seen any new tablets from Samsung, and now three have arrived at once. The Galaxy Book is perhaps the most interesting, since it’s (another) rival to Microsoft’s Surface Pro 4. Lenovo and other manufacturers also have Surface Pro alternatives, but here we’re comparing the Galaxy Book to another rival: Apple’s iPad Pro.

Two of Samsung’s new tablets are Galaxy Books and, like the iPad Pro, this means a choice of screen size: 10.6in and 12in. Currently, the iPad Pro is available with a 9.7in or 12.9in screen, but rumour has it that Apple is about to introduce a 10.5in iPad Pro, plus an updated version of the 12.9in iPad Pro (which is now well over a year old). For more, see iPad Pro 2 rumours.

Our brief comparison here is based on our extensive use and testing of the iPad Pro and our short time with the new Galaxy Books at MWC 2017. You can read our hands-on review of the Galaxy Book, and our in-depth iPad Pro review.

See also: Galaxy Book vs Surface Pro 4 

Samsung Galaxy Book vs iPad Pro: Price

Frustratingly, Samsung is yet to reveal any prices for the Galaxy Book, so it’s impossible to know how it will compare with Apple’s pricing.

Like Microsoft, Apple charges separately for the keyboard case, but the Galaxy Book comes with one in the box – plus an S Pen stylus. If you want one of those for your iPad Pro, that’s an extra £99. Apple’s Smart Keyboard cover costs a whopping £169. The 9.7in version isn’t much cheaper at £149.

Samsung Galaxy Book vs iPad Pro: Software

Unlike Samsung’s third new tablet – the Tab S3 – which runs Android, the Galaxy Book runs Windows 10. This means you can install desktop apps such as Photoshop and Office and have as many windows open as you like.

While the iPad Pro does have a few software features aimed at productivity, such as the ability to run two apps on screen at once, it can’t really compete with a full-blown Windows tablet if you want to use it primarily for work.

Samsung Galaxy Book vs iPad Pro: Connectivity

The 12in Galaxy Book has two USB-C ports, which means it’s quite versatile in terms of what you can connect to it. Right now, though, USB-C is a bit of a pain, as you need adapters to attach standard USB devices such as a mouse. But in years to come, the ports will be very welcome as peripherals switch to the new standard.

The iPad Pro has a Smart Connector, but its single Lightning port also needs an adaptor if you want to attach it to an HDMI screen or something else.

Of course, both tablets have Wi-Fi and Bluetooth for wireless peripherals, so you can print and do many other things without needing to connect any cables at all. Thanks to AirPlay, you can send video wirelessly from an iPad Pro to an Apple TV, and audio to an AirPlay speaker.

Samsung Galaxy Book vs iPad Pro: Core specifications

                Samsung Galaxy Book 12                 Apple iPad Pro 12.9
OS              Windows 10 (version not specified)     iOS 10
Display         12in Super AMOLED, 2160×1440, 216ppi   12.9in, 2732×2048, 264ppi
Processor       7th Gen Intel Core i5, 3.1GHz          Apple A9X
RAM             4GB / 8GB                              4GB
Storage         128GB / 256GB                          Up to 256GB
Ports           2x USB-C                               Lightning
Video output    Via USB-C                              Via Lightning adaptor
Card reader     MicroSD                                None
Wi-Fi           802.11ac                               802.11ac
Cellular        LTE model available                    LTE model available
Bluetooth       4.1                                    4.2
Front camera    5Mp                                    1.2Mp
Rear camera     13Mp                                   8Mp
Keyboard        Pogo Keyboard case included            Smart Keyboard not included
Stylus          S Pen included                         Apple Pencil not included
Dimensions      291.3×199.8×7.4mm                      307.7×220.6×6.9mm
Weight          754g (tablet only)                     713g (723g LTE model)

Samsung Galaxy Book 12: Specs

  • Windows 10
  • 12in Super AMOLED display, 2160×1440, 216ppi
  • 7th Gen Intel Core i5 processor, 3.1GHz
  • 4GB / 8GB
  • 128GB / 256GB
  • 2x USB-C
  • Video output via USB-C MicroSD reader
  • 802.11ac Wi-Fi
  • LTE model available
  • Bluetooth 4.1
  • 5Mp front camera
  • 13Mp rear camera
  • Pogo Keyboard case included
  • S Pen included
  • 291.3×199.8×7.4mm
  • 754g (tablet only)

OUR VERDICT

It’s hard to say which is the best since Samsung has not yet announced pricing for the Galaxy Book. However, the bottom line – as in any comparison between a Windows 10 tablet and an iPad – is that you’re really choosing between iOS and Windows. That’s the fundamental difference, and what will determine which tablet is best for you. The iPad Pro is the better device for entertainment and for use as a tablet, while the Galaxy Book will be the better choice for productivity, especially if you want to install desktop Windows apps and use a keyboard and mouse.

