Windows 12: Everything we know so far

In early 2021, the prospect of a brand-new version of Windows seemed highly unlikely. Microsoft had previously described Windows 10 as “the last version of Windows”, and was still adding new features to it at least twice a year.

However, the company tore up those plans with the release of Windows 11, which went from abstract concept to official announcement in a matter of weeks. The cancellation of Windows 10X influenced Microsoft’s decision, but Satya Nadella and Co had clearly been considering a new desktop OS for a while.

With no assurances regarding Windows 11’s lifespan, it’s logical we’ll see a successor at some point. What’s more, there are suggestions it might not be far away, with Microsoft rumoured to already be working on Windows 12 internally. Here’s everything we know at this early stage.

Will there be a Windows 12?

Most likely, yes. Windows XP and Windows 7 continued receiving updates for 12 and 11 years respectively, while Windows 10 will be a decade old when support ends in October 2025. Windows 8’s four years of mainstream support is the exception here, but that’s primarily due to its overwhelmingly negative reception. If Microsoft continues this trend, Windows 11 would reach end of life sometime between 2031 and 2033. If that’s the case, a new version will need to be available a few years earlier.

However, there are signs Windows 12 could arrive much sooner. A recent Windows Central article suggests a major new version of Windows will be released every three years. Author Zac Bowden even says that development has begun, with Windows 12 “currently in early planning and engineering stages”. It’s even been given an unofficial codename – “Next Valley”.

An earlier article from German tech site Deskmodder also suggested Microsoft was beginning to prepare for Windows 12. The article cited “our information”, although it also referred to a now-deleted tweet from SwiftOnSecurity, who has since revealed it to be a joke:

I have deleted this tweet, which was supposed to be a joke. I apologize for the confusion. pic.twitter.com/0z2MZN22JM

— SwiftOnSecurity (@SwiftOnSecurity) February 20, 2022

When will Windows 12 be released?

So, we have evidence that Windows 12 is on the way. The next question is obvious: when will it be available?

Assuming Microsoft sticks to the three-year update cycle Windows Central’s Zac Bowden reports, Windows 12 would be released at some point in 2024. Nothing more specific has been rumoured at this stage, and making predictions so far out is almost impossible.

For context, Windows 11 was announced in June 2021 and officially released a few months later in October. But a full rollout to all compatible devices took many months, so a similarly staggered launch is likely for Windows 12.

Will Windows 12 be free?

It should be, at least initially. Microsoft offered a free upgrade to Windows 10, and it’s technically still available.

Updating to Windows 11 also won’t cost you a penny, provided your device meets the hardware requirements, and there’s no indication that Microsoft will end this anytime soon. If it does, you may end up paying close to the current Windows 10 asking price (from $139/£119.99).

Once Windows 12 is released, it’ll almost certainly be free for a while. Microsoft will be understandably keen to get as many people onto the new OS as possible.

Will Windows 12 have different hardware requirements?

Probably, but it’s impossible to predict what they might be. While laptops and PCs have retained the same core design for decades, the specs inside them have changed rapidly over the years.

Windows 11’s hardware requirements have proven controversial, but security features such as TPM and Secure Boot are only likely to become more important to Microsoft in the future.

You’ll probably need a recent chip from the likes of Intel, AMD or Qualcomm, but other chipmakers may be popular by then. Expect the current minimums of 4GB of RAM, 64GB of storage and a 720p display to all be increased.

The first-gen Surface Go from 2018 isn’t compatible with Windows 11
Image: Dominik Tomaszewski / Foundry

What new features will Windows 12 have?

As you might expect, we have no idea what new features will be available in Windows 12. At this early stage, it’s likely Microsoft doesn’t know yet either.

With Windows 11 getting new features throughout the year, many of those currently rumoured are likely to arrive well before a brand-new version. There has been some suggestion that the ‘Sun Valley 2’ (Windows 11 was initially codenamed Sun Valley) update will turn out to be Windows 12, but it’s much more likely to be the 22H2 update.

However, Deskmodder is suggesting Windows 12 will be built from the ground up, rather than being based on previous versions. That’s what we saw with Windows 10X, before many features ended up being incorporated into Windows 11.

Windows 10X’s Start menu will look familiar to all Windows 11 users
Image: Microsoft

This opens up the possibility of a radically different design, although big changes might not prove popular with Windows’ huge user base. Indeed, Windows Central’s Zac Bowden said in an August 2022 video that he’d be “shocked if they did a Windows 8-style change, however, I wouldn’t write it off”.

But if foldable PCs take off in a big way, Bowden said “we’d have to see with Windows 12 lots of enhancements to the Windows design and UX”. Microsoft may yet release a foldable alternative to the cancelled Surface Neo, but Windows 11 doesn’t currently cater well to foldables or tablets, which raises the prospect of a dedicated tablet mode. That mode was ditched with the introduction of Windows 11, but could return in Windows 12.

Aside from that, minor improvements are most likely. Making Windows 12 stable and mostly bug-free will probably be the priority, although that’s only speculation at this stage.

The original Windows Central article which hinted at the 2024 release date didn’t reveal any concrete new features either. But it did suggest that new features would be added to future versions of Windows every few months, potentially as frequently as four times a year. They’re known internally as “Moments”, but author Zac Bowden says that branding might not make its way into the public version.

Until more is known, check out our extensive coverage of Windows news on Tech Advisor, across both Windows 10 and Windows 11.


The best graphics cards for PC gaming: Great deals before next-gen GPUs arrive

Most people who are in the market for a new graphics card have one primary question in mind: Which card will give me the most bang for my buck? Obviously, the answer will vary depending on your budget. Beyond that, there are a number of factors to consider: Raw performance is important, but so are things like noise, the driver experience, and supplemental software. And do you want to pay a premium to get in on the bleeding edge of real-time ray tracing?

Let us make it easy for you. We’ve tested nearly every major GPU that’s hit the streets over the past couple of years, from $100 budget cards to $1,800 luxury models. Our knowledge has been distilled into this article—a buying guide with recommendations on which graphics card to buy, no matter what sort of experience you’re looking for.

And yes, you can finally buy a GPU again. After more than two years of a brutal graphics card crunch spurred by chip shortages and an insane cryptocurrency surge, the dam has finally burst. The end result? GPU prices are plummeting across the board, with higher-end graphics cards seeing especially steep sales. While you can buy a used GPU for less cash, picking up a new model with a full warranty and no risk is a lot more enticing now that prices are approaching sanity.

Rumors of next-gen Nvidia GeForce RTX 4000-series graphics cards abound, and AMD has publicly said that its new RDNA 3-based Radeon GPUs will launch later this year. A new contender, Intel, also appears ready to release its debut Arc GPUs later this summer, gunning for the RTX 3060 (at least in newer games) with the world’s first game-changing AV1 GPU encoder for streaming in tow. But if you need a new graphics card today, here are your best options. Street pricing for these cards still fluctuates wildly, and these rankings take real-world costs into account—which currently give AMD’s Radeon GPUs an edge.

Note: There are customized versions of every graphics card from a host of vendors. For example, you can buy different GeForce RTX 3080 models from EVGA, Asus, MSI, and Zotac, among others.

We’ve linked to our complete review for each recommendation, but the buying links lead to models that hew closely to each graphics card’s MSRP. Spending extra can get you hefty out-of-the-box overclocks, beefier cooling systems, and more. Check out our “What to look for in a custom card” section below for tips on how to choose a customized card that’s right for you.

The best graphics cards for PC gaming

AMD Radeon RX 6500 XT – Best budget graphics card


Prices may be relaxing, but the much-maligned Radeon RX 6500 XT is still the only semi-reasonable sub-$250 option around. Nvidia’s GeForce RTX 3050 is a much more capable modern graphics card if you can find it for a good price, but its pricing is typically inflated at around $300. The Radeon RX 6500 XT is less appealing thanks to its nerfed memory, PCIe lanes, and limited ports, not to mention lower performance, but you can often find it going for around $200 these days. Those hardware limitations mean you’ll need to stick to Medium or High graphics settings at 1080p resolution in modern games to achieve playable frame rates, but if you do, you’ll enjoy the experience.

Read our full Radeon RX 6500 XT review

AMD Radeon RX 6600 – Best 1080p graphics card


AMD’s Radeon RX 6600 and Nvidia’s rival GeForce RTX 3060 both ostensibly carry the same $329 MSRP, but on the streets, there’s a much wider gap. You can find the 6600 going for prices starting around $300, while the cheapest RTX 3060 begins at $400. Those are both steep entry costs for 1080p gaming—at least compared to the GPUs of yesteryear—but with 8GB of fast GDDR6 memory, insanely good power efficiency, and AMD’s Radeon Super Resolution in tow, the Radeon RX 6600 is a great graphics card for people looking to game at 1080p resolution at 60fps or higher without compromising on visual fidelity. (Or breaking the bank.)

Read our full Radeon RX 6600 SWFT 210 review

Nvidia GeForce RTX 3050 – Best 1080p graphics card for ray tracing


Nvidia is on its second generation of dedicated ray tracing hardware, and its killer DLSS upsampling feature is available in hundreds of games to claw back the steep performance cost of turning on ray tracing. At $330-plus, it ain’t cheap, but if enabling those cutting-edge lighting effects is a priority, you’ll want to go with GeForce. The RTX 3060 is another solid option, but it’s $400 on the streets and delivers performance on par with the $300 Radeon RX 6600 in games that don’t use ray tracing.

Read our full GeForce RTX 3050 review

AMD Radeon RX 6700 XT – Best 1440p graphics card


In a sane world, Nvidia’s GeForce RTX 3060 Ti would dominate 1440p gaming at its $400 MSRP. It’s that good, and it offers superior ray tracing performance to AMD’s Radeon rivals. But we still aren’t living in a sane world: the RTX 3060 Ti is going for $500+ on the streets, and often $550 to $600. Nvidia’s RTX 3070, ostensibly $500, goes for $650 to $700 online. Get AMD’s Radeon RX 6700 XT instead. It’s plenty fast for 1440p gaming at 60fps+ without compromise, while its beefy 12GB of GDDR6 memory provides plenty of headroom for flipping on all the most intense graphical features. The one downside? AMD’s card is only capable of playing ray-traced games at 1080p resolution unless you activate Radeon Super Resolution, or FSR 1 or 2 in games that support it. On the flip side, the Radeon RX 6700 XT can take advantage of AMD’s awesome performance-boosting Smart Access Memory feature if you’re running a modern Ryzen system that supports it.

Read our full Radeon RX 6700 XT review

Nvidia GeForce RTX 3060 Ti – Best 1440p graphics card for ray tracing


Yes, the RTX 3060 Ti remains overpriced compared to its MSRP, going for $500+ rather than the expected $400—but that’s because it’s that good. If you want a killer 1440p gaming experience with top-notch ray tracing as the cherry on top, this is the card to buy even at an inflated price. The step-down GeForce RTX 3060 is also worth considering, though you may need to turn down some graphics settings when you enable ray tracing, while the step-up RTX 3070 doesn’t deliver enough of a performance boost to justify spending yet more.

Read our full GeForce RTX 3060 Ti review

Nvidia GeForce RTX 3080 Founders Edition – Best 4K graphics card


If you’ve got a 4K monitor and want to put all those pixels to work, the RTX 3070, 3070 Ti and AMD’s Radeon RX 6750 XT and RX 6800 are all decent cheaper options. But if you want the best possible experience without any visual compromises, spend $800 and pick up the 10GB version of the RTX 3080. (The $1,000 12GB model isn’t worth the upcharge despite being slightly more future-proof.) The GeForce RTX 3080 packs enough power to blow through games even at 4K resolution with eye candy cranked, including ray traced games thanks to Nvidia’s killer combo of second-gen ray tracing hardware and DLSS.

The one problem? This is an insanely popular GPU, and it can still be difficult to find models around the 3080’s $800 MSRP (though they’re definitely popping up). AMD’s rival Radeon RX 6800 XT is easier to find, just as fast, and packs a whopping 16GB of GDDR6 memory, but Nvidia’s superior ray tracing and DLSS chops earn it the nod for this price point if you’re able to find one around MSRP.

All that said, new GPU generations from both Nvidia and AMD are expected before the end of the year, and when those launch, paying MSRP for the two-year-old RTX 3080 or any of its rivals may sting, since new graphics families usually demolish the performance of last-gen’s high-end GPUs for a similar price. Consider whether you want to hop on board now, or risk waiting a few months to see what’s brewing.

Read our full GeForce RTX 3080 review

AMD Radeon RX 6950 XT – Best high-end 4K graphics card


Graphics cards that cost $1,000 didn’t use to exist, but now they’re commonplace, with the $1,000 12GB RTX 3080, $1,000 Radeon RX 6900 XT, $1,100 Radeon RX 6950 XT, $1,200 GeForce RTX 3080 Ti, $1,500 GeForce RTX 3090, and $2,000 RTX 3090 Ti all available in this price range.

Their steep price increases don’t translate into a lot of extra performance over the more affordable RTX 3080 or Radeon RX 6800 XT, so we recommend most people stick with those instead. But prices are being slashed rapidly at the high-end, and if you’re looking to splurge, we recommend the Radeon RX 6950 XT for most people.

The Radeon RX 6950 XT is faster than the RTX 3090 for $400 less, and comes with an ample 16GB of memory. Heck, it even surpasses the $2,000 RTX 3090 Ti in performance in some games. If you simply want to play games with face-melting speed and fidelity, the 6950 XT is a killer value that will deliver a superb experience—especially if you use AMD features like Radeon Super Resolution, Smart Access Memory, and FSR. (The same holds true for the Radeon RX 6900 XT, which we’ve seen on sale for less than $900 now that the GPU crunch is letting up.) It isn’t as good as Nvidia’s GPUs at ray tracing, however, so opt for the RTX 3080 Ti instead if you’re a gamer looking to flip on all those cutting-edge lighting effects. Note that the Sapphire Nitro+ Pure model we reviewed is an ultra-luxe enthusiast-class version that costs more, and deservedly so, though you can find other RX 6950 XTs for MSRP.

Read our full Nitro+ Pure Radeon RX 6950 XT review

Nvidia GeForce RTX 3090 – Best high-end 4K graphics card for content creation and ray tracing


If you want some of the best gaming performance on the planet, including ray tracing, and also want to do some work on the side, the $1,500 RTX 3090 is the graphics card to buy. This card works hard and plays hard thanks to a massive 24GB of ultra-fast GDDR6 memory that makes it excel at content creation and machine learning tasks, especially high-res video rendering. And it slings gaming frames with the best of them. The newer RTX 3090 Ti offers slightly faster GPU and memory performance, but costs $500 more, making the non-Ti 3090 the (somewhat) more practical choice. You pay for the privilege either way, with the Radeon RX 6950 XT and GeForce RTX 3080 Ti delivering similar gaming performance for significantly less.

Read our full Nvidia GeForce RTX 3090 review

How we test graphics cards

We test graphics cards on a dedicated test system used only for this purpose, with minimal extra software involved. That ensures that any performance changes we see are generated solely by the graphics card being tested and new GPU drivers, without the variability of other hardware or software changes. Here is the configuration of our current testbed:

  • AMD Ryzen 9 5900X, stock settings
  • AMD Wraith Max cooler
  • MSI Godlike X570 motherboard
  • 32GB G.Skill Trident Z Neo DDR4 3800 memory
  • EVGA 1200W SuperNova P2 power supply
  • 2x 1TB SK Hynix Gold S31 SSD

As far as games go, we use a fixed set of games to test every graphics card that comes out in a given generation, and update the suite when a new generation of GPUs is introduced. We test a variety of games spanning most major game types (tactics, racing, FPS, etc.), engines (Unreal Engine, Unity, Anvil, etc.), and underlying graphics APIs (DirectX 11, DX12, Vulkan).

We use the built-in benchmarks for each game, but only after validating the accuracy of their results against third-party GPU measurement tools like OCAT. Each game is tested at least three times per resolution, generating an average from those runs, with additional tests run if we encounter any hiccups or notice any performance oddities.

Power draw is measured on a whole-system basis, listing both idle and fully stressed states as measured via a Watts Up meter that the system is plugged into.
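
To make the averaging step concrete, here’s a minimal Python sketch of how repeated runs might be aggregated. The function, the 5% tolerance, and the sample figures are illustrative assumptions, not our actual test scripts:

```python
from statistics import mean

def summarize_runs(fps_runs, tolerance=0.05):
    """Average repeated benchmark runs; flag the set for a re-run if any
    single run deviates from the mean by more than `tolerance` (5% here)."""
    avg = mean(fps_runs)
    needs_rerun = any(abs(run - avg) / avg > tolerance for run in fps_runs)
    return avg, needs_rerun

# Three hypothetical 1440p runs of the same built-in benchmark
avg_fps, rerun = summarize_runs([142.1, 140.8, 141.5])
print(f"Average: {avg_fps:.1f} fps, re-run needed: {rerun}")
```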

What to look for in a custom graphics card

If you want to shop beyond the scope of our picks, know that finding the right graphics card can be tricky. Various vendors offer customized versions of every GPU. For example, you can buy different Radeon RX 6700 XT models from Sapphire, XFX, Asus, MSI, and PowerColor.

To help narrow down the options and find the right card for you, you should consider the following things when doing your research:

Overclocks: Higher-priced custom models are often overclocked out-of-the-box to varying degrees, which can lead to higher performance. Most modern custom cards offer the same essential level of performance, however.

Cooling solutions: Many graphics cards are available with custom coolers that lower temperatures and fan noise. The vast majority perform well. Liquid-cooled graphics cards run even cooler, but require extra room inside your case for the tubing and radiator. Avoid graphics cards with single-fan, blower-style cooling systems unless you have a small-form-factor PC or plan on using custom water-cooling blocks.

Size: Many graphics cards are of a similar size, but longer and shorter models of many GPUs exist. High-end graphics cards are starting to sport especially massive custom cooling solutions to tame their enthusiast-class GPUs. Double-check that your chosen graphics card will fit in your case before you buy.

Compatibility: Not all hardware supports a wide range of connectivity options. Higher-end graphics cards may lack DVI ports, while lower-end monitors may lack DisplayPorts. Only the most modern Radeon and GeForce graphics cards support HDMI 2.1 outputs. Ensure your graphics card and monitor can connect to each other. Likewise, make sure your power supply meets the recommended wattage for the graphics card you choose.

Real-time ray tracing, FSR, and DLSS: AMD’s Radeon RX 6000-series graphics cards and all of Nvidia’s RTX offerings can play games with real-time ray tracing effects active. Nvidia’s RTX 30-series GPUs hold a massive advantage over everything else though, propelled even further by dedicated tensor cores for processing machine learning tasks such as Deep Learning Super Sampling, which uses AI to speed up the performance of your games with minimal hit to visual fidelity. GeForce RTX 20-series GPUs also support DLSS, while AMD’s rival FSR 2.0 and Radeon Super Resolution technologies are gaining traction by the day.


Python programming libraries found hiding security threats

Threat actors have been using typosquatting to attack Python developers with malware, researchers have claimed.

Experts from Spectralops.io recently analyzed PyPI, a software repository for Python programmers, and found ten malicious packages on the platform. All of these were given names that are almost identical to the names of legitimate packages in order to dupe developers into downloading, and adopting, the tainted ones.

This type of attack is called typosquatting, and it’s a common tactic among cybercriminals. It’s not used just on code repositories (although we’ve seen numerous instances on GitHub, for example, in the past), but also in phishing emails, fake websites, and identity theft.
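
To illustrate the underlying trick, here’s a hypothetical Python sketch that flags names suspiciously similar to popular packages. The package list, threshold, and function are inventions for demonstration, not a tool used by the researchers:

```python
from difflib import SequenceMatcher

# A handful of popular names a squatter might imitate (illustrative only)
POPULAR_PACKAGES = {"requests", "numpy", "pandas", "urllib3", "pillow"}

def closest_popular_match(name, threshold=0.85):
    """Return the popular package `name` most closely resembles, if any."""
    best, best_ratio = None, threshold
    for legit in POPULAR_PACKAGES:
        ratio = SequenceMatcher(None, name.lower(), legit).ratio()
        if name.lower() != legit and ratio >= best_ratio:
            best, best_ratio = legit, ratio
    return best

print(closest_popular_match("requets"))  # -> requests (one letter off)
```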

Thousands of developers at risk

Should victims adopt these packages, they’d be giving threat actors the keys to their kingdoms, given that the malware enables the theft of both private data and developer credentials. The attackers would then send the data to a third party, with the victims never knowing what happened. As Spectralops notes, PyPI has more than 600,000 active users, suggesting that the potential attack surface is large.

“These attacks rely on the fact that the Python installation process can include arbitrary code snippets, which is a place for malicious players to put their malicious code at,” explained Ori Abramovsky, Data Science Lead at Spectralops.io. “We discovered it using machine learning models which analyze the code of these packages and auto alert on the malicious ones.”
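
That quote is worth unpacking: when a package is installed from a source distribution, pip executes its setup.py. The snippet below is a benign, hypothetical illustration of that execution point, not code from the malicious packages:

```python
# setup.py - a benign illustration of install-time code execution.
# pip runs this file when building a package from a source distribution,
# so anything at module level executes with the installer's privileges
# before the victim ever imports the package. (Pre-built wheels skip
# this step, which is why malicious packages tend to ship as sdists.)
from setuptools import setup

print("This line runs during installation, not at import time")
# A malicious package would place its payload here instead of a print().

setup(
    name="example-package",
    version="0.1.0",
    py_modules=["example"],
)
```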

Here’s the full list of the affected packages: 

  • Ascii2text
  • Pyg-utils
  • Pymocks
  • PyProto2
  • Test-async
  • Free-net-vpn
  • Free-net-vpn2
  • Zlibsrc
  • Browserdiv
  • WINRPCexpoit

The researchers reached out to PyPI which, soon after, removed the malicious packages from its repository. Still, developers that downloaded them in the past are still at risk, and should refresh their passwords and other login credentials, just in case.

“What’s remarkable here is just how common these malicious packages are,” Abramovsky continued. “They are simple, yet dangerous. Personally, once I encountered these types of attacks, I started double checking every Python package I use. Sometimes I even download it and manually observe its code prior to installing it.”


AMD Zen 4: Everything you need to know

It’s hard to believe that AMD released its very first Ryzen CPUs as recently as 2017. They were based on a new Zen architecture, built from the ground up in the five years prior to release. Looking back, this was a defining moment for AMD, and the future of laptop and desktop chips more widely. 

Since then, we’ve seen five more generations of Ryzen chips and three subsequent Zen architectures. The latest of these is the Ryzen 6000 Series, which uses a tweaked version of the existing architecture known as Zen 3+, but includes no desktop CPUs.

Both desktop and laptop CPUs are expected in the Ryzen 7000 Series, which is where the Zen 4 architecture looks set to make its debut. Here’s everything you need to know.

AMD Zen 4 release date

At the Zen 3 reveal in October 2020, AMD Chief Technology Officer Mark Papermaster confirmed that Zen 4 was “on track, in design”, with his presentation accompanied by a roadmap timeline.

Our next official update came in July 2021, when AMD CEO Lisa Su confirmed that Zen 4 was on track to launch the following year. At CES in January 2022, the company was a little more specific – the second half of 2022 was the target.

That target looks set to be met with time to spare. As Wccftech reports, AMD has confirmed it’ll be holding a keynote on 29 August, where the Ryzen 7000 Series is expected to debut, with all signs pointing to it using the Zen 4 architecture.

Those first few CPUs are expected to be reserved for high-end desktops, with the same Wccftech article hinting at a 15 September release date. But there’ll supposedly be only five desktop processors in that initial release, with more expected in 2023. That’s also when we’re expecting Zen 4-based laptop CPUs, as AMD’s earlier official roadmap shows:

AMD roadmap
Image: AMD

AMD Zen 4 devices

Of course, Zen 4’s official release date is expected to coincide with the first CPUs that will take advantage of it. As AMD itself has confirmed, these will be the Ryzen 7000 Series.

PC users regularly turn to AMD chips to update their existing machines, with the main limitation being a compatible motherboard. Moving to the new 5nm process, as indicated in the official roadmap above, will likely mean motherboards using the existing AM4 socket won’t be supported. A new AM5 socket is expected, but that wouldn’t work with AMD’s current A520 and X570 motherboards.

That same Wccftech article reporting the initial launch and release dates has revealed the following specs for the initial Ryzen 7000 Series lineup:

  • AMD Ryzen 9 7950X – 16 cores, 32 threads, 5.5GHz max clock speed, 80MB cache, 105-170W TDP
  • AMD Ryzen 9 7900X – 12 cores, 24 threads, 5.5GHz max clock speed, 76MB cache, 105-170W TDP
  • AMD Ryzen 7 7800X – 8 cores, 16 threads, 5.3GHz max clock speed, 40MB cache, 65-125W TDP
  • AMD Ryzen 7 7700X – 8 cores, 16 threads, 5.3GHz max clock speed, 40MB cache, 65-125W TDP
  • AMD Ryzen 5 7600X – 6 cores, 12 threads, 5.2GHz max clock speed, 38MB cache, 65-125W TDP

The Zen 3-based Ryzen 5000 Series tops out at 16 cores, so a 24-core part would be a big upgrade. More cores don’t always yield performance gains, though, so it remains to be seen how much of an impact this would have. The leaker in question does have history when it comes to component news, but there’s still no guarantee we’ll see a 24-core Zen 4 CPU.

Zen 4 will almost certainly make its way to laptop chips at some point, although we’ll probably be waiting until CES 2023 to see them. Even then, these processors are designed to be integrated into devices, so will be dependent on interest from laptop manufacturers (or OEMs, as they’re often known).

AMD Zen 4 spec news

Ahead of its expected release, we already have a few concrete rumours on what to expect from Zen 4.

As was first reported by Videocardz, the same official roadmap as above describes the architecture as “achieving the pinnacle of gaming performance”. It’ll include the ‘Raphael’ desktop chips, but also ‘Phoenix’ for thin and light gaming and a new ‘Dragon Range’ for ultra-powerful gaming laptops.

According to AMD, you can expect some big gains on CPUs that use Zen 4 architecture:

AMD Zen 4 summary screen
Image: AMD

AMD has also confirmed that it will move to a 5nm process, down from the 7nm found on Zen 3 and 6nm on Zen 3+. This could be a significant move, providing the same amount of power within a smaller footprint.

Indeed, a WikiChip article from March 2020 suggests the move to 5nm could enable TSMC to deliver a density improvement of as much as 87% compared to the 7nm process. TSMC works directly with AMD to produce Ryzen CPUs, so these sorts of gains could make their way into Zen 4-based chips. Transistor density is vital to the performance of a processor, so this could lead to huge gains.

A subsequent post on tech blog Chips and Cheese suggests the performance gain could be as much as 40%, while IPC (instructions per clock) could increase by 25%. The article goes on to say that early samples of AMD’s next-gen EPYC processors show a 29% speed improvement over the current generation, despite having the same number of cores and the same clocks.
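
Those figures line up with the rule of thumb that per-core performance scales roughly with IPC multiplied by clock speed. A quick back-of-the-envelope check, using the rumoured numbers above:

```python
# Per-core performance ~ IPC x clock speed. The 29% EPYC figure reportedly
# came at identical core counts and clocks, so nearly all of it would
# have to come from IPC gains.
ipc_gain = 1.25    # rumoured ~25% IPC uplift
clock_gain = 1.00  # same clocks as the current generation
speedup = ipc_gain * clock_gain - 1
print(f"Implied per-core speedup: {speedup:.0%}")  # 25%, near the 29% claim
```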

AMD has since confirmed a rumour reported by Wccftech – the new AM5 socket will make its debut alongside Zen 4. The new architecture requires a new platform, so this makes sense. Prolific Twitter leaker @ExecuFix has revealed some of AM5’s key specs:

AM5 😏
– LGA-1718
– Dual-channel DDR5
– PCI-e 4.0
– 600 series chipset

— ExecutableFix (@ExecuFix) May 22, 2021

Subsequent tweets suggest that the existing 40×40 mm CPU socket will remain, but that PCIe 5.0 will be reserved for enterprise-level chips. However, at CES 2022, AMD suggested that PCIe 5.0 will be coming to all Zen 4 CPUs, alongside DDR5 RAM.

An earlier Zen 4 leak arrived courtesy of YouTube channel ‘Moore’s Law is Dead’:

The above video consolidates some information that’s already been revealed, suggesting Zen 4 chips will use a 5nm process designed by TSMC. DDR5 RAM support is expected, as well as increasing PCIe 4.0 lanes from 24 to 28.

Key new information includes Zen 4 chips improving IPC (instructions per clock) by around 25% over Zen 3. The architecture will potentially support a 24-core CPU at some point, but it’s not expected to be among the first wave of processors that launch.

However, we may see new high-end processors with more cores – ‘Genoa 7004’ CPUs have been detailed in a leaked roadmap unearthed by Videocardz. These will supposedly come with more than 64 cores and are expected to launch in 2022, before ‘3004’ chips with 32/64 cores debut in Q1 2023.

That was expected to be the top-spec Zen 4 chip you could buy, but a subsequent leak suggests the architecture will support many more cores than that. Prolific CPU leaker @Broly_X1 (whose account has since been deleted) appeared to confirm a 128-core CPU in June 2021, saying: “Wow, ZEN4 is really more than 96 cores. I was skeptical when I first saw this news in Chiphell. Now I can also confirm that ZEN4 is up to 128 cores”.

If true, this will mean Zen 4 supports twice as many cores as the current Zen 3. It’s also expected to double the maximum thread count (256 vs 128). This has the potential to deliver huge performance gains for Zen 4-based CPUs.

We’ll update this article as soon as we know more about Zen 4. There’s already news on its successor, too – check out our guide to the Zen 5 architecture. You may also be interested in learning more about the current Zen 3+ based Ryzen 6000 series CPUs, designed to be integrated into many of the best laptops and other mobile PCs of 2022.

Our Ryzen 7000 Series article runs through everything you need to know about the new CPUs.


Samsung is closing out DDR3 and DDR4 memory to focus on DDR5

Time, like an ever-rolling stream, bears all its sons away. It’s unlikely that dual-channel memory was on the mind of the poet when those words were written, but they remain true nonetheless. As the industry shifts towards newer technologies, prolific memory supplier Samsung is reportedly finding fewer and fewer buyers for the older DDR3 standard. So it’s cutting production dramatically, according to industry paper DigiTimes.

With DDR3 on the way out and demand for next-gen DDR5 memory ramping up, the paper reports (via WCCFTech) that Samsung is cutting prices on DDR3 dramatically while also cutting prices for the popular 4GB DDR4 modules. This comes amidst a general market shift towards cheaper memory as demand falls from pandemic highs. The Wall Street Journal reported that in the second quarter of 2022 alone, prices for DRAM fell almost 11% across the board, a dramatic shift even in such a volatile industry.

That being the case, it makes sense for Samsung and other memory suppliers to de-emphasize older, lower-margin DDR3 chips and even scale back production of DDR4 chips as demand evens out. DDR5, while certainly higher in profit margin, has yet to become the industry standard. Using this lull in the market to seek a dominant position as DDR5 becomes more prevalent certainly makes sense, especially if you’re a megacorp with billions of dollars to throw around.

What does this mean for PC enthusiasts? In the short term, rock-bottom prices for current-gen DDR4 memory and DDR5 memory that’s perhaps more affordable than you might think for cutting-edge tech. We can see similar trends in other parts of the PC hardware market, as prices for solid state drives and graphics cards continue to fall after a period of high demand and supply chain woes.


Logitech G203 LightSync review: A dependable low-cost gaming mouse

At a glance

Expert’s Rating

Pros

  • A comfortable design that resembles more expensive esports mice
  • Sturdy and well-built
  • The RGB lighting really pops

Cons

  • Some sensor lag was experienced with large, fast movements
  • The sensor’s lift-off distance is higher than some mice
  • The RGB logo on the top isn’t customizable

Our Verdict

The Logitech G203 LightSync performs well in games and is both comfortable and affordable. Its RGB lighting adds a welcome splash of color to your gaming den.

The G203 LightSync is the epitome of good value, delivering comfort, style, and performance without a hefty price tag. Its small size and ambidextrous design make it comfortable for all three main gamer grip styles. And while its 8,000 DPI proprietary gaming sensor won’t match the pointer performance you’ll get from more premium gaming mice, it’s still more than capable in just about every casual gaming scenario.

Note: This review is part of our roundup of best gaming mice. Go there for details about competing products and how we tested them.

Logitech G203 LightSync: Design and build

Measuring just 4.59 x 2.45 x 1.5 inches, the G203 is a small, wired, and well-built mouse, with a sturdiness you’re more likely to find in expensive gaming mice. Design-wise it oozes familiarity, featuring the kind of symmetrical, right-handed profile and six-button layout we see in esports mice like the HyperX Pulsefire Haste.

But while the G203’s design aesthetic is familiar, it feels anything but vanilla. Its main point of difference is its rounded back end that pushes up into the base of your palm providing palm grippers a snugger fit than you’ll get with some gaming mice. This equates to more precise pointer control, allowing your wrist to accomplish more on screen, but with smaller movements.

If you’re a fingertip or claw-style gripper, the G203’s small-sized body also works to your advantage, allowing you to easily curl your hand over the mouse’s body, or poise your fingertips right on the trigger without stretching or dragging your palm upwards.

There’s plenty to like about the G203’s right-handed button configuration too. It consists of two main clicks, two macros on the left-hand side, a plastic scroll wheel in the middle, and a DPI preset button just behind it. The central positioning of the buttons makes them somewhat easier to reach than in longer mice.

The Logitech G203 LightSync features an ambidextrous profile with a rounded back end.
Image: Logitech

The button quality is superb too, all six being programmable and feeling quick and responsive when triggered. There’s also a good amount of travel beneath the two main buttons, and they feel clicky – ideal for jitter clicking.

Logitech has thoughtfully incorporated a few other small touches that, given the G203’s budget status, could have been left out, but we’re glad they weren’t since they ultimately make the G203 more enjoyable to use.

One is a well-represented contingent of glide skates on the underside, which includes a bonus sensor ring as well as the obligatory top and bottom skates. These do a great job keeping movement smooth and frictionless, whether you’re using a mouse pad or not. The other nice addition is a lengthy cord, which allows you to sit up to two arms’ lengths away from your PC or laptop without hitting the end of your leash.

All these design pluses add up to make the G203 very ergonomic. One small peeve, however, is the G203’s weight: at 85 grams it’s a little heavy for its size. Does this make a huge difference? If you plan on using it as an esports mouse, it could—otherwise, it’s unlikely you’ll notice a difference.

Logitech G203 LightSync: Software

The G203 uses Logitech’s G Hub software, which is one of the most comprehensive apps available for personalizing gaming mice. On downloading it, my review unit was quickly recognized, and I could simply click through on a visual image of the device to change the mouse’s DPI setting, polling rate, RGB lighting, and also to assign commands and macros to buttons.

Assigning buttons is a little better in G Hub than in some mouse apps since the Assignments menu splits off into helpful submenus that allow you to differentiate between profiles for games or programs. You also get examples of commands you may wish to assign, which takes a lot of the guesswork out of thinking them up yourself.

The ‘Actions’ submenu proved especially useful. It suggests key actions to assign for go-to programs like Overwolf, Discord, and OBS. Here you can configure buttons to, among other things, capture replays and video or take screenshots—three indispensable commands for streamers and game developers. You can also assign commands to the G203’s onboard memory, so you can use it without needing the G Hub software.

For control of the RGB lighting, G Hub lets you personalize three zones in the band at the G203’s back end. Having three zones to play with means you can light up your mouse like a firecracker with dazzling multicolored displays. The LightSync RGB system is a worthy upgrade over the G203’s predecessor, the G203 Prodigy, and left a good impression on me. In fact, if I hadn’t known the G203 cost just $40, I’d have been none the wiser.

The Logitech G203 LightSync’s three RGB lighting zones can be personalized in the G Hub app.
Image: Dominic Bayley / IDG

You can also change the brightness and set various lighting effects and animations. Regrettably, though, Logitech hasn’t made the RGB logo on the mouse’s top one of the configurable zones, so it just displays a fusion of colors from your other selections. While it would have been nice to have independent control of the logo, there’s still plenty of customization available to keep you busy for a long while.

Logitech G203 LightSync: Performance

Budget gaming mice can vary considerably when it comes to their sensors, with resolutions ranging anywhere from 1,200 up to 24,000 DPI. On this scale, the G203’s 8,000 DPI sensor sits towards the lower end, just below near rivals like the Razer Viper Mini and SteelSeries Rival 3. That’s about where performance lies too, which is to say it’s decent, but not perfect.

On the whole, though, the G203’s sensor proved responsive and dependable. It tracked well in a range of games, from first-person shooters to RPGs. However, in games where large, fast movements were necessary, the sensor struggled slightly, showing visible ghosting that would undoubtedly be a disadvantage in esports or serious competitive play.

Additionally, if you’re prone to lifting your mouse, the G203 isn’t your best option, since it tends to have a higher lift-off distance than some other budget gaming mice. These drawbacks aren’t unexpected at the G203’s price, or things you should necessarily worry about for casual gaming, but they may turn off gamers who require a more flawless experience.

What did impress, however, was the G203’s click latency, which is arguably the fastest I’ve seen in a budget gaming mouse. What’s more, the buttons fired off with a satisfying audible click.

Being able to change the DPI settings in smaller increments than competitors like the Rival 3 also proved really useful for targeting in first-person shooters, since I could more precisely find a DPI setting (and hence a targeting sensitivity) for different weapons classes—this being necessary since different weapons/character classes can be subject to different game physics.
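
For anyone curious about the maths behind that, DPI and in-game sensitivity combine into a physical ‘cm per 360°’ figure. Here’s an illustrative calculation; the 0.022 degrees-per-count yaw value is an assumption common to many shooters, not a number from our testing:

```python
def cm_per_360(dpi, in_game_sens, yaw_deg_per_count=0.022):
    """Physical mouse travel required for a full 360-degree turn."""
    counts_for_360 = 360 / (yaw_deg_per_count * in_game_sens)
    inches = counts_for_360 / dpi
    return inches * 2.54

# A small DPI step noticeably changes turn speed at the same in-game setting:
print(f"{cm_per_360(800, 1.0):.1f} cm per 360")  # ~52.0 cm
print(f"{cm_per_360(750, 1.0):.1f} cm per 360")  # ~55.4 cm
```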

The Logitech G203’s DPI settings can be personalized in the G Hub app.
Image: Dominic Bayley / IDG

The mouse’s small design also proved an asset, keeping my hand comfortable and relaxed even after long gaming stints. Ambidextrous mice tend to feel wand-like, allowing you to point them precisely at targets as if you’re pointing a straight stick. This was the case with the G203, its miniature size allowing me to frame up targets a mere few millimeters wide in my field of view, thereby improving my precision.

Conclusion

The Logitech G203 LightSync may be a budget buy, but it dishes up plenty of comfort and precision control in games thanks to its small body and an ambidextrous design that resembles mice in the esports category. While its 8,000-DPI sensor isn’t outstanding, it’s a capable performer in most casual games. Plus, the G203’s RGB lighting adds a welcome touch of color to your gaming den.
