Marvel’s Guardians of the Galaxy system requirements are out of this world

Marvel’s Guardians of the Galaxy is out in a week, so there’s not long to wait now, and ahead of its release the PC system requirements have been made available.

The spec has just been posted on Steam, with one of the most eye-opening elements here being the amount of drive space which will be eaten up by Guardians of the Galaxy. To be precise, you’ll need 150GB free to install the game.

The full minimum and recommended requirements to run Guardians of the Galaxy on PC are as follows…

Guardians of the Galaxy minimum system requirements

  • Requires a 64-bit processor and operating system
  • OS: Windows 10 64-bit Build 1803
  • Processor: AMD Ryzen 5 1400 / Intel Core i5-4460
  • Memory: 8GB RAM
  • Graphics: Nvidia GeForce GTX 1060 / AMD Radeon RX 570
  • DirectX: Version 12
  • Storage: 150GB available space

Guardians of the Galaxy recommended system requirements

  • Requires a 64-bit processor and operating system
  • OS: Windows 10 64-bit Build 1803
  • Processor: AMD Ryzen 5 1600 / Intel Core i7-4790
  • Memory: 16GB RAM
  • Graphics: Nvidia GeForce GTX 1660 Super / AMD Radeon RX 590
  • DirectX: Version 12
  • Storage: 150GB available space

Analysis: Storage troubles ahead perhaps, but smooth sailing elsewhere

The big ask, as mentioned, is that 150GB stipulation for drive space, which is a seriously hefty chunk. It’s bigger than a lot of demanding contemporary games, and could cause some concern for those who have a smaller SSD in their system (or a fairly full solid-state drive, for that matter).

Indeed, Guardians of the Galaxy demands as much room on your drive as Microsoft Flight Simulator, which also takes 150GB (with full updates on board) given its hefty wedge of mapping data.

Other games which require 150GB include Red Dead Redemption 2, and Stalker 2 when it emerges next year. So, while these kinds of demands are far from unheard of, it’s unexpected to see Guardians of the Galaxy being so hungry for drive space, and grumbling about the lack of priority given to optimizing installation sizes has predictably spilled forth online. (Sadly, there are doubtless a ton of issues and bug fixes that must be handled ahead of tweaking for space, and game development deadlines are often tight, to say the least.)
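If you want to sanity-check your drive before hitting install, a few lines of script will do it. Below is a rough Python sketch (not an official tool from the game or Steam); the drive path is an assumption you would swap for wherever your game library lives, and the 150GB figure comes straight from the listed requirement.

# Rough sketch: check whether a drive has the 150GB the game asks for.
# Purely illustrative - the drive path below is an assumption, not a default.
import shutil

REQUIRED_GB = 150  # from the published system requirements

def has_enough_space(path="C:\\", required_gb=REQUIRED_GB):
    """Return True if the drive containing `path` has at least `required_gb` free."""
    free_gb = shutil.disk_usage(path).free / (1024 ** 3)
    print(f"{path} has {free_gb:.1f}GB free (need {required_gb}GB)")
    return free_gb >= required_gb

if __name__ == "__main__":
    if not has_enough_space():
        print("Time to clear some room first.")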

Otherwise, though, the spec requirements are easy-going enough, with a GTX 1060 or Radeon RX 570 being a palatable baseline on the graphics card front. For the recommended GPU, you only need a GTX 1660 Super or RX 590, and this is for a game which promises to sport a load of graphical bells and whistles (plus, of course, DLSS should definitely help Nvidia RTX owners get the most out of the game’s frame rates).

The recommendation for 16GB of system RAM has become pretty standard these days, though it’s possible to run Guardians of the Galaxy with 8GB, which is the minimum requirement.

In our early time spent with Guardians of the Galaxy, we came away not quite convinced, but those first impressions could change when it comes to release (and it’s not like we didn’t enjoy the game, either; it’s more a case of having reservations around the combat).

Via PC Gamer

Go to Source

Microsoft issues fix for annoying Windows 10 Remote Desktop auth issue

Microsoft has resolved an issue in Windows 10 that would cause authentication failures when connecting to devices in an untrusted domain via Remote Desktop with smart card authentication.

The company explained that the issue only surfaced after installing the cumulative updates released as part of September’s Patch Tuesday.

“After installing KB5005611 or later updates, when connecting to devices in an untrusted domain using Remote Desktop, connections might fail to authenticate when using smart card authentication,” explains Microsoft.


According to Microsoft, the issue pops up on several Windows 10 versions, including Windows 10 21H1, Windows 10 20H2, and Windows 10 2004, as well as on various Windows Server releases such as Windows Server 2022, Windows Server 20H2, and Windows Server 2004.

Patched via rollback

Microsoft confirms that it has rolled out a fix to address the issue via the Known Issue Rollback (KIR) feature.

KIR is a Windows 10 feature that enables Microsoft to revert buggy fixes delivered through Windows Update if they cause regressions and break functionality. According to BleepingComputer, Microsoft has been using KIR to revert fixes that introduce unexpected bugs since late 2019.

Furthermore, KIR fixes don’t roll back security fixes, and although they’re distributed via the Windows Update mechanism, they aren’t really updates in the truest sense of the word. KIRs are instead propagated as Windows Registry entries that simply disable the regression-causing changes made during a previous update.
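As a loose illustration of what that mechanism looks like, the Python sketch below reads an override value from the registry using the standard winreg module. The key path and value name are hypothetical placeholders for illustration only; they are not the actual entries Microsoft ships for this rollback.

# Hypothetical example: inspect a KIR-style registry override on Windows.
# The key path and value name are illustrative placeholders, NOT the real
# entries Microsoft distributes for this particular rollback.
import winreg

HYPOTHETICAL_KEY = r"SOFTWARE\Policies\ExampleVendor\KnownIssueRollback"
HYPOTHETICAL_VALUE = "DisableRegressionChange"

def read_kir_override():
    """Return the override value if the (hypothetical) entry exists, else None."""
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, HYPOTHETICAL_KEY) as key:
            value, _value_type = winreg.QueryValueEx(key, HYPOTHETICAL_VALUE)
            return value
    except FileNotFoundError:
        return None  # no rollback entry present on this machine

if __name__ == "__main__":
    override = read_kir_override()
    print("No KIR-style override found." if override is None
          else f"Override present, value = {override}")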

While Microsoft has stated that the fix for the Remote Desktop authentication issue will propagate automatically to consumer devices and non-managed business devices, admins of enterprise-managed devices can resolve the issue by installing and configuring the two Group Policies Microsoft has released.

Via BleepingComputer

Go to Source

AMD, Intel, look away – here’s a new 128-core rival with more transistors than the Apple M1 Max

Chinese bare metal and dedicated server purveyor Alibaba Cloud has unveiled a new custom-built processor called the Yitian 710, a cloud-first server CPU built on TSMC’s 5nm manufacturing process (the same as Apple’s new M1 Max and M1 Pro) with a staggering 60 billion transistors.

In comparison, Apple’s M1 Max has 57 billion transistors, while the AWS Graviton2 and AMD EPYC Rome server processors have around 30 billion and 40 billion transistors respectively.

Based on Arm’s v9 architecture, the Yitian 710 packs a staggering 128 cores that can reach speeds of up to 3.2GHz, and it supports up to eight DDR5 memory channels and 96 PCIe 5.0 lanes. We also know that it is a multi-processor platform.

The Yitian 710 achieved a score of 440 in SPECint2017, which Alibaba Cloud claims surpasses that of the current state-of-the-art Arm server processor (probably referring to the Graviton2) by 20% in performance and 50% in energy efficiency. 

We don’t know whether these are base or peak numbers, but at first glance the figures seem competitive with what rivals have to offer. The processor will be rolled out in a new range of home-grown servers called Panjiu, all of which were announced at Alibaba’s annual Apsara Conference.
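For a rough sense of scale, here’s the back-of-the-envelope arithmetic, assuming (and it is only an assumption, since the announcement doesn’t spell it out) that “surpasses by 20%” means the Yitian 710’s score is 1.2x that of the rival Arm chip:

# Back-of-the-envelope arithmetic on the claimed 20% SPECint2017 lead,
# assuming "surpasses by 20%" means the Yitian 710 scores 1.2x the rival chip.
yitian_score = 440
claimed_lead = 0.20

implied_rival_score = yitian_score / (1 + claimed_lead)
print(f"Implied rival SPECint2017 score: {implied_rival_score:.0f}")  # roughly 367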

During the same keynote, Alibaba Cloud also announced that it will be devoting more resources to building custom-built processors based on the open-source RISC-V architecture in what looks like an attempt to move away from proprietary compute architectures. 

The etching is on the wall

AMD and Intel will take note that yet another vendor has committed to building its own server processors. 

After Apple, Sberbank, Microsoft, Huawei, Amazon and Google, Alibaba has joined the fray, and it likely won’t be the last.

Nvidia will also – to a lesser extent – be concerned by the other announcement, the fact that Alibaba Cloud will be investing heavily in RISC-V, which is increasingly being viewed as a compelling alternative to the closed ecosystem that Arm has come to represent.

Go to Source

Here’s why PC folks will never give a damn about Apple’s M1 Max


Go to Source

Intelligent industry: good for the plan(e)t

There is a growing trend across the manufacturing industry toward harnessing the benefits of AI and robotics to unlock efficiencies, reduce energy consumption and improve speed to market. This is driven in part by the need for businesses to be more efficient with the (limited) resources and budgets they have available, but also by the fact that customers now demand both shorter lead times and greater transparency in how their products are sourced and manufactured.

About the author

Mike Dwyer, Director of Intelligent Industry at Capgemini Invent.

Faced with ever-shifting demands and unpredictable events (such as a global pandemic) impacting supply and production, manufacturing businesses are increasingly turning to advanced AI and robotics technologies to help them adapt to new ways of doing things.

You would be mistaken, however, in thinking that the use of robots in manufacturing stops at small silver machines sorting on the factory floor. In fact, the role of RPA (robotic process automation) extends much more broadly across the manufacturing process and has become integral to building greater resilience and sustainability across the whole ecosystem.

360° product visibility

At a time when the need to be more environmentally friendly is becoming a fundamental focus for businesses, smart technology can help streamline manufacturing from start to finish to lower overall emissions. But to do this, businesses must have a 360° view of their product journeys in their entirety – something that can only be achieved by capturing, processing and analyzing data effectively with the aid of advanced technologies. 

When harnessed quickly and effectively, this data goes a long way toward ensuring minimal usage of raw materials, helping ramp production up and down as needed to reduce things like hot water usage and CO2 emissions, and optimizing design processes to shorten production times.

But given that the efficient use of material is inherent to the manufacturing process, what more should manufacturers be doing with their sustainability efforts to ensure they are considering the issue beyond just analyzing and acting on data insights? As more and more pressure builds around sustainability and the trend towards greater automation takes hold, how else can manufacturers employ intelligent industry to balance better business operations with protecting our planet?

Material basics and beyond

Manufacturers need to go through a fundamental shift to become leaner in their resource consumption, and a good place to start is using less paper. In much the same way the financial services industry shifted to paperless mortgages, paperless manufacturing should soon be a given. Moreover, the packaging and transportation of these products and goods must also be considered to ensure sustainability is ingrained into the supply chain in its entirety. Through AI, greener alternative solutions can be identified, enabling manufacturers to make data-driven decisions which are not only innovative, but sustainable.

Another resource area likely to come under growing scrutiny from customers is the management and sourcing of raw materials, particularly rare earth materials, with the expectation that businesses should be able to track and coordinate the journey of such materials from origin to plant, as well as ensure they are ethically sourced and transported.

Now is also the time for manufacturers to innovate beyond the production line, looking at how technology can help reduce our reliance on raw materials in the first place by coming up with environmentally sensible alternatives. This could be anything from deepening their commitment to R&D to relying more on prismatic AI designs, which use less raw material to begin with.

Scaling for sustainability

The events of the past 18+ months ruptured supply chains across the globe, spurring a major rethink around resilience. For some sectors, such as life sciences, ramping up facilities to produce unprecedented amounts of vaccines would not have been possible without the support of advanced AI technologies.

But now that demand has subsided, the same companies find themselves faced with the challenge of how to scale back without losing the agility to respond to future influxes in demand, for example if booster shots become widespread. And all this must be done with the least possible impact on the environment.

To scale up sustainably, manufacturers need to be careful that protecting one aspect of the supply chain does not come at the expense of another. To do this, they will need to look a layer deeper at what the data is telling them about their overall sustainability footprint. It’s no use building a production factory nearer your end customers, for example, if you are forced to transport your raw materials from afar; what matters is how you plan to lower emissions at each and every step of the journey.

Full spectrum sustainability

Moving forward, AI and robotics will be central to helping manufacturing companies match the trends driving customer demand directly to the production line. But this will mean interrogating every aspect of a product’s life cycle in search of better, cleaner ways of doing things, and then communicating those improvements to end audiences to show they are doing all they possibly can to play their part in protecting the environment.

Go to Source

How to get Office 365 on the cheap


Go to Source

Apple’s M1 Max early Geekbench benchmark result is… alright

Benchmarks for Apple’s latest and greatest silicon SoC (system-on-a-chip), the M1 Max, have already appeared online, just a few hours after it was unveiled at the Apple Unleashed October 2021 event. The more powerful 10-core M1 Max with 32GB of memory was allegedly the variant used, though as with all leaked benchmark results, don’t take anything as gospel just yet.

With that disclaimer out of the way, the M1 Max achieved a single-core score of 1,749 and a multi-core score of 11,542, if these results are genuine.

The previous M1-powered 13-inch MacBook Pro achieved 1,732 and 7,590 respectively on the same test, so while there’s a decent improvement in multi-core performance, these scores feel a tad lackluster when you look at the performance boost Apple was showing off during its presentation.
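To put numbers on that feeling, here’s a quick percentage comparison of the leaked scores against the M1 figures quoted above (a simple calculation, assuming both sets of results are comparable runs of the same Geekbench test):

# Percentage uplift of the leaked M1 Max scores over the original M1-powered
# 13-inch MacBook Pro figures quoted above.
m1 = {"single": 1732, "multi": 7590}
m1_max = {"single": 1749, "multi": 11542}

for test in ("single", "multi"):
    gain = (m1_max[test] / m1[test] - 1) * 100
    print(f"{test}-core uplift: {gain:.1f}%")

# Prints roughly: single-core uplift: 1.0%, multi-core uplift: 52.1%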

A benchmark for the Apple M1 Max SoC on Geekbench

(Image credit: Geekbench)

Our socks have not been knocked off

As pointed out by Tom’s Hardware, the M1 Max has twice the performance cores of the original M1 SoC, which makes these benchmark results a tad suspicious.

By another comparison on Geekbench, a Dell XPS 17 running an Intel Core i9-11980HK achieved a single-core score of 1,658 and a multi-core score of 10,059, a little under what’s supposedly being achieved by Apple’s latest flagship chip.

Let’s not get things twisted: it isn’t that the scores are unimpressive, as they sit very comfortably in the top tier of portable workstation benchmarks. But the margins are pretty slim, and the price of the new MacBook Pro 14-inch and MacBook Pro 16-inch can quickly feel ridiculous, with the most affordable M1 Max (32-core) 14-inch MacBook starting at $3,099 / £2,999 / AU$4,649.

Thankfully, there are other things that could also be impacting the performance numbers, such as the benchmarks being run on a pre-release version of macOS Monterey.

The listing also reported that the CPU is running a base clock of 24MHz, but Geekbench’s John Poole has since told MacRumors that this is likely down to Geekbench itself not correctly identifying the clock speed of the new M1 Max, rather than an issue with the processor.

Regardless of our feelings about the performance jump, Apple has proven itself a formidable rival, despite Intel and AMD having decades of development experience.


Opinion: Mac is back, baby

The MacBook Pro 14-inch (2021) featuring a model wearing vivid, colorful clothing

(Image credit: Apple)

Ugly camera notch aside, both of the latest MacBook Pro devices equipped with the M1 Max feel like the first Apple laptops in some time to really appeal to their intended market. These scores, while not mind-blowing when stacked against the rest of the mobile workstation market, show that you can get near-desktop Mac performance from a portable Mac laptop, with the M1 Max outperforming every current Mac other than the Mac Pro and iMac Pro models equipped with Intel’s high-end 16 to 24-core Xeon chips.

Additional variety in the CPU/GPU market is always welcome too, given the near-monopoly that previously existed. Macs have long been favored by those working in creative jobs such as video and audio editing, but the previous MacBook models missed a few beats by removing ports and including that divisive Touch Bar.

With this fresh look and powerful SoC, the MacBook Pro no longer feels like an expensive folly for some buyers. Even if these benchmarks hold up when re-tested on a full macOS Monterey build, the M1 Max is plenty powerful enough to run the demanding applications Apple promised it could during its fall event. Now we just need to see if those promised performance boosts are achievable.

Go to Source