Can your PC or Mac run on-device AI? This handy new Opera tool lets you find out


Opera wants to make it easy for everyday users to find out whether their PC or Mac can run AI locally, and to that end, has incorporated a tool into its browser.

When we talk about running AI locally, we mean on the device itself, using your system and its resources for the entire AI workload being done – in contrast to having your PC tap the cloud to get the computing power to achieve the task at hand.

Running AI locally can be a demanding affair – particularly if you don’t have a modern CPU with a built-in NPU to accelerate AI workloads happening on your device – and so it’s pretty handy to have a benchmarking tool that tells you how capable your hardware is in terms of completing these on-device AI tasks effectively.

There is a catch though, namely that the ‘Is your computer AI ready?’ test is only available in the developer version of the Opera browser right now. So, if you want to give it a spin, you’ll need to download that developer (test) version of the browser.

Once that’s done, you can get Opera to download an LLM (large language model) with which to run tests, and it checks the performance of your PC in various ways (tokens per second, first token latency, model load time and more).
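If you’re curious what metrics like these actually capture, here’s a minimal sketch of how such a benchmark can be measured in principle. This is not Opera’s implementation – the `fake_generate` function below is a hypothetical stand-in for a local LLM that streams tokens – but the timing logic shows what ‘first token latency’ and ‘tokens per second’ mean:

```python
import time

def fake_generate(prompt):
    """Hypothetical stand-in for a local LLM: streams tokens with a small
    artificial delay to simulate per-token compute time."""
    for token in ["Local", " AI", " is", " running", "."]:
        time.sleep(0.01)  # simulated generation cost per token
        yield token

def benchmark(generate, prompt):
    """Time a streaming generator to get first-token latency and throughput."""
    start = time.perf_counter()
    first_token_latency = None
    count = 0
    for _ in generate(prompt):
        now = time.perf_counter()
        if first_token_latency is None:
            # Time from sending the prompt to receiving the first token.
            first_token_latency = now - start
        count += 1
    total = time.perf_counter() - start
    return {
        "first_token_latency_s": first_token_latency,
        "tokens_per_second": count / total,
    }

stats = benchmark(fake_generate, "Hello")
print(stats)
```

Model load time would be measured the same way: a timer around the call that loads the model’s weights into memory, which is often the slowest single step on modest hardware.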

If all that sounds like gobbledegook, it doesn’t really matter, as after running all these tests – which might take anything from just a few minutes to more like 20 – the tool will deliver a simple and clear assessment of whether your machine is ready for AI or not.

There’s an added nuance, mind: if you get the ‘ready for AI’ result then local performance is good, and ‘not AI ready’ is self-explanatory – you can forget running local AI tasks – but there’s a middle result of ‘AI functional.’ This means your device is capable of running AI tasks locally, but it might be rather slow, depending on what you’re doing.

(Image credit: Opera)

There’s more depth to these results for experts to explore if they wish, but it’s great to get an at-a-glance estimation of your PC’s on-device AI chops. It’s also possible to download different (increasingly large) AI models to test with, with heftier versions catering for cutting-edge PCs with the latest hardware and NPUs.

Analysis: Why local AI processing is important

It’s great to have an easily accessible test that anyone can use to get a good idea of their PC’s processing chops for local AI work. Doing AI tasks locally, kept within the confines of the device, is obviously important for privacy – as you’re not sending any data off your machine into the cloud.

Furthermore, some AI features will use local processing partly, or indeed exclusively, and we’ve already seen the latter: Windows 11’s new cornerstone AI functionality for Copilot+ PCs, Recall, is a case in point, as it works totally on-device for security and privacy reasons. (That said, Recall has been causing a storm of controversy since Microsoft announced it, but that’s another story.)

So, being able to easily gauge your PC’s AI grunt is a useful capability to have, though right now, downloading the Opera developer version is probably not a hassle you’ll want to go through. Still, we’d guess the feature will be inbound for the full version of Opera soon enough, so you likely won’t have to wait long for it to arrive.

Opera is certainly getting serious about climbing the rankings of the best web browsers by leveraging AI, with one of the latest moves being drafting in Google Gemini to help supercharge its Aria AI assistant.

YOU MIGHT ALSO LIKE…

Opera One is a new AI-powered browser that aims to beat Chrome and Edge

Opera can now block trackers with one click, and is keeping its free VPN service

Opera’s new AI bot might tempt users away from Google Chrome
