
You’ll soon be able to control your Chromebook with just your face

Project Gameface being used to write an email.
Google

Chromebooks have AI too. Google's latest AI announcements landed between Microsoft's big Copilot+ reveal and Apple's forthcoming AI news. In addition to outlining a few new AI features now available on Google's Chromebook Plus line of laptops, Google previewed a fascinating upcoming feature that will let you control your entire Chromebook with just your face.

Using computer vision and your Chromebook's built-in webcam, you'll soon be able to control the device by talking to it, moving your face, and making hand gestures. Google calls it Project Gameface, and it's being built right into ChromeOS. The feature was originally announced via a blog post on May 10 as a "hands-free, AI-powered gaming mouse," but it's now being expanded and is officially coming to Chromebooks.

In settings, you can map specific head movements and facial gestures captured by the camera to actions like left-clicking, resetting the cursor to center, scrolling, activating the keyboard, and more. The facial gestures you can use include opening your mouth, smiling, looking in different directions, and raising your eyebrows. You can even customize the "gesture size" to accommodate the specific needs of different users.
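Conceptually, this kind of mapping boils down to binding gesture confidence scores (like the blendshape outputs a face-landmark model produces) to actions, with the "gesture size" acting as a trigger threshold. Here's a minimal illustrative sketch in Python; the names (`GESTURE_BINDINGS`, `triggered_actions`, `gesture_size`) are hypothetical and not Google's actual API:

```python
# Illustrative sketch of a gesture-to-action mapping with a "gesture size"
# threshold, loosely modeled on the Project Gameface settings described above.
# All names here are made up for illustration; this is not the real ChromeOS API.

GESTURE_BINDINGS = {
    "mouth_open": "left_click",
    "smile": "reset_cursor",
    "eyebrows_up": "toggle_keyboard",
}

def triggered_actions(scores, bindings=GESTURE_BINDINGS, gesture_size=0.5):
    """Return the actions whose gesture score clears the threshold.

    scores: dict of gesture name -> confidence in [0, 1], e.g. from a
    face-landmark model. A larger gesture_size means the user must make
    a more pronounced expression before an action fires.
    """
    return [action for gesture, action in bindings.items()
            if scores.get(gesture, 0.0) >= gesture_size]

# A wide-open mouth fires its action; a faint smile does not.
print(triggered_actions({"mouth_open": 0.9, "smile": 0.3}))  # -> ['left_click']
```

Raising `gesture_size` makes every binding harder to trigger, which is why exposing it as a user setting helps people whose range of facial motion differs.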

A screenshot of the settings open for Project Gameface.
Google
Google is also adding hand gestures into the mix, so you'll be able to accomplish many more tasks with Project Gameface. According to the update, you'll be able to send emails, use apps, and browse the web without touching your keyboard or screen. To do all that, we have to assume Google has added even more ways of controlling the device. It also doesn't require downloading any software and should, in theory, work across apps, services, and websites.

Google admits that it’s still “early in this project,” so we don’t yet know when this feature will roll out.

Interestingly, Apple recently announced eye tracking for the iPad as a similar way to improve accessibility and allow for a more hands-free way of interacting with devices.

Beyond updates to Project Gameface, Google announced it's also working on a few other upcoming AI features. With Gemini right on the device, you'll soon get live translation and transcription of videos and video calls. Gemini will also offer something called "Help me read and understand," a way to get summaries of or ask questions about particular articles or pages. Lastly, when you log into your Chromebook, you'll be prompted to pick up where you left off, with all your open apps and websites grouped together just as you left them, in a single click.

These new features would, in theory, need to be run on a device’s neural processing unit (NPU), which very few new Chromebooks actually have.

Luke Larsen
Luke Larsen is the Senior Editor of Computing, managing all content covering laptops, monitors, PC hardware, Macs, and more.